Category: Cloud

Running Dask on AKS

Tsuyoshi Matsuzaki sets up Dask as a distributed service:

In my last post, I showed you a tutorial for running Apache Spark on managed Kubernetes, Azure Kubernetes Service (AKS).
In this post, I'll show you a tutorial for running distributed Dask workloads on AKS.

By using Dask, you can run Scikit-Learn-compatible functions and jobs on data which cannot fit in memory, or run them in a distributed manner. For simplicity, I'll use the built-in Dask ML function (dask_ml.linear_model.LinearRegression) in this tutorial. (In the same manner, you can also run regular sklearn functions.)
Cloud-managed Kubernetes will help you speed up these large ML workloads.

Click through for the process. I've had some positive experiences with Dask as a distributed computing framework. It's definitely one of the better ones if you're big into Python.
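To give a sense of the kind of workload the post covers, here is a minimal sketch, assuming a Dask scheduler is already running (on AKS or anywhere else); the scheduler address is a placeholder and the dataset is synthetic, so this is an illustration rather than the post's actual code:

    # Minimal sketch: fit dask_ml.linear_model.LinearRegression against a
    # remote Dask scheduler. The scheduler address is a placeholder --
    # substitute the service endpoint your AKS deployment exposes.
    from dask.distributed import Client
    import dask.array as da
    from dask_ml.linear_model import LinearRegression

    client = Client("tcp://dask-scheduler.example.internal:8786")  # hypothetical

    # Chunked arrays let the dataset exceed any single node's memory.
    X = da.random.random((1_000_000, 20), chunks=(100_000, 20))
    true_coef = da.random.random(20)
    y = X.dot(true_coef) + da.random.normal(0, 0.1, size=1_000_000, chunks=100_000)

    model = LinearRegression()
    model.fit(X, y)    # training is distributed across the workers
    print(model.coef_)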

Comments closed

To Cloud or Not to Cloud

That is the question, according to Guy Glantser:

This is not a regular blog post. I was looking for an old blog post that I wrote several years ago, and while searching, I found an even older blog post that I wrote back in 2009. It had the same title that you see here – To Cloud or Not to Cloud?

In 2009 the cloud was already a thing, but those were the early days. Microsoft's cloud, Azure, wasn't even generally available until February 2010. The cloud has seen tremendous advancement over the years. It's interesting and also amusing to read what I wrote 12 years ago about the cloud. Some things are still true today, while others are completely irrelevant.

So here it is…

It’s good to reflect back on these thoughts to see how the industry has shifted. Issues which were show-stoppers may be completely eradicated by this point, while others remain trade-offs without an ideal answer.

Comments closed

From Azure Analysis Services to Power BI Premium Per User

Gilbert Quevauvilliers picks back up on a new series:

Welcome to the first post in my blog series evaluating the different aspects to consider when looking to migrate from Azure Analysis Services (AAS) to Power BI Premium Per User (PPU).

Apologies for this taking a few extra weeks to get started; life has been super busy, but as they say, "Better late than never".

In this post I am going to compare the query performance of an AAS cube against that of a PPU cube.

Click through to see how Power BI Premium Per User stacks up against Azure Analysis Services.

Comments closed

Securing Amazon Managed Streaming for Kafka

Stephane Maarek has some security advice for us:

AWS launched IAM Access Control for Amazon MSK, which is a security option offered at no additional cost that simplifies cluster authentication and Apache Kafka API authorization using AWS Identity and Access Management (IAM) roles or user policies to control access. This eliminates the need for administrators to run an unfamiliar system to control access to Apache Kafka on Amazon MSK, and learn intricate details and specific commands to manage Apache Kafka access control lists (ACLs).

This is a game-changer from a security perspective for AWS customers who use Apache Kafka: I recommend Amazon MSK customers use IAM Access Control unless they have a specific need for using mutual TLS or SASL/SCRAM authN/Z.

Read on to see how it works.
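As a rough illustration of what IAM Access Control looks like in practice, here is a hedged sketch of granting a producer role write access to a single topic via boto3; the ARNs, role name, and topic name are all hypothetical rather than drawn from the post:

    # Sketch: attach an inline IAM policy that lets a producer role connect
    # to an MSK cluster and write to one topic. All ARNs and names below
    # are placeholders.
    import json
    import boto3

    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["kafka-cluster:Connect"],
                "Resource": "arn:aws:kafka:us-east-1:111122223333:cluster/demo/*",
            },
            {
                "Effect": "Allow",
                "Action": ["kafka-cluster:WriteData", "kafka-cluster:DescribeTopic"],
                "Resource": "arn:aws:kafka:us-east-1:111122223333:topic/demo/*/orders",
            },
        ],
    }

    iam = boto3.client("iam")
    iam.put_role_policy(
        RoleName="msk-producer-role",      # hypothetical role
        PolicyName="msk-orders-producer",
        PolicyDocument=json.dumps(policy),
    )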

Comments closed

Using Logic Apps to Send Multiple Attachments

Rayis Imayev has a project:

In my real project, I need to build a Logic App to send email messages with a set of files attached from my Azure Storage Account. I was able to find similar examples from other Power Platform developers; however, they lacked a critical part that I needed: my set of files had to be dynamic. Whether there are 2 files or 102 files, the Logic App should be able to support this.

So, here, I would like to share my brief journey in creating such an Azure Logic App:

Read on to see how Rayis solved this.
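For a sense of the underlying idea, here is a small sketch of the dynamic-attachments concept using the Azure Storage SDK for Python; the connection string and container name are placeholders, and this is not Rayis's implementation, just the same pattern outside of Logic Apps:

    # Sketch: enumerate every blob in a container and build a dynamic
    # attachments array, whether it ends up holding 2 files or 102.
    import base64
    from azure.storage.blob import ContainerClient

    container = ContainerClient.from_connection_string(
        "<storage-account-connection-string>",   # placeholder
        container_name="outgoing-files",         # placeholder
    )

    attachments = []
    for blob in container.list_blobs():
        content = container.download_blob(blob.name).readall()
        attachments.append({
            "Name": blob.name,
            "ContentBytes": base64.b64encode(content).decode("ascii"),
        })

    # 'attachments' can now back an email action's attachment array.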

Comments closed

Cost Management Updates in Azure

Michael Flanakin gives us a few updates on Azure billing:

Understanding your cost patterns over time and investigating specific charges often requires drilling into and selecting specific dates. You’ve always been able to select from one day up to one year in cost analysis, but you’ve told us that selecting those dates isn’t as easy as it could be. As we started building out a new platform for analytics and insights, we took this feedback to heart and completely redesigned the date selection. What you see today is an early peek at that.

This month, you’ll find a new option to select a custom date range in the cost analysis preview. You can pick a single month, a range of months, or start and end dates for a range of days, making it easier than ever to fine-tune your reporting to the dates you need. 

The virtue and downfall of cloud systems like AWS and Azure is that they're very clear about how much things cost, but only if you know exactly which resources something uses. It's not as simple as "I want to use a database": there are all of those other charges around data egress, networking, log management, etc. which can add up. Many of those costs are (fortunately) negligible, but try walking through a pricing scenario for Azure Synapse Analytics sometime with someone new to the product and see at what point that person gives up trying to calculate the cost. My money says it's right around the time you get to the integration runtime costs.

Comments closed

Managing Azure DevOps via Azure Logic Apps

Stuart Ainsworth has a process:

A big part of my job these days is looking for opportunities to improve workflow. Automation of software is great, but identifying areas to speed up human processes can be incredibly beneficial to value delivery to customers. Here's a situation I recently figured out how to handle:

1. My SRE team uses a different Azure DevOps project than our development team. This protects the “separation of duties” concept that auditors love, while still letting us transfer items back and forth.
2. The two projects are in the same organization.
3. The two projects use different templates, with different required fields.
4. Our workflow process requires two phases of triage for bugs in the wild: a technical phase (provided by my team), and a business prioritization (provided by our Business Analyst).
5. Moving a card between projects is simple, but there were several manual changes that had to be made:
– Assigning to a Business Analyst (BA)
– Changing the status to Proposed from Active
– Changing the Iteration and Area
– Moving the card.

To automate this, I decided to use Azure Logic Apps.

Read on to see how Stuart did this.
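The manual changes in Stuart's list map to work item field updates, which a Logic App can make through the Azure DevOps REST API. Here is a hedged sketch of the equivalent call; the organization, work item ID, project names, paths, and assignee are all hypothetical, and this is not Stuart's actual implementation:

    # Sketch: apply the field changes the post lists (assignee, state,
    # iteration, area, project) in one JSON Patch request to the
    # Azure DevOps REST API.
    import requests

    ORG = "myorg"            # hypothetical organization
    WORK_ITEM_ID = 1234      # hypothetical work item
    url = (f"https://dev.azure.com/{ORG}/_apis/wit/workitems/"
           f"{WORK_ITEM_ID}?api-version=6.0")

    patch = [
        {"op": "add", "path": "/fields/System.TeamProject", "value": "DevProject"},
        {"op": "add", "path": "/fields/System.AreaPath", "value": "DevProject\\Triage"},
        {"op": "add", "path": "/fields/System.IterationPath", "value": "DevProject\\Sprint 42"},
        {"op": "add", "path": "/fields/System.AssignedTo", "value": "ba@example.com"},
        {"op": "add", "path": "/fields/System.State", "value": "Proposed"},
    ]

    resp = requests.patch(
        url,
        json=patch,
        headers={"Content-Type": "application/json-patch+json"},
        auth=("", "PERSONAL_ACCESS_TOKEN"),   # PAT as the basic-auth password
    )
    resp.raise_for_status()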

Comments closed

Storing dbatools as a Package in Azure DevOps

Kevin Chant has a process for us:

In this post I want to cover how you can store the dbatools PowerShell module as a package in Azure DevOps, using the Azure Artifacts service.

I want to share some knowledge about this because I did a demo of it at Malta Data Saturday. By the end of this post you will have a better understanding of Azure Artifacts and a workaround if you encounter a problem publishing a package.

Read on for the process.

Comments closed