Press "Enter" to skip to content

Category: Cloud

Choosing a SKU for Azure Data Explorer

Brian Bønk makes a choice:

When creating the clusters from the Azure portal, you are presented with 3 options when choosing the compute specification.

The compute specification is the method of setting up the clusters for the specific workload you are planning to put on the Kusto cluster.

The portal gives you these three options:

Read on for the options, as well as some recommendations on when you might choose each.


Restoring an Azure SQL Database

Andrea Allred recovers from a mistake:

Recently, the wrong table got dropped and we needed to bring it back. I had never done a restore in an Azure Managed Database before so I learned something really fast.

Click through for the process. And yeah, it is quite easy, though I’ve noticed that restore times are a bit slower than if you were using local hardware on-premises.

One quirk of database restores in Azure SQL DB: you can’t restore over an existing database, which is exactly what a client wanted me to do last week. What you can do, however, is restore the database under a new name, so we might have messedupdb and then messedupdb_restore.

In this case, messedupdb had no changes since “the incident,” so we were able to rename messedupdb to messedupdb_dropme and then rename messedupdb_restore to messedupdb. Azure SQL DB happily rolls on with this, and after ensuring that the database was in prime condition, we could drop the old version. It’s a little more complex than simply restoring over the existing database, but all the relevant metadata Azure SQL DB needs stayed in sync along the way, so the process was smooth.


.NET Framework Versions and ADO Pipeline Builds

Olivier Van Steenlandt runs into a versioning issue:

The error message I received during the build process in my Azure DevOps YAML Pipeline was:

##[error]C:\Program Files\Microsoft Visual Studio\2022\Enterprise\MSBuild\Current\Bin\Microsoft.Common.CurrentVersion.targets(1229,5): Error MSB3644: The reference assemblies for .NETFramework,Version=v4.5 were not found. To resolve this, install the Developer Pack (SDK/Targeting Pack) for this framework version or retarget your application. You can download .NET Framework Developer Packs at https://aka.ms/msbuild/developerpacks

I wasn’t sure how to solve this issue, and when I was using my on-premise Agent Pool, the Database Project was able to build successfully.

Click through for the solution.


Managed Identities and Invoking REST Endpoints from Azure SQL DB

Imke Feldmann executes a Power BI REST endpoint call from Azure SQL Database:

For Azure SQL Databases there is a very cool new preview feature: “sp_invoke_external_rest_endpoint”. This function allows you to call certain Microsoft API endpoints directly from within your Azure database and, for example, write that data back into a table.

With that, you can for example create a stored procedure that can be triggered from Power Automate. This is ideal for larger datasets that would require long and slow “apply-to-each” rounds or cumbersome bulk-upload-workarounds.

I was struggling with the authentication when using a system assigned managed identity (“service principal”). Thanks to Davide Mauri for telling me how to fill in the parameters for the DATABASE SCOPED CREDENTIALS to make this work for Power BI:

Click through to see that answer, as well as a demonstration of the entire process.


Landing Zone Layouts for Modern Data Warehouses

Paul Hernandez builds out a landing zone for a warehouse:

In this article I want to discuss some different layout options for a landing zone in a modern cloud data warehouse architecture. By landing zone, I mean a storage account where raw data lands directly from its source system (not to be confused with a landing zone to move a system or application into the cloud).

One of the things I appreciate a lot about this post is that it covers the history, showing us how we got to where we are. Paul’s well-versed in each step along the way and lays things out clearly.


Model Deployment using Azure Functions

Alexander Billington needs to get that new model out:

Deploying machine learning (ML) models into production can be challenging, as it requires careful consideration of various factors such as scalability, reliability, and maintainability. While developing an ML model is an exciting process, deploying it into production can be a daunting task. The challenges faced in productionising data science projects can range from infrastructure to version control, model monitoring to integration with other systems. This blog will take a look at how Azure Functions can simplify the deployment process, getting models into production quickly and robustly to maximise their value.

I like this approach and find it interesting, as most of the time, the MLOps model Microsoft recommends has you scheduling Azure DevOps pipelines / GitHub Actions periodically or when new training data hits a specific folder. If you have some non-standard trigger for an action, this is a good way to get you going.
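To make the idea concrete, here is a minimal sketch, not the code from the post, of an HTTP-triggered Azure Function in the Python v2 programming model that loads a serialized scikit-learn model and scores incoming rows. The model file name, route, and input shape are assumptions for illustration.

```python
# Minimal sketch (not the code from the post). Assumptions: a scikit-learn model
# serialized with joblib as model.pkl alongside the function code, and JSON input
# of the form {"rows": [[...], [...]]}.
import json

import azure.functions as func
import joblib

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

# Load the model once per worker at cold start rather than on every invocation.
model = joblib.load("model.pkl")

@app.route(route="score", methods=["POST"])
def score(req: func.HttpRequest) -> func.HttpResponse:
    payload = req.get_json()
    predictions = model.predict(payload["rows"]).tolist()
    return func.HttpResponse(
        json.dumps({"predictions": predictions}),
        mimetype="application/json",
    )
```

Whatever system produces new data or finishes retraining can then POST rows to /api/score and get predictions back, without any scheduled pipeline in the middle.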


Refreshing a Power BI Dataset via HTTPS URL

Gilbert Quevauvilliers presses the big red button:

I have found that sometimes there are other systems that are loading data, and once they are complete they then want to refresh the Power BI Dataset.

Another way to do this is to use Power Automate, in which a system or user can call an HTTPS URL that, once invoked, refreshes the Power BI dataset.

I explain how to do this in the steps below.

Click through to see how to set up that job.
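As a rough sketch of the calling side: once the flow exists, an upstream load process just POSTs to the HTTPS URL that Power Automate generates. The URL below is a placeholder, and whether the flow expects a JSON body depends on how the trigger is configured.

```python
# Hypothetical caller: after an upstream load finishes, hit the HTTPS URL exposed by
# the Power Automate flow's "When an HTTP request is received" trigger.
# The URL is a placeholder (copied from the flow after saving it).
import requests

FLOW_URL = "https://prod-00.westeurope.logic.azure.com/workflows/<flow-id>/triggers/manual/paths/invoke?<signature>"

def trigger_dataset_refresh() -> None:
    response = requests.post(FLOW_URL, json={"requestedBy": "nightly-load"}, timeout=30)
    response.raise_for_status()  # anything other than 2xx means the flow did not accept the call

if __name__ == "__main__":
    trigger_dataset_refresh()
```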


Data Validation with Great Expectations and Azure Functions

Eduard van Valkenburg does a bit of data validation:

Great Expectations is a popular Python-based OSS tool to validate data that comes into your data estate. And for me, validating incoming data is best done file by file, as the files arrive! On Azure there is no better platform for that than Azure Functions. I love the serverless nature of Functions, and with the triggers available for arriving blobs, as well as HTTP, Event Grid events, queues, and others, there are some great patterns that allow you to build event-driven data architectures. We also now have the Python v2 framework for Azure Functions available, which makes the developer experience even better. So, let’s go through how to get it running.

This looks really interesting, and tying it into Azure Functions is a good idea, assuming that the checks don’t run for too long.
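As a hedged illustration of the pattern, and not Eduard's code, here is what a blob-triggered function in the Python v2 programming model might look like, validating each arriving CSV with a couple of expectations. It assumes the classic great_expectations from_pandas interface (newer releases use the Fluent API), and the container path, connection setting, and column names are made up.

```python
# Rough sketch (not the code from the post): validate each CSV as it lands in blob
# storage. Assumes the classic great_expectations "from_pandas" interface; newer
# releases expose a different (Fluent) API. Container, connection setting, and
# column names are illustrative only.
import io
import logging

import azure.functions as func
import great_expectations as ge
import pandas as pd

app = func.FunctionApp()

@app.blob_trigger(arg_name="blob", path="landing/{name}", connection="AzureWebJobsStorage")
def validate_blob(blob: func.InputStream) -> None:
    df = pd.read_csv(io.BytesIO(blob.read()))
    dataset = ge.from_pandas(df)

    # A couple of illustrative expectations; a real suite would be richer.
    dataset.expect_column_values_to_not_be_null("customer_id")
    dataset.expect_column_values_to_be_between("amount", min_value=0)

    result = dataset.validate()
    if not result.success:
        logging.error("Validation failed for %s: %s", blob.name, result)
```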


Contrasting Azure IoT Hub and Event Hub

Brian Bønk lays out a quick comparison:

When working with Azure Data Explorer and loading data to the storage engine, you might have some streaming devices or services whose data should land in the engine.

Azure provides two out-of-the-box services:

  1. Azure IoT Hub
  2. Azure Event Hub

At first glance it seems like the two services are doing the exact same thing – sending events through to other services in Azure. But there are some differences.

Read on to see what these differences are.
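One of those differences shows up immediately in the client model: an Event Hub accepts events from any producer holding a hub-level connection string (or other namespace credential), while IoT Hub devices are registered individually and authenticate with their own identity. Here is a rough contrast of the two send paths using the azure-eventhub and azure-iot-device SDKs; the connection strings, hub name, and payload are placeholders.

```python
# Illustrative sketch of the two send paths; connection strings and payload are placeholders.
# Event Hub: any producer holding a hub-level connection string can send events.
from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    "<event-hub-namespace-connection-string>", eventhub_name="telemetry"
)
batch = producer.create_batch()
batch.add(EventData('{"sensor": "temp-01", "value": 21.4}'))
producer.send_batch(batch)
producer.close()

# IoT Hub: each device authenticates with its own identity (a per-device connection
# string here; certificates are also an option) and sends device-to-cloud messages.
from azure.iot.device import IoTHubDeviceClient, Message

device = IoTHubDeviceClient.create_from_connection_string("<per-device-connection-string>")
device.connect()
device.send_message(Message('{"sensor": "temp-01", "value": 21.4}'))
device.disconnect()
```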


Best Practices Assessment for Azure Arc-Enabled SQL Server Instances

Ganapathi Varma Chekuri takes us through an assessment:

Best practices assessment provides a mechanism to evaluate the configuration of your SQL Server. Once the best practices assessment feature is enabled, your SQL Server instance and databases are scanned to provide recommendations for things like SQL Server and database configurations, index management, deprecated features, enabled or missing trace flags, statistics, etc. Assessment run time depends on your environment (number of databases, objects, and so on), with a duration from a few minutes, up to an hour.

If you’re familiar with the assessment on Azure VMs, this is quite similar, though it extends to on-premises machines or VMs running in other cloud providers. This does require installing the agent and paying for an Arc-Enabled SQL Server instance, so it’s not free.
