Press "Enter" to skip to content

Category: Cloud

Automatically Refreshing a Power BI Semantic Model after Dataflow Loads

Reza Rad refreshes a model:

Although this seems to be a simple thing to do, it is not a function that you can turn on or off. If you have a Dataflow that does the ETL and transforms and prepares the data, then to get the most up-to-date data into the report, you will need to refresh the Power BI semantic model afterward; only upon successful refresh of both the dataflow and the semantic model will you have up-to-date data in the report. Fortunately, in Fabric, this is a straightforward setup. In this article and video, I’ll explain how this is possible.

Click through for the video and the blog post. Granted, this feature is in preview, but using it is pretty straightforward.
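The article covers the built-in (preview) setting in Fabric. If you need to orchestrate this yourself outside that feature, one common workaround is to trigger the semantic model refresh through the Power BI REST API once the dataflow finishes. Here is a minimal sketch in Python, assuming you already have an Entra ID access token with dataset permissions and know the workspace and semantic model IDs (all placeholder values are illustrative):

```python
import requests

# Assumptions: an access token with Dataset.ReadWrite.All scope, plus the
# workspace (group) and semantic model (dataset) IDs. Placeholders only.
token = "<access-token>"
group_id = "<workspace-id>"
dataset_id = "<semantic-model-id>"

headers = {"Authorization": f"Bearer {token}"}
base = f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"

# Trigger a refresh of the semantic model, e.g. after the dataflow
# refresh has completed successfully.
resp = requests.post(f"{base}/datasets/{dataset_id}/refreshes",
                     headers=headers,
                     json={"notifyOption": "MailOnFailure"})
resp.raise_for_status()  # 202 Accepted means the refresh was queued

# Check the status of the most recent refresh.
status = requests.get(f"{base}/datasets/{dataset_id}/refreshes?$top=1",
                      headers=headers)
print(status.json()["value"][0]["status"])
```

The POST only queues the refresh; polling the refreshes endpoint afterward tells you whether it actually succeeded.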


Thoughts on Cloud Monitoring Solutions

Mika Sutinen takes a look at built-in ways to monitor SQL Server databases in the three major cloud providers:

Monitoring SQL Server databases is one of the main responsibilities of DBAs, both in on-premises and on the public cloud platforms. Continuing from my previous post, where I reviewed the various options of SQL Server PaaS offerings from major hyperscalers, I’ll now focus on the monitoring solutions they provide.

While I always prefer commercial tools that come packed with features, it’s not the only way to go. Whether you’re using AWS, Azure, or GCP, each platform offers unique tools and features to help you keep an eye on your managed SQL Server database services.

Read on for information about what’s available in each.
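As a taste of what the built-in tooling looks like when you script against it rather than use a commercial product, here is a hedged sketch of pulling one Azure SQL Database metric with the azure-monitor-query package. The resource ID and metric choice are illustrative; Mika's post covers the AWS and GCP equivalents:

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient

# Assumption: an Azure SQL Database resource ID; substitute your own
# subscription, resource group, server, and database names.
resource_id = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Sql/servers/<server>/databases/<db>"
)

client = MetricsQueryClient(DefaultAzureCredential())

# Pull the last hour of CPU utilization for the database.
result = client.query_resource(
    resource_id,
    metric_names=["cpu_percent"],
    timespan=timedelta(hours=1),
)

for metric in result.metrics:
    for ts in metric.timeseries:
        for point in ts.data:
            print(point.timestamp, point.average)
```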


Three Incremental Load Patterns with Azure Data Factory

Temidayo Omoniyi likes a good pattern:

This article is divided into three major sections, each showing different abilities and use cases for performing incremental loads with Azure Data Factory. This process can also be done in an Azure Synapse pipeline or a Fabric pipeline.

The document contains the following:

Section 1: Copy Data Based on Last Modified Date or Latest File

Section 2: Incremental Copy Using Dataflow

Section 3: Incremental Copy Using Lookup and Stored Procedure Activities

Click through for each of these three patterns, with plenty of screenshots and step-by-step instructions.
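The third pattern is essentially a high-watermark design: look up the last successfully loaded value, copy only newer rows, then advance the watermark. As a rough illustration of the logic the Lookup, Copy, and Stored Procedure activities implement, here is a plain-Python sketch using pyodbc; the table, column, and watermark names are hypothetical:

```python
import pyodbc

# Hypothetical source and destination connections; fill in real
# connection strings for your environment.
src = pyodbc.connect("DRIVER={ODBC Driver 18 for SQL Server};SERVER=<source>;DATABASE=<db>;Encrypt=yes")
dst = pyodbc.connect("DRIVER={ODBC Driver 18 for SQL Server};SERVER=<dest>;DATABASE=<db>;Encrypt=yes")

# 1. Lookup: read the last watermark recorded for this table.
last = dst.execute(
    "SELECT WatermarkValue FROM dbo.Watermark WHERE TableName = 'Orders'"
).fetchval()

# 2. Copy: pull only rows modified since the watermark.
rows = src.execute(
    "SELECT OrderID, CustomerID, Amount, ModifiedDate "
    "FROM dbo.Orders WHERE ModifiedDate > ?", last
).fetchall()

cur = dst.cursor()
cur.fast_executemany = True
cur.executemany(
    "INSERT INTO dbo.Orders (OrderID, CustomerID, Amount, ModifiedDate) "
    "VALUES (?, ?, ?, ?)", [tuple(r) for r in rows]
)

# 3. Stored procedure step (shown here as a plain UPDATE): advance the watermark.
new_mark = max((r.ModifiedDate for r in rows), default=last)
cur.execute(
    "UPDATE dbo.Watermark SET WatermarkValue = ? WHERE TableName = 'Orders'",
    new_mark
)
dst.commit()
```

Committing the copy and the watermark update together on the destination keeps a failed load from silently skipping rows on the next run.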


Streaming Data to Azure Event Hub via Mockaroo and Kafka API

Jasleen Kaur Wahi generates some data:

In a recent project, I faced the need to generate randomized data for transmission to the Azure Event Hub. This hub is a key component of Microsoft Azure, used for real-time data ingestion and processing.

First, let’s take a look at how I created this random data. I wanted to come up with a way to make data that looks like what we see in the real world, but without using any real information from users. This made-up data was really important for a bunch of things, like checking if our software works well.

Read on to see how Mockaroo works and the end result. Creating tests for streaming services like Event Hubs is a challenge, so this is an interesting approach to the task.
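For context on the plumbing: Event Hubs exposes a Kafka-compatible endpoint, so a stock Kafka client can publish Mockaroo-generated rows. Here is a sketch using the confluent-kafka package; the Mockaroo schema name, event hub name, and connection string are all placeholders:

```python
import json

import requests
from confluent_kafka import Producer

# Assumptions: a Mockaroo API key and a saved schema named "users", and a
# Kafka-enabled Event Hubs namespace with an event hub named "telemetry".
rows = requests.get("https://api.mockaroo.com/api/generate.json", params={
    "key": "<mockaroo-api-key>",
    "schema": "users",
    "count": 100,
}).json()

# Event Hubs' Kafka endpoint uses SASL/PLAIN with the literal username
# "$ConnectionString" and the namespace connection string as the password.
producer = Producer({
    "bootstrap.servers": "<namespace>.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "$ConnectionString",
    "sasl.password": "<event-hubs-connection-string>",
})

for row in rows:
    # The event hub name doubles as the Kafka topic.
    producer.produce("telemetry", value=json.dumps(row).encode("utf-8"))

producer.flush()
```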


Point-in-Time Restoration for Azure SQL Managed Instances

Andy Brownsword points and clicks:

One of the benefits which comes with a Managed Instance is having backups taken care of for you. That also includes restores. Particularly useful is a one-click (ish) restore for a specific point in time.

Restoring a database is as easy as creating a new database, as it’s part of the same workflow.

Read on to see how it works, as well as one limitation around existing databases.
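If you would rather script the restore than click through the portal, the same operation is exposed in the azure-mgmt-sql Python SDK. A minimal sketch, assuming placeholder resource names throughout:

```python
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import ManagedDatabase

# Placeholders throughout: substitute your subscription, resource group,
# instance, and database names.
client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

source_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Sql/managedInstances/<mi-name>"
    "/databases/SalesDb"
)

# Point-in-time restore lands in a new database rather than overwriting
# an existing one, which is likely the limitation Andy calls out.
poller = client.managed_databases.begin_create_or_update(
    resource_group_name="<rg>",
    managed_instance_name="<mi-name>",
    database_name="SalesDb_Restored",
    parameters=ManagedDatabase(
        location="westeurope",
        create_mode="PointInTimeRestore",
        source_database_id=source_id,
        restore_point_in_time=datetime(2025, 1, 10, 14, 30, tzinfo=timezone.utc),
    ),
)
poller.result()  # blocks until the restore completes
```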


What’s Ahead for SQL Server in 2025

Bob Ward lays out the plan:

As we begin a new year in 2025, many of you are looking at new projects and new applications, trying to determine how to integrate AI into your business, modernizing your data estate, or considering an upgrade or a cloud migration. As you consider your options, let’s look at the 2025 state of the union for Microsoft’s new releases and capabilities across SQL Server, Azure SQL, SQL database in Fabric, Copilots, tools, and developer experiences.

There’s a lot on the list, so check it out.


Creating a Microsoft Fabric Warehouse with Service Principal

Gilbert Quevauvilliers sets up a new warehouse:

In this blog post I am going to show you how to create a Microsoft Fabric Warehouse, where the owner will be the Service Principal.

As mentioned in the blog post, here are some of the advantages of having the Service Principal as the Warehouse Owner.

  • Using a Service Principal to create the warehouse avoids the issue where the person who created the warehouse leaves the organization and problems arise when that user’s account is deleted from Entra ID.
  • You avoid the pain of having to log in with the user account just to keep its password updated.
  • The organization now owns the warehouse and not an individual user.

I will show you how I created a Warehouse with the owner being a Service Principal, using a Microsoft Fabric Notebook.

Click through for the notebook and additional commentary.
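The core of the approach is to authenticate as the service principal and make the create call with its token, so the item is owned by the app identity rather than a user. A minimal sketch with msal and the Fabric REST API's warehouse endpoint, assuming the app registration already has access to the workspace (all IDs are placeholders):

```python
import msal
import requests

# Assumptions: an Entra app registration (the service principal) that has
# been granted access to the target Fabric workspace.
app = msal.ConfidentialClientApplication(
    "<app-client-id>",
    client_credential="<app-secret>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)
token = app.acquire_token_for_client(
    scopes=["https://api.fabric.microsoft.com/.default"]
)["access_token"]

# Because the service principal's token creates the item, the service
# principal (not a user) becomes the warehouse owner.
resp = requests.post(
    "https://api.fabric.microsoft.com/v1/workspaces/<workspace-id>/warehouses",
    headers={"Authorization": f"Bearer {token}"},
    json={"displayName": "SalesWarehouse", "description": "Owned by the SP"},
)
resp.raise_for_status()
print(resp.status_code)  # 201, or 202 if creation runs as a long-running operation
```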


Data Masking in Azure Databricks

Rayis Imayev hides some information:

One way to protect sensitive information from end users in a database is through dynamic masking. In this process, the actual data is not altered; however, when the data is exposed or queried, the results are returned with modified values, or the actual values are replaced with special characters or notes indicating that the requested data is hidden for protection purposes.

In this blog, we will discuss a different approach to protecting data, where personally identifiable information (PII – a term you will frequently encounter when reading about data protection and data governance) is actually changed or updated in the database / persistent storage. This ensures that even if someone gains access to the data, nothing will be compromised. This is usually needed when refreshing a production database or dataset containing PII data elements into a lower environment. Your QA team will appreciate having a realistic data volume that resembles the production environment, but with masked data.

Rayis goes into depth on the process. I could also recommend checking out the article on row filters and column masks for more information.
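The static approach Rayis describes boils down to overwriting the PII columns in persistent storage rather than masking at query time. A minimal PySpark sketch, with a hypothetical table and columns:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, concat, lit, sha2

spark = SparkSession.builder.getOrCreate()

# Hypothetical table and columns; the point is to persist masked values,
# not just hide them at query time as dynamic masking does.
df = spark.table("prod_copy.customers")

masked = (
    df
    # A one-way hash keeps emails unique (useful as a join key) but unreadable.
    .withColumn("email", sha2(col("email"), 256))
    # Replace names outright with a synthetic value.
    .withColumn("full_name", concat(lit("Customer_"), col("customer_id").cast("string")))
    # Null out fields the lower environment doesn't need at all.
    .withColumn("ssn", lit(None).cast("string"))
)

# Overwrite the lower-environment table so the real PII never lands there.
masked.write.mode("overwrite").saveAsTable("qa.customers")
```

Hashing rather than nulling the email keeps join keys stable across tables, which matters when the masked copy feeds integration tests.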
