Category: Microsoft Fabric

Setting a Default Destination for Fabric Dataflows Gen2

Jon Voge wants to spend less time copying and pasting:

Ever had a Dataflow Gen2 in which you needed to map the output of several queries to the same Warehouse or Lakehouse? Takes a while to set up, right?

If you wish to add a Default Destination to your Dataflow, all you need to do is create the Dataflow from inside your desired destination. This works for Warehouses, Lakehouses, and KQL Databases:

Click through for an example of how it works.

SQL Database in Microsoft Fabric

Nikola Ilic covers a new addition to the Microsoft Fabric family:

Now, let’s get back to the previous point: SQL database in Fabric is a SaaS Azure SQL DB…Generally speaking, in SaaS solutions, “everything just works” (or at least should work) – without (too much) intervention from your side.

In the context of the SQL database in Fabric, creating a database is probably the most straightforward process of database creation you will ever experience, as I’ll show you in the “HOW TO” section of this article. From that point, everything happens automatically: the database will be automatically configured and will automatically scale both in terms of compute resources and storage. In addition, database backups are performed automatically, indexing also happens the same way, as well as all patches and software/hardware fixes. You want more? No more complex firewall rules and permission settings – this time, everything is done via Fabric workspace roles and item permissions, while the well-known SQL native features allow for more granular control.

This is more of a head-scratcher for me than a brilliant solution. I get that there’s a challenge in figuring out what you want with Azure SQL Database: single database or elastic pool, serverless or provisioned, vCore or DTU-based pricing model, General Purpose or Hyperscale or Business Critical (for vCore), Basic or Standard or Premium (for DTU), one of about five separate hardware configurations, etc.

From the standpoint of “I just want a database, please,” Fabric SQL Database is a lot easier. The problem comes in when you hit the use cases that necessitated all of these options to begin with, at which point you’re back to the original creation screen and outside of Fabric once more.

Querying a Fabric KQL Database via REST API

Sandeep Pawar grabs some data:

I have previously explained how to query a KQL database in a notebook using the Kusto Spark connector, Kusto Python SDK, and KQLMagic. Now, let’s explore another method using the REST API. Although this is covered in the ADX documentation, it isn’t covered with an example in the Fabric documentation, so I wanted to write a quick blog to show how you can query a table from an Eventhouse using the REST API.

Click through to see how you can do it. Sandeep’s code is in Python, but because this is just hitting a REST API rather than using a library, you could also use a tool like Postman.
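
For a rough sense of the pattern, here is a minimal sketch (not Sandeep’s exact code) that sends a KQL query to an Eventhouse over plain HTTP with requests. The query URI, database name, KQL statement, and the token scope are all placeholders or assumptions you would swap for your own.

```python
# Minimal sketch: query an Eventhouse table via the Kusto REST API.
# The query URI, database, and KQL below are placeholders.
import requests
from azure.identity import DefaultAzureCredential  # any AAD token source works

query_uri = "https://<your-eventhouse>.kusto.fabric.microsoft.com"  # placeholder
database = "MyKqlDatabase"                                          # placeholder
kql = "MyTable | take 10"                                           # placeholder

# Assumption: the cluster's own URI is used as the token scope.
token = DefaultAzureCredential().get_token(f"{query_uri}/.default").token

response = requests.post(
    f"{query_uri}/v2/rest/query",
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    },
    json={"db": database, "csl": kql},
)
response.raise_for_status()

# The v2 endpoint returns a list of "frames"; the PrimaryResult frame holds the rows.
frames = response.json()
primary = next(f for f in frames if f.get("TableKind") == "PrimaryResult")
print(primary["Columns"])
print(primary["Rows"][:5])
```

Since this is only an HTTP POST with a bearer token and a small JSON body, the same request translates directly to Postman or curl.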

Tracking Column Sizes on DAX Queries

Chris Webb busts out the calculator:

I had meant to follow up my recent post on how to find the columns touched by a DAX query by writing one on how to use this technique to find the size of these columns in memory, so you can find the total size of the columns that need to be paged into memory when a DAX query runs on a Direct Lake semantic model. Before I could do that, though, my colleague Michael Kovalsky messaged me to say that not only had he taken the query from that first post and incorporated it in Semantic Link Labs, he’d done the work to get column sizes too. All that’s left for me to do, then, is give you some simple examples of how to use it.

Click through for those simple examples, though note that this requires Microsoft Fabric.
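The Semantic Link Labs function does the heavy lifting in Chris’s examples; for a rough idea of what is happening underneath, here is a hedged sketch that sums per-column storage sizes for a semantic model using plain Semantic Link and the DAX INFO functions. The dataset and workspace names are placeholders, and this is not the Labs function itself.

```python
# Rough sketch, not the Semantic Link Labs function Chris uses: sum per-column
# storage sizes for a semantic model via Semantic Link and DAX INFO functions.
import sempy.fabric as fabric

dataset = "MyDirectLakeModel"   # placeholder semantic model name
workspace = "MyWorkspace"       # placeholder workspace name

# INFO.STORAGETABLECOLUMNSEGMENTS() mirrors the DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS
# DMV, so each row is one column segment with its USED_SIZE in bytes.
segments = fabric.evaluate_dax(
    dataset,
    "EVALUATE INFO.STORAGETABLECOLUMNSEGMENTS()",
    workspace=workspace,
)
segments.columns = [c.strip("[]") for c in segments.columns]  # normalize DAX column names

# Approximate in-memory size per column = sum of its segment sizes.
column_sizes = (
    segments.groupby(["TABLE_ID", "COLUMN_ID"])["USED_SIZE"]
    .sum()
    .sort_values(ascending=False)
)
print(column_sizes.head(10))
```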

Missing Data in Microsoft Fabric Real-Time Intelligence Workloads

Greg Low covers a common scenario:

But another area that I see very few handling well is the data that is missing, rather than just the data that is present. There’s a huge difference between data that arrived and is odd, and data that just didn’t arrive at all.

One tool that’s great at working with streams of data is the Real-Time Intelligence workload for Microsoft Fabric. And it’s also great at working with data that is missing from those streams.

Greg covers some of the scenarios around missing data, though not a lot on the process to fix them.

Tenant Switching in Microsoft Fabric

Koen Verbeeck has good news:

Praise whatever deity you believe in, because it’s finally here: a tenant switcher for Microsoft Fabric (which includes Power BI). A what? Let me explain. When you have an organization with multiple tenants in Azure (also called directories in some products, like Azure DevOps), or you’re a consultant like me who works with multiple clients (each with their own tenant), it’s possible that you can log into multiple tenants using the same email address. This can happen if your user account was added as an external user to another tenant.

This has been a real pain, and unfortunately, that pain still exists for Power BI Desktop.

Cross-Workspace KQL Queries in Microsoft Fabric

Sandeep Pawar drinks your milkshake:

In Fabric, if you want to query a delta table from a lakehouse in another workspace, you create a shortcut to that table. Similarly, in Eventhouse, you can also create shortcuts to Eventhouses in other workspaces, but the option might not be immediately obvious in the GUI. If you click on New > OneLake shortcut, it creates a shortcut to a delta table, not an Eventhouse.

Click through to see how to do this via UI and programmatically.

SCD Types in Microsoft Fabric

Kenneth Omorodion reminds us that the Kimball model is still quite valuable:

In modern data warehousing, how we handle updates to dimension tables is crucial. There are several approaches, but the decision often comes down to two primary strategies: Slowly Changing Dimensions (SCD) Type 2 and overwriting tables. Each has its own benefits, use cases, and trade-offs. This tip will explore the two methods and why SCD Type 2 is often a better option in many data warehouse scenarios.

Read on for this overview of the benefits of type-2 slowly changing dimensions, as well as a little bit of coverage of several other types of slowly changing dimensions.
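
If you want a feel for what a type-2 load can look like in a Fabric notebook, here is a minimal PySpark sketch against a Delta table. It is not Kenneth’s code, and the table names (DimCustomer, stg_customer) and columns (CustomerID, City, Segment, IsCurrent, ValidFrom, ValidTo) are all assumptions; the staging table is assumed to share the dimension’s business columns.

```python
# Hedged sketch of an SCD Type 2 load on a Delta table; names are assumptions.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

dim = DeltaTable.forName(spark, "DimCustomer")   # existing dimension table
updates = spark.table("stg_customer")            # incoming snapshot of source rows

# Step 1: close out current rows whose tracked attributes have changed.
(
    dim.alias("d")
    .merge(updates.alias("s"), "d.CustomerID = s.CustomerID AND d.IsCurrent = true")
    .whenMatchedUpdate(
        condition="d.City <> s.City OR d.Segment <> s.Segment",
        set={"IsCurrent": "false", "ValidTo": "current_timestamp()"},
    )
    .execute()
)

# Step 2: append a fresh 'current' row for every changed or brand-new key.
open_keys = spark.table("DimCustomer").filter("IsCurrent = true").select("CustomerID")
new_rows = (
    updates.join(open_keys, "CustomerID", "left_anti")   # keys with no open row left
    .withColumn("IsCurrent", F.lit(True))
    .withColumn("ValidFrom", F.current_timestamp())
    .withColumn("ValidTo", F.lit(None).cast("timestamp"))
)
new_rows.write.format("delta").mode("append").saveAsTable("DimCustomer")
```

Overwriting the table, by contrast, is a single overwrite-mode write, which is exactly why it loses the history that type 2 keeps.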

Create a Case Insensitive Warehouse in Microsoft Fabric

Gilbert Quevauvilliers is speaking my language:

This is a quick blog post to show you how to use a Microsoft Fabric Notebook to quickly and easily create a Case Insensitive Warehouse.

Just a quick note: when I talk about a Case Insensitive Warehouse, what that means is that the upper casing and lower casing of column names and text are ignored. By default, Warehouses and Lakehouses are case sensitive in Microsoft Fabric.

Case sensitivity is a trap, so I applaud Gilbert’s commitment to excellence here.
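
The key detail is that the collation has to be chosen when the warehouse is created, which (at least as of Gilbert’s post) means going through the REST API rather than the portal. As a rough sketch of the pattern (not Gilbert’s exact notebook), assuming the documented items endpoint and its creationPayload defaultCollation setting:

```python
# Hedged sketch: create a Warehouse with a case-insensitive default collation
# from a Fabric notebook via the REST API. The warehouse name is a placeholder.
import sempy.fabric as fabric
from sempy.fabric import FabricRestClient

workspace_id = fabric.get_workspace_id()   # workspace the notebook is running in
client = FabricRestClient()

payload = {
    "type": "Warehouse",
    "displayName": "CI_Warehouse",          # placeholder name
    "description": "Warehouse with a case-insensitive default collation",
    "creationPayload": {
        "defaultCollation": "Latin1_General_100_CI_AS_KS_WS_SC_UTF8"
    },
}

response = client.post(f"/v1/workspaces/{workspace_id}/items", json=payload)
print(response.status_code)   # 201 or 202 means the create request was accepted
```

The collation can’t be changed after the fact, so this is a decision to make up front.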

The Challenge of Importing Items into a Fabric Workspace

Marc Lelijveld performs an airing of grievances:

Obviously, you don’t want to start every solution from scratch. Therefore, it might be beneficial to kick-start your new solution by just importing components you already developed at earlier stages. Recently, I wanted to import a notebook to a Fabric workspace but was a bit confused. In this blog, I will further elaborate on the confusion and show how, in the end, I successfully imported the notebook to the workspace.

Read on for a story of pain.
