Press "Enter" to skip to content

Category: Microsoft Fabric

Execute a Collection of Child Pipelines from Metadata in Data Factory

Andy Leonard continues a series on design patterns:

In this post, I clone and modify the dynamic parent pipeline from the previous post to retrieve metadata from an Azure SQL database table for several child pipelines, and then call each child pipeline from a parent pipeline.

When we’re done, this pipeline will:

  1. Read pipeline metadata from a table in an Azure SQL database
  2. Store some of the metadata (a collection of pipelineID values) in the (existing) pipelineIdArray variable
  3. Iterate the pipelineIdArray variable’s collection of pipelineID values
  4. Execute each child pipeline represented by each pipelineID value stored in the pipelineIdArray variable

Read on to learn how.
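
If you'd like to prototype the same metadata-driven pattern outside the Data Factory canvas, here is a minimal Python sketch of the four steps above. The metadata table and column names (dbo.pipeline_metadata, pipelineID), the authentication details, and the Fabric run-on-demand job endpoint are placeholders and assumptions for illustration; this is not Andy's implementation.

```python
# A sketch only: read child pipeline IDs from an Azure SQL metadata table,
# then trigger each one. Table, column, and endpoint details are assumptions.
import pyodbc
import requests

SQL_CONN_STR = ("Driver={ODBC Driver 18 for SQL Server};"
                "Server=myserver.database.windows.net;Database=Metadata;"
                "Authentication=ActiveDirectoryInteractive")
WORKSPACE_ID = "<workspace-guid>"
FABRIC_TOKEN = "<bearer token for the Fabric REST API>"

# 1. Read pipeline metadata from a table in an Azure SQL database
with pyodbc.connect(SQL_CONN_STR) as conn:
    rows = conn.execute(
        "SELECT pipelineID FROM dbo.pipeline_metadata WHERE enabled = 1"
    ).fetchall()

# 2. Store the pipelineID values in a list, mirroring the pipelineIdArray variable
pipeline_id_array = [row.pipelineID for row in rows]

# 3. and 4. Iterate the collection and execute each child pipeline on demand
for pipeline_id in pipeline_id_array:
    url = (f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
           f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")
    resp = requests.post(url, headers={"Authorization": f"Bearer {FABRIC_TOKEN}"})
    resp.raise_for_status()
    print(f"Triggered child pipeline {pipeline_id}")
```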


Analyzing Delta Table Measures in Microsoft Fabric

Sandeep Pawar has a script for us:

I have been sitting on this code for a long time. I shared the first version in one of my blogs on Direct Lake last year. I have been making updates to it since then as needed. I waited for the lakehouse schema to become available and then forgot to blog about it. Yesterday, someone reached out asking if the above could be used for warehouse delta tables in Fabric, so here you go. It’s 250+ lines so let me just explain what’s going on here:

Read on for the explanation, the script itself, a demonstration, and several additional notes.
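
Sandeep's script does much more than this, but for a taste of the kind of Delta metadata involved, here is a short PySpark sketch you could run in a Fabric notebook attached to a lakehouse. DESCRIBE DETAIL is standard Delta Lake; none of this is Sandeep's code.

```python
# Not Sandeep's script: a small sketch of pulling basic Delta table details
# for every table in the attached lakehouse. Assumes a Fabric Spark notebook,
# where the `spark` session is already available.
tables = [t.name for t in spark.catalog.listTables()]

for table_name in tables:
    # DESCRIBE DETAIL returns size, file count, partition columns, and more
    detail = spark.sql(f"DESCRIBE DETAIL `{table_name}`").collect()[0]
    print(f"{table_name}: {detail.numFiles} files, "
          f"{detail.sizeInBytes / (1024 ** 2):.1f} MB, "
          f"partitioned by {detail.partitionColumns}")
```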


Reviewing Kusto Query History in Microsoft Fabric

Dennes Torres looks over prior commands:

We can consume a Kusto database in Fabric from many different places: notebooks, semantic models, real-time dashboards, and more. Kusto registers all queries sent by the consumers in the query history.

Sometimes, either for logging purposes or to analyze and fix a bug, we need to identify the queries the database is receiving and executing.

Read on to see what you can do with query history in Kusto.
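
If you want to pull that history programmatically rather than through the Fabric UI, a minimal sketch with the azure-kusto-data Python client might look like this. The cluster URI and database name are placeholders, and the exact columns returned by .show queries may vary.

```python
# A minimal sketch (not from Dennes's article): query recent history from a
# Fabric KQL database. Cluster URI and database name are placeholders.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

cluster_uri = "https://<your-eventhouse>.kusto.fabric.microsoft.com"
database = "MyKqlDatabase"

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster_uri)
client = KustoClient(kcsb)

# .show queries is a Kusto management command that returns completed queries
# along with their text, user, duration, and resource usage
response = client.execute_mgmt(database, ".show queries | top 10 by StartedOn desc")

for row in response.primary_results[0]:
    print(row["StartedOn"], row["User"], str(row["Text"])[:80])
```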


Charting Microsoft Fabric Workspace Activity

Sandeep Pawar creates a chart:

Semantic Link Labs v0.8.3 has a list_activities method to get the list of all activities in your Fabric tenant. It uses the same Power BI Admin - Get Activity Events API, but this API now also includes Fabric activities. Note that this is an Admin API, so you need to be a Fabric administrator. Check the API details.

To answer the above question, I will use admin.list_activity_events, loop over the last 30 days, and plot the results by Fabric item type in my personal tenant:

Click through for that code. Though if you’re going to do something similar in your environment, I recommend not using a line chart for this visual, as line charts indicate a flow over time and this is definitely point-in-time categorical data. A bar chart or dot plot would be better in that case.
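
For what it's worth, here is a rough sketch of that approach with the categorical bar chart swapped in. It is not Sandeep's exact code: it assumes list_activity_events accepts ISO-8601 start_time/end_time strings (the underlying API only allows a window within a single day) and returns a pandas DataFrame, and the "Item Type" column name is an assumption.

```python
# Rough sketch, not Sandeep's code: count Fabric activities over the last 30
# days and plot them as a bar chart by item type. Column names are assumptions.
import datetime
import pandas as pd
import matplotlib.pyplot as plt
import sempy_labs.admin as admin

frames = []
today = datetime.date.today()
for offset in range(1, 31):
    day = today - datetime.timedelta(days=offset)
    # The Get Activity Events API only accepts a window within a single UTC day
    frames.append(admin.list_activity_events(
        start_time=f"{day}T00:00:00",
        end_time=f"{day}T23:59:59",
    ))

activities = pd.concat(frames, ignore_index=True)
counts = activities["Item Type"].value_counts()  # assumed column name

counts.plot(kind="bar")  # categorical counts, so a bar chart rather than a line
plt.ylabel("Activity count (last 30 days)")
plt.tight_layout()
plt.show()
```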


Reasons to Migrate from Synapse to Fabric

James Serra has a list:

Many customers ask me about the advantages of moving from Azure Synapse Analytics to Microsoft Fabric. Here’s a breakdown of the standout features that make Fabric an appealing choice:

  • Unified Environment for All Users
    Fabric serves everyone—from report writers and citizen developers to IT engineers—unlike Synapse, which primarily targets IT professionals.
  • Hands-Free Optimization
    Fabric is auto-optimized and fully integrated, allowing most features to perform well without requiring technical adjustments.

I suppose that James is too politic to give what I’d consider the top reason: because there have actually been meaningful updates to Microsoft Fabric in the past year. I’m not sure you can really say the same thing about Azure Synapse Analytics.

The tricky part about this, however, is that–to my knowledge, at least–there’s no clean way to migrate dedicated SQL pools.


Move Data between Lakehouses and Workspaces in Microsoft Fabric

Gilbert Quevauvilliers performs an exfiltration:

With the new schemas in a Lakehouse, it is now possible to read from Lakehouse A (in Workspace A) and write to Lakehouse B (in Workspace B).

Here are more details about the Schema preview: Lakehouse schemas (Preview) – Microsoft Fabric | Microsoft Learn

This opens a whole new world of possibilities.

I also really like the fact that I can simply use the names, and I do not need to get the actual GUIDs!

For example, I can use the format shown below, which is WorkspaceName.LakehouseName.SchemaName.TableName

Click through to see it in action.
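
As a minimal PySpark sketch of the idea, with placeholder workspace, lakehouse, schema, and table names (and assuming both lakehouses have the schema preview enabled):

```python
# Placeholder names throughout: read from a schema-enabled lakehouse in
# Workspace A and write to one in Workspace B, using names rather than GUIDs.
df = spark.sql("SELECT * FROM WorkspaceA.LakehouseA.dbo.SalesOrders")

(df.write
   .mode("overwrite")
   .saveAsTable("WorkspaceB.LakehouseB.dbo.SalesOrders"))
```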


Fabric List Connections API in Semantic Link Labs

Sandeep Pawar has an update for us:

In case you missed it, the List Connections Admin API is now live in Fabric. It was shipped in Semantic Link Labs v0.7.4 a few weeks ago, but at the time of the release it was still private. This API returns all the connections set up in the tenant and requires admin privileges. I still can't find documentation on it, so wait for the official details. Note that this API is different from the Item – List Connections API, which lists connections used by an item.

Read on to see what you can get from it.


Notes on Bursting and Monitoring in Microsoft Fabric

Joey D’Antoni puts in a lot of work and then coasts for a few hours:

Bursting and smoothing are a concern for a lot of Fabric admins—it's somewhat unpredictable how large the initial burst will be and how long the smoothing process takes. One problem that a number of MVPs and customers have observed is Fabric effectively being unavailable for things like dashboard refreshes for long periods of time. Microsoft describes this process in the docs as follows:

“When a scale factor is over 1, it means that burstable capacity is being used to meet the demands of the workload. It also means that your workload is borrowing capacity units from a future time interval. This is a fundamental concept of Microsoft Fabric called smoothing.

Smoothing offers relief for customers who create sudden spikes during their peak times, while they have a lot of idle capacity that is unused. Smoothing simplifies capacity management by spreading the evaluation of compute to ensure that customer jobs run smoothly and efficiently.”

Read on to learn more about this process and how it can be tricky for administrators to manage.
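
To make the "borrowing from a future time interval" idea concrete, here is a small, purely illustrative calculation. The numbers are invented; the 30-second evaluation interval and the 24-hour smoothing window for background operations come from the Fabric capacity documentation.

```python
# Illustrative numbers only: how a spike above provisioned capacity is smoothed
# (repaid) across future intervals instead of being throttled on the spot.
capacity_cu = 64                       # e.g., an F64 capacity
interval_seconds = 30                  # Fabric evaluates usage in 30-second intervals
budget_per_interval = capacity_cu * interval_seconds   # CU-seconds per interval

job_usage = 9_600                      # CU-seconds a spiky job consumed in one interval
scale_factor = job_usage / budget_per_interval
print(f"Scale factor: {scale_factor:.1f}")   # > 1 means burstable capacity was used

# Background operations are smoothed over 24 hours, so the overage is repaid
# in small slices from future intervals rather than all at once.
smoothing_intervals = 24 * 60 * 60 // interval_seconds
overage = job_usage - budget_per_interval
print(f"Repaid per future interval: {overage / smoothing_intervals:.2f} CU-seconds")
```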


Lexing DAX with PyDAX

Sandeep Pawar reviews a DAX lexer:

The power of open source and GenAI. Klaus Jürgen Folz recently open-sourced the PyDAX library, which parses DAX expressions to extract or remove comments and identify referenced columns and measures. I used that library to create some demos for myself and then shared the notebook, along with instructions, with Replit agents to build an app for me. Fifteen minutes and three prompts later, I had a fully functional app. Give it a try: https://daxparser.replit.app/

Read on to learn more, including why I referred to PyDAX as a “lexer” and a few more notes of relevance.
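
If "lexer" sounds abstract: the core job is tokenizing the raw DAX text into pieces like comments, column references, and measure references. The toy regex sketch below is plain Python, not PyDAX (whose actual API I have not reproduced here), but it shows the general idea.

```python
# A toy illustration of lexing DAX text: pull out comments, Table[Column]
# references, and bare [Measure] references. Not PyDAX; just the general idea.
import re

dax = """
-- margin as a share of sales
Margin % =
DIVIDE ( [Sales Amount] - SUMX ( Sales, Sales[Unit Cost] * Sales[Quantity] ),
         [Sales Amount] )  /* divide-by-zero handling omitted */
"""

comments = re.findall(r"--[^\n]*|/\*.*?\*/", dax, flags=re.DOTALL)
columns = re.findall(r"'?[A-Za-z_][\w ]*'?\[[^\]]+\]", dax)   # Table[Column]
measures = re.findall(r"(?<![\w'\]])\[[^\]]+\]", dax)         # bare [Measure]

print("Comments:", comments)
print("Column references:", columns)
print("Measure references:", measures)
```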
