Press "Enter" to skip to content

Category: Microsoft Fabric

Deploying Assets via Azure DevOps and fabric-cicd into Microsoft Fabric

Kevin Chant pushes some code:

In this post I want to show how you can operationalize fabric-cicd to work with Microsoft Fabric and Azure DevOps, which I exclusively revealed at Power BI Gebruikersdag over the weekend.

Just so that everybody is aware, fabric-cicd is a Python library that allows you to perform CI/CD of various Microsoft Fabric items into Microsoft Fabric workspaces. At this moment in time there is a limited number of supported item types. However, that list is increasing.

Click through to see what Kevin did and how it worked out.
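
If you have not used the library before, the basic shape of a fabric-cicd deployment script looks something like the sketch below. Treat it as a minimal outline rather than Kevin's exact pipeline: the workspace ID, repository path, and item types are placeholders, and authentication is assumed to come from the environment (for example, a service principal configured in the Azure DevOps pipeline).

    from fabric_cicd import FabricWorkspace, publish_all_items, unpublish_all_orphan_items

    # Point the library at the target workspace and the repo folder holding the
    # exported item definitions (placeholder values throughout)
    target_workspace = FabricWorkspace(
        workspace_id="<target-workspace-id>",
        repository_directory="./workspace",
        item_type_in_scope=["Notebook", "DataPipeline", "Environment"],
    )

    # Publish everything in scope, then remove items no longer present in the repo
    publish_all_items(target_workspace)
    unpublish_all_orphan_items(target_workspace)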


Writing Data into a Microsoft Fabric Lakehouse via Notebook

Stepan Resl writes some code:

Since Lakehouse is one of the key items within Microsoft Fabric, it is important to know how to write data into it in various formats and using different tools. One of the most common tools is notebooks, as they provide great flexibility and speed for development and testing with graphical outputs. In this article, I want to focus primarily on the following types of notebooks:

  • PySpark
  • Python

Click through to see how it works in both notebook types.
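
As a flavor of the difference, here is a minimal sketch of writing the same small table from each notebook type. The table name and data are made up, and the pure-Python variant assumes the deltalake package plus a default lakehouse attached to the notebook.

    # PySpark notebook: write a DataFrame as a Delta table in the attached lakehouse
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
    df.write.format("delta").mode("overwrite").saveAsTable("demo_table")

    # Python notebook: no Spark, so write with the deltalake package instead,
    # targeting the default lakehouse's Tables folder via its local mount path
    import pandas as pd
    from deltalake import write_deltalake

    pdf = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})
    write_deltalake("/lakehouse/default/Tables/demo_table", pdf, mode="overwrite")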


Retrieving Microsoft Fabric Items using a Python-Only Notebook

Gilbert Quevauvilliers doesn’t need Spark for this:

This blog below explains how to use a Python-only notebook to get all the Fabric items using the Fabric REST API.

NOTE: At the time of this blog post (Feb 2025), Dataflow Gen2 is not included in the Fabric items; I am sure it will be there in the future.

NOTE II: This only gets the Fabric Items, which does not include the Power BI Items.

Despite the notes, Gilbert leads off with the main reason why you might want to use this: it takes up approximately 5% of the capacity units that a Spark-based notebook does to perform the same operation.
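
For a sense of what that looks like, here is a rough sketch of calling the items endpoint from a Python notebook. This is not Gilbert's exact code: the workspace ID is a placeholder and the token audience passed to notebookutils may differ in your environment.

    # Python notebook: list items in a workspace via the Fabric REST API
    import requests
    import notebookutils  # pre-installed inside Fabric notebooks

    workspace_id = "<your-workspace-id>"  # placeholder
    token = notebookutils.credentials.getToken("pbi")  # Power BI / Fabric API audience

    url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items"
    headers = {"Authorization": f"Bearer {token}"}

    items = []
    while url:
        resp = requests.get(url, headers=headers)
        resp.raise_for_status()
        payload = resp.json()
        items.extend(payload.get("value", []))
        url = payload.get("continuationUri")  # follow paging when present

    for item in items:
        print(item["type"], item["displayName"])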


Refreshing Power BI Semantic Model Hidden Tables via Fabric Data Pipelines

Chris Webb digs into the dark underbelly of a semantic model:

Following on from my recent post about refreshing semantic models with Fabric Data Pipelines and the semantic model refresh activity, a few people asked me how to refresh hidden tables because they are not displayed in the Pipeline configuration UI. I got the answer from my colleague Alex Powers (aka reddit celebrity u/itsnotaboutthecell) who kindly allowed me to blog about it.

Click through for the demonstration.
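
For context, the underlying enhanced refresh API accepts table names whether or not those tables are hidden (hidden is only a display property), so the trick is getting the names into the payload. Here is a rough sketch of what such a request looks like as a raw REST call; the IDs, token handling, and table names are placeholders, and this is not necessarily how the pipeline activity is configured in Chris's post.

    # Sketch: request a refresh of specific (possibly hidden) tables via the
    # Power BI enhanced refresh API; all identifiers are placeholders
    import requests

    group_id = "<workspace-id>"
    dataset_id = "<semantic-model-id>"
    token = "<access-token>"  # acquired for the Power BI scope

    body = {
        "type": "full",
        "objects": [
            {"table": "HiddenStagingTable"},               # hidden tables can be named explicitly
            {"table": "Sales", "partition": "Sales2025"},  # or a single partition
        ],
    }

    resp = requests.post(
        f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes",
        headers={"Authorization": f"Bearer {token}"},
        json=body,
    )
    resp.raise_for_status()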


ETL Orchestration and Air Traffic Control

Jens Vestergaard extends a metaphor:

We have been working on getting an enterprise-grade, event-driven orchestration of our ETL system to operate like an airport control tower, managing a fleet of flights (data processes) as they progress through various stages of take-off, transit, and landing. All of this, because Microsoft Fabric has a core-based limit to the number of Notebook executions that a capacity can execute and have queued up in line for execution when invoking them using the REST API. Read the details here: limits (you know, it’s funny that there are no stated limits for Azure Service Bus Queues on number of messages in queue, but there is for Microsoft Fabric, which uses a Service Bus queue underneath…)

That limitation is a bit annoying, but read on to see how Jens uses this metaphor to explain the various parts of an ETL orchestration engine.
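
If you are invoking notebooks that way, the "control tower" essentially wraps calls like the sketch below: queue the job, and when the capacity refuses to take more, hold the flight and retry. The IDs, token handling, and exact throttling status codes are assumptions, not Jens's implementation.

    # Sketch: queue a notebook run via the Fabric REST API and back off when
    # the capacity's job queue is full (placeholders throughout)
    import time
    import requests

    def run_notebook(workspace_id: str, notebook_id: str, token: str) -> str:
        url = (f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
               f"/items/{notebook_id}/jobs/instances?jobType=RunNotebook")
        headers = {"Authorization": f"Bearer {token}"}

        while True:
            resp = requests.post(url, headers=headers)
            if resp.status_code == 202:
                # Job accepted; the Location header points at the job instance to poll
                return resp.headers["Location"]
            if resp.status_code in (429, 430):
                # Queue or rate limit hit: circle the airport and try again later
                time.sleep(int(resp.headers.get("Retry-After", "60")))
                continue
            resp.raise_for_status()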


Publishing a Fabric SQL Database

Koen Verbeeck deploys a database:

When a SQL Database is in Microsoft Fabric, you can develop it locally in a database project. As part of the development process, you want to deploy this project to the online Fabric SQL Database. The database project also contains pre- and/or post-deployment scripts that need to be executed as part of the deployment process. How can this goal be achieved?

Click through for the answer.
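
I will not spoil Koen's answer, but for reference, one common route for database projects in general is to build a .dacpac and publish it with SqlPackage, which runs any pre- and post-deployment scripts baked into the package. Below is a hedged sketch, wrapped in Python so it can run from a pipeline script; the endpoint, database name, and file path are placeholders, and this is not necessarily the approach Koen takes.

    # Sketch: publish a built .dacpac (including its pre/post-deployment scripts)
    # to a Fabric SQL Database endpoint with SqlPackage; values are placeholders
    import subprocess

    subprocess.run(
        [
            "SqlPackage",
            "/Action:Publish",
            "/SourceFile:bin/Release/MyDatabase.dacpac",
            "/TargetConnectionString:Server=<fabric-sql-endpoint>;"
            "Database=MyDatabase;Authentication=Active Directory Default;",
        ],
        check=True,
    )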


Microsoft Fabric Shortcuts and Lakehouse Maintenance

Dennes Torres has a public service announcement:

I wrote about lakehouse maintenance before, published videos about this subject, and provided sample code about it.

However, there is one problem: maintenance should never be executed over shortcuts.

The tables require maintenance in their original location. As our solutions advance, we start using shortcuts, lots of them. Our maintenance code should always skip shortcuts and perform maintenance only on the actual tables.

Click through to see how you can differentiate shortcuts from actual tables and write code to avoid shortcuts.
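
As one possible way to do that (not necessarily Dennes's approach), you can ask the OneLake shortcuts API which table names are shortcuts and exclude them from the maintenance loop. The IDs and token audience below are placeholders.

    # Sketch (PySpark notebook): skip shortcut tables when running maintenance
    import requests
    import notebookutils  # pre-installed inside Fabric notebooks

    workspace_id = "<workspace-id>"
    lakehouse_id = "<lakehouse-id>"
    token = notebookutils.credentials.getToken("pbi")  # Power BI / Fabric API audience

    # List the lakehouse's shortcuts and keep the ones that live under Tables/
    resp = requests.get(
        f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
        f"/items/{lakehouse_id}/shortcuts",
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    shortcut_tables = {
        s["name"]
        for s in resp.json().get("value", [])
        if s.get("path", "").strip("/").lower().startswith("tables")
    }

    for table in spark.catalog.listTables():
        if table.name in shortcut_tables:
            continue  # maintenance belongs to the table's original location
        spark.sql(f"OPTIMIZE {table.name}")
        spark.sql(f"VACUUM {table.name}")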


An Overview of Real-Time Intelligence in Microsoft Fabric

Christopher Schmidt lays out a use case:

Operational reporting and historical reporting serve distinct purposes in organizations. Historically, data teams have heavily leaned on providing historical reporting, as being able to report on the operational business processes has proved elusive.  

As a result, organizations have created reports directly against the operational database for operational needs, or spent significant effort trying to get analytical tools to refresh faster using ‘micro-batching’ and/or keeping a tool like Power BI in DirectQuery mode. These efforts come with the goal of ‘moving data through the system as fast as possible’.

Click through for an architecture diagram and an example scenario.


Default Tenant Settings Changes in Microsoft Fabric

Nicky van Vroenhoven notices a change:

In case you have access to the M365 Admin Center, or more specifically the M365 Message Center, you might have seen this message. I reckon not many people did. That’s why I’m blogging about it here.

I’m specifically talking about this message in the Message Center, which is flagged as a major update with admin impact:

Communications on default checkbox changes on tenant settings and billing start for SQL database in Fabric.

Read on for more information about what’s changing.


Spark Connector for Fabric Data Warehouse

Arshad Ali announces a connector:

We are pleased to announce the availability of the Fabric Spark connector for Fabric Data Warehouse (DW) in the Fabric Spark runtime. This connector enables Spark developers and data scientists to access and work with data from Fabric DW and the SQL analytics endpoint of the lakehouse, either within the same workspace or across different workspaces, using a simplified Spark API. The connector will be included as a default library within the Fabric Runtime, eliminating the need for separate installation.

Click through to check out its capabilities. This is a tiny step toward where I think Microsoft Fabric should go: any tool accessing the same data, eliminating separate lakehouses vs warehouses and having to remember that you can’t use this syntax in this scenario unless you connect to it this way and sacrifice one live chicken.
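
Per the connector's documented usage, reading looks roughly like the sketch below; the warehouse, schema, table, and workspace ID are placeholders.

    # Sketch (Fabric Spark notebook): read from a warehouse with the connector
    import com.microsoft.spark.fabric
    from com.microsoft.spark.fabric.Constants import Constants

    # Warehouse (or SQL analytics endpoint) in the same workspace
    df = spark.read.synapsesql("MyWarehouse.dbo.FactSales")

    # Warehouse in a different workspace, referenced by its workspace ID
    df_other = (
        spark.read
        .option(Constants.WorkspaceId, "<other-workspace-id>")
        .synapsesql("OtherWarehouse.dbo.DimDate")
    )

    df.show(5)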
