
Category: Microsoft Fabric

OneLake Security and the Fabric SQL Analytics Endpoint

Freddy Santos takes us through the latest with respect to security in OneLake:

OneLake Security centralizes fine-grained data access for Microsoft Fabric data items and enforces it consistently across engines.
Currently in Preview and opt-in per item, it lets you define roles over tables or folders and optionally add Row-Level Security (RLS) and Column-Level Security (CLS) policies. These definitions govern what users can see across Fabric experiences.

Read on to see what you can do.
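To make the role concepts concrete, here is a minimal, purely conceptual Python sketch of how a role combining a row-level (RLS) predicate and a column-level (CLS) allow-list behaves. The role name, data, and functions are invented for illustration; this is not the OneLake Security API, which you define through Fabric itself.

```python
# Conceptual sketch of role-based RLS + CLS filtering.
# Mimics the *effect* of OneLake Security policies in plain Python;
# names and structures here are illustrative, not a Fabric API.

rows = [
    {"region": "EMEA", "customer": "Contoso", "revenue": 100},
    {"region": "AMER", "customer": "Fabrikam", "revenue": 250},
]

# A role pairs a row predicate (RLS) with a set of visible columns (CLS).
roles = {
    "emea_analyst": {
        "rls": lambda r: r["region"] == "EMEA",   # row-level: only EMEA rows
        "cls": {"region", "customer"},            # column-level: hide revenue
    }
}

def visible(role_name, data):
    """Return only the rows and columns this role is allowed to see."""
    role = roles[role_name]
    return [
        {k: v for k, v in row.items() if k in role["cls"]}
        for row in data
        if role["rls"](row)
    ]

print(visible("emea_analyst", rows))
# the EMEA row only, with the revenue column stripped out
```

The key idea the sketch captures is that the policy lives with the data item, so every engine reading it sees the same filtered view.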

Comments closed

Microsoft Fabric Direct Lake Join Index Creation

Phil Seamark explains a recent change:

If you’ve been working with Direct Lake in Microsoft Fabric, you’ll know its magic resides in its ability to quickly load data into semantic models from OneLake when needed, eliminating the overhead of importing. But until recently, the first query on a cold cache might feel sluggish. One reason is that Direct Lake must build a join index during that first query: a critical structure that maps relationships between tables for efficient lookups.

Earlier, this process was single-threaded and slow, especially on large tables with high cardinality. The good news? That’s changed.

Read on to see how, what a join index is, and what this impact looks like in practice.
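To see why building this structure up front pays off, here is a minimal sketch of a join index as a hash map from join-key value to row positions. This illustrates the general data structure, not Direct Lake's actual implementation; the tables are invented.

```python
# Minimal sketch of a join index: a one-time hash map from join-key value
# to row positions, so repeated lookups during a join are O(1) instead of
# a full scan. Illustrates the concept, not Direct Lake's internals.

from collections import defaultdict

def build_join_index(rows, key):
    index = defaultdict(list)
    for pos, row in enumerate(rows):
        index[row[key]].append(pos)
    return index

dim = [{"id": 1, "name": "Bikes"}, {"id": 2, "name": "Helmets"}]
fact = [{"product_id": 2, "qty": 5}, {"product_id": 1, "qty": 3},
        {"product_id": 2, "qty": 1}]

idx = build_join_index(fact, "product_id")   # paid once, on the "cold" query
# Each dimension row now finds its matching fact rows without rescanning:
matches = {d["name"]: [fact[p]["qty"] for p in idx[d["id"]]] for d in dim}
print(matches)
```

The one-time cost of the `build_join_index` step is exactly what made cold-cache queries sluggish; parallelizing that build is the change Phil describes.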

Comments closed

Learning Microsoft Fabric Real-Time Intelligence

Valerie Junk picks up a new skill:

If you are reading this article on my website, chances are you know me from my Power BI content, the videos, articles, tutorials, or downloads, or you came across it on LinkedIn. I want to be upfront: I am a front-end/business person. I create reports that lead to action and help businesses make smarter decisions while building a data-driven strategy.

When I started talking about Fabric Real-Time Intelligence, people were surprised. Some were curious. Others probably wondered what had happened. For me, real-time reports push you to approach design in a completely different way because users need to take action immediately. Decisions happen in the moment, and that changes everything about how you visualize and structure information, so that got me interested!

Read on to see how Valerie picked up KQL as a language, as well as some of the challenges involved. I will say, the Eventhouse is also the fastest mechanism Microsoft has to query large amounts of data in Microsoft Fabric—it beats out the lakehouse and warehouse pretty handily.

Comments closed

Automating Power BI Load Testing via Fabric Notebook

Gilbert Quevauvilliers grabs a query:

Load testing is essential when working with Microsoft Fabric capacity. With limited resources, deploying a Power BI report without testing can lead to performance issues, downtime, and frustrated users. In this series, I’ll show you how to automate load testing using Fabric Notebooks, making the process faster, easier, and repeatable.

Inspired by Phil Seamark’s approach, this method eliminates manual complexity and allows you to capture real user queries for accurate testing.

Read on for the first part, in which Gilbert uses the Performance Analyzer to capture query details.
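As a rough illustration of the replay step that follows the capture, here is a hedged sketch of firing captured DAX queries concurrently and timing them. The `run_query` function is a stand-in: in a real Fabric notebook you would send the DAX to the semantic model (for example via the Power BI ExecuteQueries REST API); here it only simulates the round trip, and the queries are invented examples.

```python
# Sketch of replaying captured DAX queries under simulated concurrent load.
# `run_query` is a placeholder -- the real version would call the semantic
# model; the query strings are illustrative, not from Gilbert's post.

import time
from concurrent.futures import ThreadPoolExecutor

captured_queries = [  # DAX captured from Performance Analyzer (illustrative)
    "EVALUATE TOPN(10, Sales)",
    "EVALUATE SUMMARIZECOLUMNS('Date'[Year], \"Total\", [Sales Amount])",
]

def run_query(dax):
    start = time.perf_counter()
    time.sleep(0.01)          # placeholder for the real query round trip
    return dax, time.perf_counter() - start

def load_test(queries, users=4):
    # Simulate `users` concurrent viewers each replaying the captured set.
    with ThreadPoolExecutor(max_workers=users) as pool:
        return list(pool.map(run_query, queries * users))

timings = load_test(captured_queries)
print(f"{len(timings)} queries replayed")
```

Because the whole loop lives in a notebook, the same capture-and-replay run is repeatable against each new report version.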

Comments closed

Copying Data across Tenants with Fabric Data Factory

Ye Xu makes use of the Copy job:

Copy job is the go-to solution in Microsoft Fabric Data Factory for simplified data movement, whether you’re moving data across clouds, from on-premises systems, or between services. With native support for multiple delivery styles, including bulk copy, incremental copy, and change data capture (CDC) replication, Copy job offers the flexibility to handle a wide range of data movement scenarios—all through an intuitive, easy-to-use experience. Learn more in What is Copy job in Data Factory – Microsoft Fabric | Microsoft Learn.

With Copy job, you can also perform cross-tenant data movement between Fabric and other clouds, such as Azure. It also enables cross-tenant data sharing within OneLake, allowing you to copy data across Fabric Lakehouse, Warehouse, and SQL DB in Fabric between tenants with SPN support. This blog provides step-by-step guidance on using Copy job to copy data across different tenants.

Click through for a demonstration, as well as the security permissions that are necessary for this to work.

Comments closed

Checking Direct Lake Model Sources

Nikola Ilic wants to know if Direct Lake is using OneLake or SQL:

In my recent Microsoft Fabric training, I’ve been explaining the difference between the Direct Lake on OneLake and Direct Lake on SQL, as two flavors of Direct Lake semantic models. If you are not sure what I’m talking about, please start by reading this article. The purpose of this post is not to examine the differences between these two versions, but rather to clarify some nuances that might occur. One of the questions I got from participants in the training was:

“How do we KNOW if the Direct Lake semantic model is created as a Direct Lake on OneLake or Direct Lake on SQL model?”

Read on for that answer.

Comments closed

Job-Level Bursting in Microsoft Fabric Spark Jobs

Santhosh Kumar Ravindran announces a new feature:

  • Enabled (Default): When enabled, a single Spark job can leverage the full burst limit, consuming up to 3× CUs. This is ideal for demanding ETL processes or large analytical tasks that benefit from maximum immediate compute power.
  • Disabled: If you disable this switch, individual Spark jobs will be capped at the base capacity allocation. This prevents a single job from monopolizing the burst capacity, thereby preserving concurrency and improving the experience for multi-user, interactive scenarios.

Read on for the list of caveats and the note that it will cost extra money to flip that switch.
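The arithmetic behind the toggle is simple enough to sketch. The numbers below are illustrative (an F64 capacity has 64 CUs, and the post describes a 3× burst limit for a single job); the function name is invented for the example.

```python
# Back-of-the-envelope sketch of the job-level bursting switch described
# above. Illustrative numbers: F64 = 64 CUs base, 3x burst factor.

def max_cus_for_job(base_capacity_cus, bursting_enabled, burst_factor=3):
    """CUs a single Spark job may consume under each setting."""
    if bursting_enabled:
        return base_capacity_cus * burst_factor  # one job can take the full burst
    return base_capacity_cus                     # one job capped at base capacity

print(max_cus_for_job(64, bursting_enabled=True))    # a single job can reach 192 CUs
print(max_cus_for_job(64, bursting_enabled=False))   # a single job is capped at 64 CUs
```

The trade-off follows directly: enabled favors one heavy ETL job finishing sooner; disabled preserves headroom for many concurrent interactive users.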

Comments closed

Monitoring Microsoft Fabric Costs

Chris Webb uses a report:

Following on from my blog post a few months ago about cool stuff in the Fabric Toolbox, there is now another really useful solution available there that anyone with Fabric capacities should check out: Fabric Cost Analysis (or FCA). If you have Fabric capacities it’s important to be able to monitor your Azure costs relating to them, so why not monitor your Fabric costs using a solution built using Fabric itself? This is what the folks behind FCA (who include Romain Casteres, author of this very useful blog post on FinOps for Fabric, plus Cédric Dupui, Manel Omani and Antoine Richet) decided to build and share freely with the community.

Click through to see how it works, and check out the FCA link in the graf above to get the code.

Comments closed

Customer-Managed Keys in Microsoft Fabric

Sumiran Tandon makes an announcement:

Customer-managed keys were launched in preview, offering workspace administrators the ability to use keys in Azure Key Vault and Managed HSM to protect data in certain Fabric items. Now, we are extending the encryption support to more Fabric workloads. You can now create Fabric Warehouses and Notebooks, and utilize the SQL analytics endpoint, in workspaces enabled with encryption using your keys. The changes are rolling out and should be available in all regions over the next few days.

Freddy Santos digs into what this means for Fabric Warehouse and the SQL analytics endpoint:

Fabric already ensures that your data is encrypted at rest using Microsoft-managed keys. But for many organizations—especially in regulated industries—encryption alone isn’t enough. They need the ability to control and manage the keys that protect their data, aligning with internal compliance requirements, regulatory standards, and governance best practices.

I know that there are plenty of companies for which this is absolutely necessary for adopting a product, but I should point out that even without bringing your own key, Microsoft encrypts your data at rest using its own generated keys.

Comments closed