Press "Enter" to skip to content

Category: Microsoft Fabric

Materializing Lake Views in Microsoft Fabric

Sairam Yeturi reduces ETL and ELT requirements:

Organizations often face challenges when trying to scale analytics across large volumes of data stored in centralized SQL databases. As business teams demand faster, more tailored insights, traditional reporting pipelines can become bottlenecks. By adopting Lakehouse architecture with Microsoft Fabric, business groups can mirror their SQL data into OneLake and organize it using the Medallion architecture—Bronze, Silver, and Gold layers. Materialized lake views play a crucial role in this setup, enabling automated, declarative transformations that clean and enrich data in the Silver layer. This empowers teams to build reliable dashboards and AI-driven insights on top of curated data, all while maintaining performance, governance, and security at scale.

In this post, we will cover how enterprises can use materialized lake views to streamline data orchestration and enhance data quality and monitoring across the Silver and Gold layers, while mirroring their SQL database tables into Fabric in the Bronze layer.

The best use case for this is a scenario in which your underlying data is already essentially in a star schema or at least easily transformable into one, and you have no interest in modifying the data in the view directly. Do read the limitations before digging in, though, as there are some big ones.
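To make the idea concrete, here is a minimal sketch of a declarative Silver-layer transformation defined as a materialized lake view from a Fabric notebook, assuming a schema-enabled lakehouse. The table and column names (bronze.orders and so on) are placeholders of mine, not anything from the post.

```python
# Hypothetical example: define a materialized lake view over a mirrored
# Bronze table so Fabric maintains the cleaned Silver output for you.
# In a Fabric notebook, `spark` is already available.
spark.sql("""
CREATE MATERIALIZED LAKE VIEW IF NOT EXISTS silver.orders_clean
AS
SELECT
    order_id,
    customer_id,
    CAST(order_date AS DATE)  AS order_date,
    UPPER(TRIM(country_code)) AS country_code,
    total_amount
FROM bronze.orders
WHERE order_id IS NOT NULL
""")
```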


Passing Selections from Visuals to Translytical Task Flows

Jon Vöge sends along some data:

A common misconception about Translytical Task Flows is that the only way to parameterize and pass user inputs to the User Data Function is through slicers in Power BI.

That is not true at all.

In fact, one of the most powerful ways of integrating Task Flows into your Power BI reports is by allowing user selections made in visualisations in your report to flow through to your task flow.

Read on to see how you can do this.
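For context, the User Data Function on the receiving end is just a parameterized Python function; whether the values arrive from a slicer or from a visual selection, the function signature looks the same. A minimal sketch, with illustrative names (log_selection, product_key) of my own:

```python
# Minimal sketch of a Fabric User Data Function that a translytical task
# flow could call; the parameter values can be fed from report selections.
import fabric.functions as fn

udf = fn.UserDataFunctions()

@udf.function()
def log_selection(product_key: int, comment: str) -> str:
    # A real function might write back to a SQL database here; this one
    # just echoes what the report passed in.
    return f"Received product {product_key}: {comment}"
```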


Microsoft Fabric Pipeline Copy Job Activity in Preview

Connie Xu makes an announcement:

We’re thrilled to announce that the Copy job Activity is now in Preview! 

This new orchestration activity brings the simplicity of the Copy job item directly into your Microsoft Fabric Data Factory pipelines, enabling you to manage data movement alongside transformations, notifications, and more, all in one place.

Read on to learn more about it, including how it differs from the Copy activity and the Copy job item.


Microsoft Fabric Service Principal API Settings

Nicky van Vroenhoven has a public service announcement:

Microsoft Fabric is changing how service principal access to public APIs is controlled. The existing all-or-nothing tenant setting was split into two separate settings — giving us admins more granular control, but also introducing a change you might need to act on after August 1, 2025.

Click through to see how you might have been able to learn this, as well as the consequences of this change.
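If you want to verify what your service principals can still reach after the change, a quick smoke test is to call a public Fabric API with one. A hedged sketch using azure-identity and requests; the tenant, app, and secret values are placeholders:

```python
# Hypothetical smoke test: list workspaces via the Fabric public API using
# a service principal, the kind of access these tenant settings govern.
import requests
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<app-id>",
    client_secret="<client-secret>",
)
# Acquire a token scoped to the Fabric API.
token = credential.get_token("https://api.fabric.microsoft.com/.default")

resp = requests.get(
    "https://api.fabric.microsoft.com/v1/workspaces",
    headers={"Authorization": f"Bearer {token.token}"},
)
resp.raise_for_status()  # a 401/403 here suggests the tenant setting blocks you
print(resp.json())
```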


Community Resources for Power BI and Microsoft Fabric

Chris Webb highlights some community efforts:

There are a lot of really cool free, community-developed tools and resources out there for Power BI and Fabric – so many that it’s easy to miss announcements about them. In this post I thought I’d highlight a few that came out recently and which you might want to check out.

Click through for several good resources, and there are a couple of additional ones in the comments as well.


Item History in the Microsoft Fabric Capacity Metrics App

Ope Aladekomo announces a new feature:

We’re thrilled to announce the Preview of the Item History page in the latest version of the Microsoft Fabric Capacity Metrics App. The Item History page provides a 30-day compute usage analysis through dynamic visuals and slicers, enabling users to explore both high-level consumption trends and granular item-level metrics. This page helps you understand how individual items and operations contribute to overall capacity usage.

Click through to see a picture of the page, as well as some of the information you can glean from it.


Scheduling Copy Jobs in Microsoft Fabric

Ye Xu can run more than once:

Copy job is the go-to solution in Microsoft Fabric Data Factory for simplified data movement. With native support for multiple delivery styles, including bulk copy, incremental copy, and change data capture (CDC) replication, Copy job offers the flexibility to handle a wide range of scenarios—all through an intuitive, easy-to-use experience.

In this update, we’re excited to announce a powerful new enhancement: multiple scheduler support. This gives you even greater control over when your data moves.

Click through for a screenshot showing how you can set up multiple schedules for a specific copy job. Based on the screenshot, it seems that there is a limit to the number of schedules you can create, though that number (20) is large enough that I couldn’t imagine it being a major impediment for most people.
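The post shows the UI route, but Fabric also exposes a Job Scheduler REST API for creating item schedules, which should be the programmatic counterpart. A heavily hedged sketch: the workspace and item IDs are placeholders, the "CopyJob" job type value and the exact payload shape are my assumptions, so verify them against the documentation before relying on this.

```python
# Hypothetical example: add an extra schedule to an item via Fabric's
# Job Scheduler REST API. IDs, job type, and token are placeholders.
import requests

workspace_id = "<workspace-id>"
item_id = "<copy-job-item-id>"
token = "<bearer-token-for-api.fabric.microsoft.com>"

url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
    f"/items/{item_id}/jobs/CopyJob/schedules"  # job type value is an assumption
)
payload = {
    "enabled": True,
    "configuration": {
        "type": "Daily",
        "startDateTime": "2025-09-01T00:00:00",
        "endDateTime": "2025-12-31T23:59:00",
        "localTimeZoneId": "Central Standard Time",
        "times": ["02:00", "14:00"],  # two runs per day on this one schedule
    },
}
resp = requests.post(url, json=payload, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
```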


Using the Tabular Object Model via Semantic Link Labs

Gilbert Quevauvilliers does a bit of connecting:

In this blog post I am going to show you how to use the powerful Semantic Link Labs library with the Tabular Object Model (TOM) for semantic model manipulation.

The goal of this blog post is to give you an understanding of how to connect using TOM, then based on the documentation use one of the functions.

Don’t get me wrong, the documentation is great, but when implementing it, it works a little differently, and I want others to know how to use it so it can automate and simplify some repetitive tasks.

Read on for the instructions and some of the things you can do with the Semantic Link Labs library in Microsoft Fabric.
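As a taste of what the connection looks like, here is a minimal sketch of opening a model with Semantic Link Labs' TOM wrapper in a Fabric notebook and adding a measure. The dataset, workspace, table, and measure names are placeholders of mine:

```python
# Hypothetical example: connect to a semantic model through Semantic Link
# Labs' TOM wrapper and make a change. Changes made with readonly=False
# are saved back when the context manager exits.
from sempy_labs.tom import connect_semantic_model

with connect_semantic_model(
    dataset="Sales Model", workspace="My Workspace", readonly=False
) as tom:
    # Add a measure to the model via TOM.
    tom.add_measure(
        table_name="Sales",
        measure_name="Total Sales",
        expression="SUM(Sales[SalesAmount])",
    )
    # Walk the model's measures to confirm the addition.
    for m in tom.all_measures():
        print(m.Name)
```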


Loading Data from Network-Protected Storage Accounts into OneLake

Matt Basile grabs some data:

AzCopy is a powerful and performant tool for copying data between Azure Storage and Microsoft OneLake, and is the preferred tool for large-scale data movement due to its ease of use and built-in performance optimizations. AzCopy now supports copying data from firewall-enabled Azure Storage accounts into OneLake using trusted workspace access. Now you can use AzCopy to load data from even network-protected storage accounts, letting you effortlessly load data into OneLake without compromising on security or performance.

Click through for an explanation of trusted workspace access, followed by the steps to try it out for yourself.
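Once trusted workspace access is configured on the storage account, the copy itself is a single AzCopy call. A hedged sketch wrapping it in Python; the account, workspace, and path names are placeholders:

```python
# Hypothetical example: copy from a firewalled storage account into OneLake
# Files with AzCopy. Assumes you are already signed in (e.g., `azcopy login`)
# and the storage account's firewall trusts the Fabric workspace.
import subprocess

src = "https://mystorageacct.blob.core.windows.net/raw/sales/"
dst = ("https://onelake.blob.fabric.microsoft.com/"
       "MyWorkspace/MyLakehouse.Lakehouse/Files/sales/")

# Older AzCopy releases may also need --trusted-microsoft-suffixes to cover
# the OneLake domain; newer ones trust it out of the box.
subprocess.run(
    ["azcopy", "copy", src, dst, "--recursive"],
    check=True,  # raise if azcopy exits non-zero
)
```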


Equalizing Proxy vs Redirect Rates for OneLake Access

Elizabeth Oldag announces a pricing change:

We’re thrilled to share a major update and simplification to OneLake’s capacity utilization model that will make it even easier to manage Fabric capacity and scale your data workloads. We are reducing the consumption rate of OneLake transactions via proxy to match the rate for transactions via redirect. This means you no longer have to worry about where you are accessing your OneLake data from (via proxy or redirect); both will consume your capacity at the same low rate.

Read on to see what this means in practice.
