Press "Enter" to skip to content

Category: Microsoft Fabric

Sending Data from Power Automate to Microsoft Fabric

Chris Webb uses Eventstreams:

Fabric’s Real-Time Intelligence features are, for me, the most interesting things to learn about in the platform. I’m not going to pretend to be an expert in them – far from it – but they are quite easy to use and they open up some interesting possibilities for low-code/no-code people like me. The other day I was wondering if it was possible to send events and data from Power Automate to Fabric using Eventstreams and it turns out it is quite easy to do.

Read on to see just how easy it is. There’s also a good question from a reader about using other languages, such as PowerShell; it turns out the answer is yes.
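If you’d rather send events from a script than from a flow, here’s a rough Python sketch of the same idea. It assumes you’ve added a custom endpoint source to your Eventstream and copied its Event Hub-compatible connection string and entity name from the Fabric portal; both values below are placeholders.

    # A minimal sketch: send one event to a Fabric Eventstream custom endpoint using
    # the Event Hubs SDK. The connection string and entity name are placeholders
    # copied from the Eventstream's custom endpoint details in the Fabric portal.
    import json
    from azure.eventhub import EventData, EventHubProducerClient

    CONNECTION_STR = "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=..."
    EVENTHUB_NAME = "es_xxxxxxxx"  # the entity name shown next to the connection string

    producer = EventHubProducerClient.from_connection_string(
        conn_str=CONNECTION_STR, eventhub_name=EVENTHUB_NAME
    )
    with producer:
        batch = producer.create_batch()
        batch.add(EventData(json.dumps({"source": "script", "reading": 42})))
        producer.send_batch(batch)  # the event lands in the Eventstream for routing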


Power BI Writeback via Fabric SQL Database

Jon Voge gives us a use case for Fabric SQL Databases:

Until recently, Fabric has allowed us to choose between Lakehouses and Warehouses as a backend. For write-back use cases, neither are ideal.

  • The SQL endpoints of Lakehouses are read-only, making writes from Power Apps impossible.
  • While the SQL endpoints of Warehouses are write-enabled, they do not support enforced primary keys, which are a hard requirement for Power Apps to be able to write directly to a data source.

Jon briefly describes two mechanisms people used and then how you can do this more effectively with a Fabric SQL Database. Based on the article, it seems that you could probably still do the same with an Azure SQL Database, though I suppose handling the managed identity could be an issue.
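As a rough illustration of why that primary key point matters: a Fabric SQL Database speaks ordinary T-SQL over a TDS endpoint, so a write-back target table with an enforced primary key looks just like it would in Azure SQL Database. The sketch below assumes pyodbc with ODBC Driver 18 and a connection string copied from the database’s settings in the Fabric portal; the server and database names are placeholders.

    # A minimal sketch (not Jon's walkthrough): create a write-back table with an
    # enforced primary key in a Fabric SQL Database. Server and database names are
    # placeholders from the database's connection strings page in the Fabric portal.
    import pyodbc

    conn = pyodbc.connect(
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=your-server.database.fabric.microsoft.com;"  # placeholder
        "Database=YourFabricSqlDb;"                          # placeholder
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;"
    )
    cursor = conn.cursor()
    cursor.execute("""
        CREATE TABLE dbo.BudgetWriteback (
            RowId      INT            NOT NULL CONSTRAINT PK_BudgetWriteback PRIMARY KEY,
            CostCenter NVARCHAR(50)   NOT NULL,
            Amount     DECIMAL(18, 2) NOT NULL
        );
    """)
    conn.commit()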


Analyzing Semantic Model Logs via Microsoft Fabric

Sandeep Pawar parses the logs:

Workspace Monitoring was one of my favorite announcements at MS Ignite ‘24 this week. It logs events from Fabric items such as Semantic Models, Eventhouse, and GraphQL to a KQL database that’s automatically provisioned and managed in that workspace. Currently it’s limited to these three items, but hopefully others (especially Spark and pipelines) will be added soon. Read the announcement by Varun Jain (PM, Microsoft) on this for details.

Click through for some thoughts from Sandeep, as well as a variety of useful queries.
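If you want to poke around the monitoring database yourself before borrowing Sandeep’s queries, here is a rough sketch of connecting from Python with the azure-kusto-data package. The query URI and database name are placeholders you’d copy from the monitoring Eventhouse in the workspace, and the .show tables command is just a starting point to see which log tables exist.

    # A minimal sketch: list the tables in a workspace monitoring KQL database.
    # The query URI and database name are placeholders taken from the Fabric portal.
    from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

    QUERY_URI = "https://your-eventhouse.z0.kusto.fabric.microsoft.com"  # placeholder
    DATABASE = "Monitoring KQL database"                                 # placeholder

    kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(QUERY_URI)
    client = KustoClient(kcsb)

    response = client.execute_mgmt(DATABASE, ".show tables")
    for row in response.primary_results[0]:
        print(row["TableName"])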


Obtaining VisualIDs for Visuals in a Power BI Report

Sandeep Pawar checks for ID:

Log Analytics and Workspace Monitoring in Fabric log all the activities of datasets in a workspace. These logs contain dataset, report, and visual IDs, which the user has to decipher to get the full picture. Dataset and report IDs are straightforward, but it’s not easy to get visual IDs programmatically. Chris Webb already has a blog post on a couple of different ways to get the visual IDs. That blog was published in 2022, and in the Fabric world we now have a couple more options.

Read on for two additional methods you can use.
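One of the older, manual routes (not the new options Sandeep covers) is to record a Performance Analyzer session in Power BI Desktop, export it to JSON, and dig the visual IDs out of the export. The sketch below doesn’t assume the exact field names in that file; it just walks the JSON and collects any string value whose key looks like a visual ID.

    # A minimal sketch of the manual Performance Analyzer route, not the new methods
    # in Sandeep's post. Export Performance Analyzer data from Power BI Desktop, then
    # point this at the exported JSON file (the default file name is assumed below).
    import json

    def find_visual_ids(node, found=None):
        """Recursively collect string values whose key names look like a visual ID."""
        if found is None:
            found = set()
        if isinstance(node, dict):
            for key, value in node.items():
                if "visual" in key.lower() and "id" in key.lower() and isinstance(value, str):
                    found.add(value)
                find_visual_ids(value, found)
        elif isinstance(node, list):
            for item in node:
                find_visual_ids(item, found)
        return found

    with open("PowerBIPerformanceData.json", encoding="utf-8") as f:
        data = json.load(f)

    for visual_id in sorted(find_visual_ids(data)):
        print(visual_id)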


Converting an Excel Workbook to CSV via Microsoft Fabric

Jared Westover builds a Data Factory job:

After a two-year break, I started working with Azure Data Factory again, now part of the Fabric family. I quickly adapted to Data Factory since it closely resembled SQL Server Integration Services (SSIS), a tool with which I had a love-hate relationship. For my new mission, I set out to convert a list of files from Excel to comma-separated values (CSV). We upload the original Excel files to a Data Lake in Fabric. We then need to convert a specific worksheet and move the CSV files to a different folder in the Data Lake.

Read on to see what Jared came up with.
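Jared’s solution is a Data Factory pipeline. For comparison, here’s a rough notebook-flavored alternative in Python with pandas, assuming the workbooks sit in the Files area of a lakehouse attached to a Fabric notebook; the paths and worksheet name are placeholders.

    # A minimal sketch of the same conversion in a Fabric notebook with pandas, not
    # Jared's Data Factory pipeline. Paths and the worksheet name are placeholders;
    # with a default lakehouse attached, its Files area is mounted at /lakehouse/default/Files.
    import glob
    import os

    import pandas as pd

    SOURCE_DIR = "/lakehouse/default/Files/excel_in"  # placeholder
    TARGET_DIR = "/lakehouse/default/Files/csv_out"   # placeholder
    SHEET_NAME = "Data"                               # the specific worksheet to convert

    os.makedirs(TARGET_DIR, exist_ok=True)
    for xlsx_path in glob.glob(os.path.join(SOURCE_DIR, "*.xlsx")):
        df = pd.read_excel(xlsx_path, sheet_name=SHEET_NAME)  # requires openpyxl
        csv_name = os.path.splitext(os.path.basename(xlsx_path))[0] + ".csv"
        df.to_csv(os.path.join(TARGET_DIR, csv_name), index=False)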


Working with the Microsoft Fabric API for GraphQL

Nikola Ilic parses some data:

“We are creating a custom dashboard using code, and we need the data stored inside Microsoft Fabric. Can we access it in another way than via SQL Analytics Endpoint?”

This is a real-life customer requirement we’ve encountered recently. And the short answer is: Yes, you can! For the longer answer, we encourage you to read this article and understand how to leverage the Fabric API for GraphQL feature for an enhanced data retrieval experience compared to the traditional REST API approach.

Click through for an excerpt from a book that Nikola and Ben Weissman are writing.
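To give a flavor of what calling the endpoint looks like, here’s a rough Python sketch. The endpoint URL is whatever Fabric shows when you copy it from your API for GraphQL item, the customers type and its fields are hypothetical, and the token scope is an assumption.

    # A minimal sketch of calling a Fabric API for GraphQL endpoint. The endpoint URL
    # is a placeholder (copy the real one from the GraphQL item in the portal), the
    # "customers" type and its fields are hypothetical, and the scope is an assumption.
    import requests
    from azure.identity import InteractiveBrowserCredential

    ENDPOINT = "https://<your-graphql-endpoint>/graphql"  # placeholder
    SCOPE = "https://api.fabric.microsoft.com/.default"   # assumed scope

    token = InteractiveBrowserCredential().get_token(SCOPE).token

    query = """
    query {
      customers(first: 5) {
        items { CustomerID CustomerName }
      }
    }
    """

    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {token}"},
        json={"query": query},
    )
    response.raise_for_status()
    print(response.json()["data"])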


Setting a Default Destination for Fabric Dataflows Gen2

Jon Voge wants to spend less time copying and pasting:

Ever had a Dataflow Gen2 in which you needed to map the output of several queries to the same Warehouse or Lakehouse? It takes a while to set up, right?

If you wish to add a default destination to your Dataflow, all you need to do is create the Dataflow from inside your desired destination. This works for Warehouses, Lakehouses, and KQL Databases:

Click through for an example of how it works.


SQL Database in Microsoft Fabric

Nikola Ilic covers a new addition to the Microsoft Fabric family:

Now, let’s get back to the previous point: SQL database in Fabric is a SaaS Azure SQL DB… Generally speaking, in SaaS solutions, “everything just works” (or at least should work) – without (too much) intervention from your side.

In the context of the SQL database in Fabric, creating a database is probably the most straightforward process of database creation you will ever experience, as I’ll show you in the “HOW TO” section of this article. From that point, everything happens automatically: the database will be automatically configured and will automatically scale both in terms of compute resources and storage. In addition, database backups are performed automatically, indexing also happens the same way, as well as all patches and software/hardware fixes. You want more? No more complex firewall rules and permission settings – this time, everything is done via Fabric workspace roles and item permissions, while the well-known SQL native features allow for more granular control.

This is more of a head-scratcher for me than a brilliant solution. I get that there’s a challenge in figuring out what you want with Azure SQL Database: single database or elastic pool, serverless or provisioned, vCore or DTU-based pricing model, General Purpose or Hyperscale or Business Critical (for vCore), Basic or Standard or Premium (for DTU), one of about five separate hardware configurations, etc.

From the standpoint of “I just want a database, please,” Fabric SQL Database is a lot easier. The problem comes in when you hit the use cases that necessitated all of these options to begin with, at which point you’re back to the original creation screen and outside of Fabric once more.


Querying a Fabric KQL Database via REST API

Sandeep Pawar grabs some data:

I have previously explained how to query a KQL database in a notebook using the Kusto Spark connector, Kusto Python SDK, and KQLMagic. Now, let’s explore another method using the REST API. Although this is covered in the ADX documentation, it isn’t covered (with an example) in the Fabric documentation, so I wanted to write a quick blog post to show how you can query a table from an Eventhouse using the REST API.

Click through to see how you can do it. Sandeep’s code is in Python, but because this is just hitting a REST API rather than using a library, you could also use a tool like Postman.
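Here’s a rough sketch of the idea (not Sandeep’s exact code): the Kusto REST API exposes a v1 query endpoint that takes a POST with the database name and a KQL query in the body. The query URI is a placeholder copied from the KQL database’s details page, and MyTable is a hypothetical table name.

    # A minimal sketch (not Sandeep's exact code): query a Fabric Eventhouse via the
    # Kusto REST API. The query URI is a placeholder from the KQL database's details
    # page, and MyTable is a hypothetical table name.
    import requests
    from azure.identity import InteractiveBrowserCredential

    QUERY_URI = "https://your-eventhouse.z0.kusto.fabric.microsoft.com"  # placeholder
    DATABASE = "MyKqlDatabase"                                           # placeholder

    token = InteractiveBrowserCredential().get_token(f"{QUERY_URI}/.default").token

    response = requests.post(
        f"{QUERY_URI}/v1/rest/query",
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        json={"db": DATABASE, "csl": "MyTable | take 10"},
    )
    response.raise_for_status()
    tables = response.json()["Tables"]
    print(tables[0]["Rows"])  # the first result table holds the query rows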


Tracking Column Sizes on DAX Queries

Chris Webb busts out the calculator:

I had meant to follow up my recent post on how to find the columns touched by a DAX query by writing one on how to use this technique to find the size of these columns in memory, so you can find the total size of the columns that need to be paged into memory when a DAX query runs on a Direct Lake semantic model. Before I could do that, though, my colleague Michael Kovalsky messaged me to say that not only had he taken the query from that first post and incorporated it in Semantic Link Labs, he’d done the work to get column sizes too. All that’s left for me to do, then, is give you some simple examples of how to use it.

Click through for those simple examples, though note that this requires Microsoft Fabric.
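If you want to see the raw numbers that sort of helper works from, one manual route (not the Semantic Link Labs function Chris describes) is to run a DAX INFO function through Semantic Link in a Fabric notebook; the model name below is a placeholder.

    # A rough sketch of the manual route, not the Semantic Link Labs helper from the
    # post: pull per-column segment sizes for a semantic model with a DAX INFO
    # function via Semantic Link. The dataset name is a placeholder.
    import sempy.fabric as fabric

    segments = fabric.evaluate_dax(
        dataset="My Direct Lake Model",  # placeholder semantic model name
        dax_string="EVALUATE INFO.STORAGETABLECOLUMNSEGMENTS()",
    )

    # Each row is one column segment; summing the used-size column per column is a
    # rough way to see how much memory a column takes once it has been paged in.
    print(segments.head(20))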
