Press "Enter" to skip to content

Category: Microsoft Fabric

Sending Azure Cost Management Data to Azure Data Explorer

Brad Watts writes out some cost data:

Understanding your Azure spend is one of the most important things you do as an Azure customer. Azure Cost Management is built into the platform to provide you with insights. But we live in a world of data, and looking at the Azure Cost Management data in a silo may not meet your organization’s needs. In those situations, we can solve that need by putting your Cost Management data into an analytical platform like Azure Data Explorer or Microsoft Fabric KQL Database. Here we can bring in or join additional data that’s useful, run ad-hoc queries, and build visualizations tying it all together.

Using the below repository, you’ll be able to utilize Azure Cost Management exports to set up an automated process that ingests the cost data into ADX or Fabric KQL Database.

There are several steps involved, but as Brad points out, you can do this either with Microsoft Fabric or with classic Azure Data Factory + Azure Data Explorer. I’d also throw in Azure Synapse Analytics, but that’s not as in vogue anymore.
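If you want a feel for what the ADX ingestion step can look like in code, here is a minimal Python sketch using the azure-kusto-ingest SDK. This is not the exact mechanism from Brad's repository: the cluster, database, table, and file names are all placeholders, and it assumes the target table already exists with a schema matching the export.

```python
# Hedged sketch: queue an Azure Cost Management export file for ingestion into ADX.
# Assumes a pre-created table named CostData in a database named CostManagement;
# all cluster/database/table/file names below are placeholders.
from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import IngestionProperties, QueuedIngestClient

# Queued ingestion goes through the cluster's "ingest-" endpoint
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://ingest-<cluster>.<region>.kusto.windows.net"
)
client = QueuedIngestClient(kcsb)

props = IngestionProperties(
    database="CostManagement",
    table="CostData",
    data_format=DataFormat.CSV,  # Cost Management exports are CSV files
)

# Queue the exported cost file (e.g. downloaded from the export's storage account);
# header handling and column mappings are omitted here for brevity.
client.ingest_from_file("part_0_0001.csv", ingestion_properties=props)
```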

Werner Zirkel also has a great comment showing how you can cut out most of the steps with Event Grid.


Microsoft Fabric Presentations

Wolfgang Strasser opens a vault:

Are you searching for Microsoft Fabric presentations? Do you want to learn more about the new unified analytics solution?

There are plenty of presentations available around the internet – some only as recordings, some only as PDFs.

BUT – last week, I found a (now no longer) hidden gem of Microsoft Fabric content on the internet – the Microsoft Fabric Readiness repository.

Click through for the link to those presentations.


Enabling Staging for Microsoft Fabric Dataflows

Chris Webb shares some thoughts:

If you read this post that was published on the Fabric blog back in July, you’ll know that each Power Query query in a Fabric Gen2 dataflow has a property that determines whether its output is staged or not – where “staged” means that the output is written to the (soon-to-be hidden) Lakehouse linked to the dataflow, regardless of whether you have set a destination for the query output to be written to. Turning this on or off can have a big impact on your refresh times, making them a lot faster or a lot slower.

Chris shares a simple example of when staging might not be reasonable. This is going to be the less common scenario, however.


Querying the Power BI REST API from Fabric Spark

Gerhard Brueckl makes the call:

Microsoft Fabric has a lot of different components which usually work very well together. However, even though Power BI is a fundamental part of Fabric, there is not really a tight integration between Data Engineering components and Power BI. In this blog post I will show you an easy and reusable way to query the Power BI REST API via Fabric SQL in a very straightforward way. The extracted data can then be stored in the data lake, e.g. to create a history of your dataset refreshes, the state of your workspaces, or any other information that is provided by the REST API.

Click through for a list of operations, followed by the code you’ll need to pull this off.
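As a rough illustration of the pattern (not Gerhard's actual code), here is what calling the Power BI REST API from a Fabric notebook and landing the result in a lakehouse table might look like. The workspace and dataset IDs are placeholders, and it assumes the notebook's mssparkutils can issue a Power BI token and that a lakehouse is attached to the notebook.

```python
# Hedged sketch of querying the Power BI REST API from a Fabric notebook.
# Assumes the "pbi" token audience is available (as it is in Fabric notebooks)
# and that `spark` is the session the notebook provides.
import requests
from notebookutils import mssparkutils  # available in Fabric notebooks

token = mssparkutils.credentials.getToken("pbi")
headers = {"Authorization": f"Bearer {token}"}

# List refresh history for a dataset; the IDs below are placeholders
workspace_id = "<workspace-id>"
dataset_id = "<dataset-id>"
url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/datasets/{dataset_id}/refreshes"
refreshes = requests.get(url, headers=headers).json()["value"]

# Persist the refresh history as a Delta table in the attached lakehouse
df = spark.createDataFrame(refreshes)
df.write.mode("append").format("delta").saveAsTable("dataset_refresh_history")
```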


Multiple Workspaces and Microsoft Fabric Git Integration

Kevin Chant can’t stop at one:

In this post I want to cover working with Microsoft Fabric Git integration and multiple workspaces, by highlighting one method that you can use in the real world.

I must admit that I have been very keen to test this particular way of working with Microsoft Fabric Git integration and multiple workspaces.

By the end of this post, you will know one way that you can work with Microsoft Fabric Git integration and multiple workspaces, based on real-world working practices, including multiple branches and pull requests.

Click through to see what Kevin has in mind using Azure DevOps.


Storing Log Analytics Data in the Microsoft Fabric Lakehouse

Gilbert Quevauvilliers needs a place to store this data:

Following on in my series, in this blog post I am going to use Dataflow Gen2 in Microsoft Fabric to load the data into a lakehouse table.

Doing this will allow me to store the data in a delta lake table.

In this series I am going to show you all the steps I did to have the successful outcome I had with my client.

Click through for links to the first two parts of the series, as well as a step-by-step guide for part 3.
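Gilbert's approach is built on Dataflow Gen2 rather than code, but if you are curious what the destination amounts to, here is a small notebook-side sketch of writing a Delta table in a Fabric lakehouse. The table and column names are invented for the example, and this is an alternative illustration rather than part of Gilbert's dataflow.

```python
# Hedged illustration of what "storing data in a delta lake table" looks like
# from a Fabric notebook. Table and column names are made up for the example;
# `spark` is the session that Fabric notebooks provide.
from pyspark.sql import Row

sample_rows = [
    Row(TimeGenerated="2023-09-01T00:00:00Z", Category="Usage", Quantity=42.0),
    Row(TimeGenerated="2023-09-01T01:00:00Z", Category="Usage", Quantity=17.5),
]
df = spark.createDataFrame(sample_rows)

# Writing with the delta format creates or appends to a managed lakehouse table
df.write.mode("append").format("delta").saveAsTable("log_analytics_usage")
```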


Comparing the Microsoft Fabric Data Wrangler and Power Query Editor

Reza Rad performs a comparison:

Power Query Editor and Data Wrangler are data transformation and preparation tools in Microsoft Fabric. There are similarities between these two tools. However, there are differences, too. It is essential to know the capabilities of each tool to understand which one should be used for what purpose and scenario. In this article, this is our quest.

Reza includes a video and an article, as well as a summary chart at the bottom.


Accessing OneLake Files from Power BI Desktop

Marc Lelijveld reads a file:

Fabric content is all over the place by now. In Fabric, as a SaaS platform, most (if not all) services have interconnectivity. In a few clicks, you can connect your web-developed Power BI dataset to a lakehouse or warehouse to fetch data from OneLake. But what about Power BI Desktop? You might have uploaded some files to OneLake which you cannot access from Power BI Desktop.

In this blog I’ll explain how you can connect to OneLake data using Power BI Desktop!

This turns out to be a bit trickier than I would have expected. Hopefully the experience gets better over time.
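Marc's post is about the Power BI Desktop experience, but as a point of reference, OneLake also exposes an ADLS Gen2-compatible endpoint, so a programmatic read looks roughly like the sketch below. This is not Marc's technique, and the workspace, lakehouse, and file names are placeholders.

```python
# Hedged sketch of reading a OneLake file through its ADLS Gen2-compatible
# endpoint with the Azure SDK. Workspace, lakehouse, and file names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

# In OneLake, the workspace acts as the filesystem (container)
fs = service.get_file_system_client("MyWorkspace")
file_client = fs.get_file_client("MyLakehouse.Lakehouse/Files/sales.csv")

data = file_client.download_file().readall()
print(data[:200])
```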


Git Integration for Power BI Reports in Microsoft Fabric

Kevin Chant gives Git integration a try:

To manage expectations, this post covers:

  1. Brief overview of Microsoft Fabric Git integration.
  2. How I converted a Power BI report to a Power BI Desktop project containing metadata files.
  3. Converting the folder that contains the Desktop project into a Git repository.
  4. Synchronizing the Git repository with Azure DevOps.
  5. Setting up Microsoft Fabric Git integration.
  6. Initial tests.
  7. Interesting workaround to deploy a second Power BI report using metadata.

Read on for Kevin’s thoughts.
