
Category: KQL

Medallion Architecture in Fabric Real-Time Intelligence

Tyler Chessman is like an onion:

Building a multi-layer, medallion architecture using Fabric Real-Time Intelligence (RTI) requires a different approach compared to traditional data warehousing techniques. But even transactional source systems can be effectively processed in RTI. To demonstrate, we’ll look at how sales orders (created in a relational database) can be continuously ingested and transformed through an RTI bronze, silver, and gold layer.

Read on to see how.
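In RTI, the hop from bronze to silver is typically handled by a table update policy rather than a scheduled batch job: every ingestion into the bronze table automatically triggers the transformation into silver. A minimal sketch of that pattern, with hypothetical table and column names:

```kusto
// Hypothetical sketch: a function that shapes raw sales orders.
.create-or-alter function TransformSalesOrders() {
    SalesOrders_Bronze
    | where isnotempty(OrderId)
    | project OrderId, CustomerId,
              OrderDate = todatetime(OrderDate),
              Amount = todecimal(Amount)
}

// Attach it to the silver table as an update policy, so each bronze
// ingestion immediately populates silver.
.alter table SalesOrders_Silver policy update
@'[{"IsEnabled": true, "Source": "SalesOrders_Bronze", "Query": "TransformSalesOrders()", "IsTransactional": false}]'
```

The gold layer can be built the same way, or as a materialized view over silver.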


Digging into the Kusto Detective Agency

Tom Zika becomes a gumshoe:

We need to answer this question:

Who is the detective that earned the most money in 2022?

We can see that only one table (DetectiveCases) was added in the ingestion section. Let’s take a look at its data.

The Kusto Detective Agency is a great way to apply KQL skills. I’m not sure it’s a fantastic experience for somebody with zero KQL knowledge, but if you’ve messed around at least a little bit with the language, this is a fun way of applying those skills.
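To give a taste of what the challenges ask of you, here is the general shape of a query that could answer that first question. The schema here is an assumption for illustration: a DetectiveCases table logging CaseOpened and CaseSolved events, with the bounty stored in a dynamic Properties column.

```kusto
// Pair each case's bounty with the detective who solved it first,
// then total earnings per detective.
DetectiveCases
| where EventType == "CaseOpened"
| project CaseId, Bounty = toreal(Properties.Bounty)
| join kind=inner (
    DetectiveCases
    | where EventType == "CaseSolved"
    | summarize arg_min(Timestamp, DetectiveId) by CaseId
  ) on CaseId
| summarize Earned = sum(Bounty) by DetectiveId
| top 1 by Earned
```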


Proactive Monitoring in Microsoft Fabric via Activator

Someleze Diko shows off a powerful feature in Microsoft Fabric:

Driving actions from real-time organizational data is important for making informed data-driven decisions and improving overall efficiency. By leveraging data effectively, organizations can gain insights into customer behaviour, operational performance, and market trends, enabling them to respond promptly to emerging issues and opportunities.

Setting alerts on KQL queries can significantly enhance this proactive approach, especially in scenarios such as customer support. For instance, by monitoring key metrics like response times, ticket volumes, and satisfaction scores, support teams can identify patterns and anomalies that may indicate underlying problems.
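As a sketch of the kind of query you might attach an Activator alert to, assuming a hypothetical SupportTickets table with response timestamps:

```kusto
// Flag 5-minute windows where support responsiveness degrades.
// Table, columns, and thresholds are illustrative assumptions.
SupportTickets
| where CreatedAt > ago(1h)
| extend ResponseMinutes = datetime_diff('minute', FirstResponseAt, CreatedAt)
| summarize AvgResponseMinutes = avg(ResponseMinutes), TicketCount = count()
          by bin(CreatedAt, 5m)
| where AvgResponseMinutes > 30 or TicketCount > 100
```

Activator can then send an e-mail or Teams notification whenever the query starts returning rows.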

This helps drive home an important mental shift around “real-time intelligence.” Ignoring my standard disdain for misuse of the term “real-time,” most people will ignore the feature because of a perfectly reasonable belief: my data doesn’t come in that frequently, so I don’t really need to process it in near-real-time. But the real-time intelligence functionality isn’t necessarily just about loading in your data and making it available to users faster. Instead, think of it as acting immediately when your data does change, especially if you have multiple sources of data loading at different times during the day.


An Overview of Real-Time Intelligence in Microsoft Fabric

Christopher Schmidt lays out a use case:

Operational reporting and historical reporting serve distinct purposes in organizations. Historically, data teams have heavily leaned on providing historical reporting, as being able to report on the operational business processes has proved elusive.  

As a result, organizations have created reports directly against the operational database for operational needs or spent significant effort trying to get analytical tools to refresh faster using ‘micro-batching’ and/or keeping a tool like Power BI in DirectQuery mode. These efforts come with the goal of ‘moving data through the system as fast as possible’. 

Click through for an architecture diagram and an example scenario.


A Dive into Microsoft Fabric Real-Time Intelligence

Nikola Ilic builds us a guide:

Once upon a time, handling streaming data was considered an avant-garde approach. Since the introduction of relational database management systems in the 1970s and traditional data warehousing systems in the late 1980s, all data workloads began and ended with the so-called batch processing. Batch processing relies on the concept of collecting numerous tasks in a group (or batch) and processing these tasks in a single operation. 

On the flip side, there is a concept of streaming data. Although streaming data is still sometimes considered a cutting-edge technology, it already has a solid history. Everything started in 2002, when Stanford University researchers published the paper called “Models and Issues in Data Stream Systems”. However, it wasn’t until almost one decade later (2011) that streaming data systems started to reach a wider audience, when the Apache Kafka platform for storing and processing streaming data was open-sourced. The rest is history, as people say. Nowadays, processing streaming data is not considered a luxury, but a necessity. 

This is all part of a book that Nikola and Ben Weissman are writing, and Nikola has an extended excerpt from the book available for us to read.


Kusto Query Performance in Microsoft Fabric

Dennes Torres checks some stats:

We already discovered how to investigate Kusto query history. Let’s discover how to analyse query performance considering the information on this history.

The query history returns three fields we can use to make a more detailed analysis of the queries: CachedStatistics, ScannedExtentsStatistics, and ResultsetStatistics.

Disclaimer: there is little to no documentation about this content, so what follows may not be 100% precise, but it will give you good guidance.

Click through to learn more about these three.
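In Azure Data Explorer, these fields come back from the .show queries management command, and you can pipe its output like any other tabular source. A hedged sketch, since the JSON property names inside these fields are lightly documented and should be treated as illustrative:

```kusto
// Surface the slowest recent queries alongside their extent-scan stats.
.show queries
| where StartedOn > ago(1d)
| extend Scanned = todynamic(ScannedExtentsStatistics)
| project Text, Duration,
          ScannedExtents = tolong(Scanned.ScannedExtentsCount),
          TotalExtents   = tolong(Scanned.TotalExtentsCount)
| top 10 by Duration
```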


A Quick Primer on KQL

Reitse Eskens takes us through a language:

This post can come as a shock if you’re used to writing T-SQL: not only is there more than one useful language for processing data (real-time data, in this case), but KQL has enough similarities to SQL to look familiar and is different enough to leave you flustered.

Now, to get a complete introduction into KQL or the Kusto Query Language, one blogpost (or video) would never be enough. There are so many operators that can fill an entire series on their own.

In this blog, the focus will be on the basic structure of KQL and a number of common operators. They will be compared with the counterparts in SQL for reference.

I fully agree with Reitse. I’ve put together a full-length talk on KQL and still feel like I’m covering the basics. It’s not that KQL is some monstrously complicated language, but it is different enough from other languages like T-SQL that you cannot easily apply knowledge from one to the other.
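As a quick illustration of that familiar-but-different feel, here is one hypothetical query expressed in both languages, with the T-SQL version shown in comments:

```kusto
// T-SQL: SELECT TOP 10 City, COUNT(*) AS Orders
//        FROM Sales
//        WHERE OrderDate >= DATEADD(DAY, -7, GETDATE())
//        GROUP BY City ORDER BY Orders DESC;
// KQL equivalent, against a hypothetical Sales table:
Sales
| where OrderDate >= ago(7d)
| summarize Orders = count() by City
| top 10 by Orders desc
```

The pipeline reads top to bottom, so the order you write operators in is the order they execute, which is one of the biggest mental shifts coming from SQL.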


Sending Alerts from Fabric Workspace Monitoring

Chris Webb has a new Bat-signal:

I’ve always been a big fan of using Log Analytics to analyse Power BI engine activity (I’ve blogged about it many times) and so, naturally, I was very happy when the public preview of Fabric Workspace Monitoring was announced – it gives you everything you get from Log Analytics and more, all from the comfort of your own Fabric workspace. Apart from my blog there are lots of example KQL queries out there that you can use with Log Analytics and Workspace Monitoring, for example in this repo or Sandeep Pawar’s recent post. However what is new with Workspace Monitoring is that if you store these queries in a KQL Queryset you can create alerts in Activator, so when something important happens you can be notified of it.

Read on to learn more.
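As an illustration, a queryset entry like the following could feed an Activator alert that fires when slow semantic model queries show up. The SemanticModelLogs table ships with Workspace Monitoring, though the specific columns and threshold here are assumptions for the sketch:

```kusto
// Count semantic model queries in the last 10 minutes that ran
// longer than 10 seconds, per item.
SemanticModelLogs
| where Timestamp > ago(10m)
| where OperationName == "QueryEnd" and DurationMs > 10000
| summarize SlowQueries = count() by ItemName
```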


Analyzing Semantic Model Logs via Microsoft Fabric

Sandeep Pawar parses the logs:

Workspace Monitoring was one of my favorite announcements at MS Ignite ‘24 this week. It logs events from Fabric items such as Semantic Models, Eventhouse, GraphQL to a KQL database that’s automatically provisioned and managed in that workspace. Currently it’s limited to these three items but hopefully others (especially Spark and pipelines) will be added soon. Read the announcement by Varun Jain (PM, Microsoft) on this for details. 

Click through for some thoughts from Sandeep, as well as a variety of useful queries.
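One example of the sort of trend analysis this enables, as a hedged sketch against that automatically provisioned KQL database (column names follow the preview and may change):

```kusto
// Daily query-duration percentiles per semantic model over the past week.
SemanticModelLogs
| where Timestamp > ago(7d) and OperationName == "QueryEnd"
| summarize p50 = percentile(DurationMs, 50),
            p95 = percentile(DurationMs, 95),
            Queries = count()
          by ItemName, bin(Timestamp, 1d)
| order by ItemName asc, Timestamp asc
```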


Querying a Fabric KQL Database via REST API

Sandeep Pawar grabs some data:

I have previously explained how to query a KQL database in a notebook using the Kusto Spark connector, Kusto Python SDK, and KQLMagic. Now, let’s explore another method using the REST API. Although this is covered in the ADX documentation, it isn’t in Fabric (with example), so I wanted to write a quick blog to show how you can query a table from an Eventhouse using a REST API.

Click through to see how you can do it. Sandeep’s code is in Python but because this is just hitting a REST API rather than using a library, you could also use some tool like Postman.
