
Day: April 10, 2026

Reviewing Blocking and Deadlocking

Erik Darling digs into more functionality in his performance monitoring tool:

In this video, I delve into the blocking and deadlock monitoring capabilities of Free SQL Server Performance Monitoring, a tool I’ve developed and made available on GitHub. With a focus on practicality and ease of use, I explain how we leverage extended events for both blocking and deadlock scenarios, ensuring that you can identify and address performance issues efficiently. Whether you’re using system health or prefer to rely on the blocked process report, my monitoring tool streamlines the process by automatically setting up necessary configurations and providing easy-to-understand visualizations of your SQL Server’s performance trends over time.

Click through for the video and transcript.


Defining a Data Contract

Buck Woody becomes accountable:

A businessperson pulls a report from a data warehouse, runs the same query they’ve used for two years, and gets a number that doesn’t match what the finance team presented at yesterday’s board meeting. Nobody changed the report. Nobody changed the dashboard. But somewhere upstream, an engineering team renamed a field, shifted a column type, or quietly altered the logic in a pipeline, and nobody thought to mention it because there was no mechanism to mention it.

While we think of this as an engineering failure, it’s more of an implied contract failure. More precisely, it’s the absence of a formal contract. Data contracts are one of the most practical tools a data organization can adopt, and one of the most underused. The idea is not complicated: a data contract is a formal, enforceable agreement between the team that produces data and the team that consumes it. It defines what the data looks like, what quality standards it must meet, who owns it, and what happens when something changes. Think of it as the API layer for your data, the same guarantee a software engineer expects from a well-documented endpoint, applied to the datasets and pipelines your business depends on. This post is about why that matters at the CDO level and how to get them put in place.

Click through to learn more about what data contracts are and why they are useful. This post stays at the architectural level rather than the practitioner level, but lays out why it’s important to think about these sorts of things.
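Buck Woody's definition above — what the data looks like, what quality standards it must meet, who owns it, and what happens when something changes — maps naturally onto a small amount of code. The sketch below is a hypothetical illustration in Python, not anything from the linked post: the dataset name, field names, and rules are invented, and real implementations typically use a schema registry or a tool-specific contract format rather than a hand-rolled class.

```python
from dataclasses import dataclass, field

# A minimal, hypothetical data contract: the producing team declares the
# schema, an accountable owner, and a non-null quality rule; the consuming
# team validates records against it before trusting a downstream report.
@dataclass
class DataContract:
    name: str
    owner: str                       # team accountable for changes
    schema: dict                     # column name -> expected Python type
    required_non_null: set = field(default_factory=set)

    def validate(self, record: dict) -> list:
        """Return a list of violations; an empty list means conformance."""
        problems = []
        for column, expected_type in self.schema.items():
            if column not in record:
                problems.append(f"missing column: {column}")
            elif record[column] is None:
                if column in self.required_non_null:
                    problems.append(f"null in required column: {column}")
            elif not isinstance(record[column], expected_type):
                problems.append(
                    f"{column}: expected {expected_type.__name__}, "
                    f"got {type(record[column]).__name__}"
                )
        return problems

# The scenario from the excerpt: an engineering team quietly renames a
# field upstream. With a contract in place, validation fails loudly
# instead of a report silently drifting from the finance team's numbers.
contract = DataContract(
    name="sales_daily",
    owner="data-engineering",
    schema={"order_id": int, "revenue": float, "region": str},
    required_non_null={"order_id", "revenue"},
)

good = {"order_id": 1, "revenue": 99.5, "region": "EMEA"}
renamed = {"order_id": 2, "rev_amt": 99.5, "region": "EMEA"}

print(contract.validate(good))     # → []
print(contract.validate(renamed))  # → ['missing column: revenue']
```

The point of the "API layer for your data" analogy is visible here: the contract object is the documented endpoint, and a schema change becomes a negotiated, versioned event between owner and consumers rather than a silent pipeline edit.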


Mirroring SQL Server 2025 to Microsoft Fabric

Reitse Eskens digs in:

Maybe you’ve read my blog post in the DP-700 certification series about mirroring data. You can find that one here. This blog will be longer and more technical. And involve SQL Server. To make reading a little easier, I’ve listed the Microsoft Learn pages at the end of this blog post.

While writing the DP-700 post, I realised I wanted to dig a little deeper. Not only because I’m presenting a session on this subject, but also to learn more about the processes behind it. And, there’s SQL Server involved, something I still have a soft spot for in my heart. Or maybe even more than that.

The fact that your SQL Server instance has to be Arc-enabled is a bit annoying.


Updates to Fabric Eventstream

Alicia Li and Arindam Chatterjee share some updates:

Over the first quarter of 2026, Fabric Eventstreams shipped meaningful improvements across three themes that have repeatedly come up in feedback from our broad community of customers and partners: broader connectivity, richer real-time processing, and secure enterprise-ready networking and operations.

This post highlights some of the most impactful new Eventstreams-related features and capabilities delivered between January and March 2026.

Click through to see what’s new. Some of this is GA, though a good amount is in preview.
