
Day: January 30, 2026

Automatic Stats Updates and Plan Cache Invalidation

Brent Ozar threads the needle:

Normally, when SQL Server updates statistics on an object, it invalidates the cached plans that rely on that statistic as well. That’s why you’ll see recompiles happen after stats updates: SQL Server knows the stats have changed, so it’s a good time to build new execution plans based on the changes in the data.

However, updates to system-created stats don’t necessarily cause plan recompiles.

Read on for the rare situation in which invalidation doesn’t happen.
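
If you want to see which statistics on a table are system-created (and how stale they are), a minimal sketch along these lines works; `dbo.Sales` is a hypothetical table name:

```sql
-- List statistics on a table, flagging system-created ones (the _WA_Sys_* stats
-- whose updates may skip plan invalidation, per Brent's post). dbo.Sales is a placeholder.
SELECT
    s.name                  AS stats_name,
    s.auto_created,         -- 1 = system-created statistic
    s.user_created,
    sp.last_updated,
    sp.modification_counter -- rows modified since the last stats update
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID(N'dbo.Sales');
```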

Also, Brent has me wondering if the lemon popcorn is more sweet/tart (like a lemon bar) or savory (like lemon pepper) and I may have to try both styles.


Management Plane and Data Plane Permissions

Rebecca Lewis continues a series on cloud data platform security:

My last post ended with a promise to explain the management plane vs. data plane split in practical terms. The short story? There are two separate permission systems — one for managing resources, one for accessing data — and they don’t talk to each other. The good news: you already understand this model. You just don’t recognize it yet.

Click through for the explanation and a simile involving SQL Server’s security model.
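
As one concrete illustration of the split: in Azure SQL Database, a management-plane role like Owner or Contributor lets someone deploy or delete the server but grants no access to the data inside it. That access has to be created on the data-plane side, inside the database itself. A minimal sketch, with `user@contoso.com` as a hypothetical Entra ID principal:

```sql
-- Data-plane permissions live inside the database, independent of Azure RBAC.
-- Run in the target database as an Entra ID admin; the principal is a placeholder.
CREATE USER [user@contoso.com] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [user@contoso.com];
```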


Testing the Performance of Direct Lake vs Import Mode for Semantic Models

Gilbert Quevauvilliers performs some testing:

In this blog post I am going to show you how I completed the automated testing, and then the results, where I compare Direct Lake, Import, and DirectQuery to see which one appears to be the best.

As always, your testing may vary or be different from my tests below.

I would highly recommend that you use the method I have used and apply this testing technique to your data to understand and validate which semantic model would be best for your capacity.

Click through for details on the tests, query durations, and how the three major modes of data loading into Microsoft Fabric semantic models (Import, Direct Lake, DirectQuery) fare.


Defining Technical Debt

Louis Davidson takes on a favorite phrase of many an IT person:

Ah, the term “technical debt.” The implication of it is that you have this wonderful idea, this glorious design, and for time/money reasons, you said “we can’t achieve this.” I am not sure there has ever been a project that didn’t have technical debt. It happens in software, it happens in the real world. You probably have technical debt in your house, and huge companies like Disney make these glorious plans that never quite get finished.

Click through for a link to Louis’s video. As for my own definition of technical debt, I wrote a blog post about it a while back. As of this moment, the only part I might debate myself on is whether “It was a good decision at the time, but times have changed” is really technical debt or if it’s something else. From an ontological perspective, it’s probably a different category of thing. But from the standpoint of a practitioner with a code base or infrastructure as it is, I don’t know that it matters all that much whether we call it “technical debt” or “the ever-changing sands of time ruining all that is great.” Though definitely pull out the latter in a meeting when trying to explain to a PM why you need 40 hours of dev time to rewrite some code.


The Value of MERGE

Erik Darling defends the honor of the MERGE statement:

In this video, I delve into the often maligned `MERGE` statement in SQL Server, defending it against its critics and highlighting its unique benefits. You might be surprised to learn that `MERGE` can actually help catch non-deterministic updates—something a standard `UPDATE` query won’t do. By demonstrating how `MERGE` issues a warning when you attempt a non-deterministic update, I aim to show why this statement deserves more respect in your database toolkit. The video also covers practical solutions for making these updates deterministic using window functions like `ROW_NUMBER()`, ensuring that the data integrity and consistency you expect from SQL Server are maintained. Whether you’re looking to improve query reliability or just want to understand a lesser-used feature better, there’s something here for everyone who deals with complex data operations in SQL Server.

Alongside this, I recommend a post from Hugo Kornelis from a few years ago, looking into pretty much every problem people reported with MERGE and checking to see if it was still an issue and, if so, under what circumstances.
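
To make the determinism point concrete, here is a minimal sketch with hypothetical temp tables: the `UPDATE ... FROM` succeeds even though two source rows match one target row (which `Amount` wins is arbitrary), while the equivalent `MERGE` fails with error 8672, and a `ROW_NUMBER()` filter makes the update deterministic:

```sql
CREATE TABLE #Target (Id int PRIMARY KEY, Amount int);
CREATE TABLE #Source (Id int NOT NULL, Amount int);
INSERT #Target VALUES (1, 0);
INSERT #Source VALUES (1, 10), (1, 20); -- two source rows for one target row

-- Non-deterministic, but runs without complaint: the winning Amount is arbitrary.
UPDATE t
SET    t.Amount = s.Amount
FROM   #Target AS t
JOIN   #Source AS s ON s.Id = t.Id;

-- The same logic via MERGE raises Msg 8672 ("The MERGE statement attempted to
-- UPDATE or DELETE the same row more than once") instead of silently picking a row.
MERGE #Target AS t
USING #Source AS s ON s.Id = t.Id
WHEN MATCHED THEN UPDATE SET t.Amount = s.Amount;

-- One way to make it deterministic: keep a single source row per key.
WITH OneRowPerKey AS
(
    SELECT Id, Amount,
           ROW_NUMBER() OVER (PARTITION BY Id ORDER BY Amount DESC) AS rn
    FROM   #Source
)
MERGE #Target AS t
USING (SELECT Id, Amount FROM OneRowPerKey WHERE rn = 1) AS s ON s.Id = t.Id
WHEN MATCHED THEN UPDATE SET t.Amount = s.Amount;
```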


How SQL Server Uses the Service Master Key

Greg Low provides an explanation:

The Service Master Key (SMK) is one of the most misunderstood security components in SQL Server.

It is frequently described as:

  • The root of all encryption
  • The key that protects everything
  • Something you must back up constantly

Those statements are not entirely wrong – but they are incomplete, and in some cases actively misleading.

It’s important to understand what the Service Master Key actually does, when it matters, when it doesn’t, and why many operational decisions around it are based on myth rather than mechanics.

Read on to learn more about what the key is, what exactly it protects, and some practical guidance around managing it.
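
And if you do decide a backup of the SMK is warranted (for example, ahead of a migration), the command itself is simple. A minimal sketch, with the file path and password as placeholders:

```sql
-- Back up the Service Master Key; requires CONTROL SERVER permission.
-- The path and password below are placeholders, not recommendations.
BACKUP SERVICE MASTER KEY
    TO FILE = 'C:\KeyBackups\SMK.bak'
    ENCRYPTION BY PASSWORD = '<use a strong password>';
```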
