
Month: April 2025

An Introduction to Temporal Tables

Stephen Planck covers a feature in SQL Server:

Have you ever wanted to see exactly how a row looked in your SQL Server database a week or a month ago—without writing complicated auditing code or triggers? With temporal tables (also known as system-versioned tables), you can do precisely that. Temporal tables automatically track historical data changes by maintaining multiple versions of each row in a separate history table. Let’s explore how they work, why you might use them, how to set them up, and what best practices to follow.
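For reference, here is a minimal sketch of what the system-versioning setup looks like; the table and column names are hypothetical, not from Stephen’s post.

```sql
-- Hypothetical example: a system-versioned (temporal) table.
-- SQL Server populates ValidFrom/ValidTo and maintains dbo.Customer_History automatically.
CREATE TABLE dbo.Customer
(
    CustomerID   int            NOT NULL PRIMARY KEY,
    CustomerName nvarchar(100)  NOT NULL,
    Region       nvarchar(50)   NOT NULL,
    ValidFrom    datetime2      GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo      datetime2      GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Customer_History));
```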

Click through for a good overview of the feature. I have mixed feelings about temporal tables because I think the feature is halfway-finished and has been since 2016. There are two things I think it really needs in order to shine.

The first is user-defined versioning. SQL Server only offers system time for version tracking, meaning it keeps track of when you inserted the row into the table. That’s fine for certain historical operations, but terrible if you want to use temporal tables for type-2 slowly changing dimensions, where you often care more about when a record became effective than about when you inserted the row into the dimension.

The second is efficient historical slicing, in the same way that you can do AS OF operations today. AS OF lets you ask what the data looked like at a specific point in time. For warehousing scenarios, we also want to see the history of changes for a particular keyed record, such as all of the changes to a single customer or employee. You can do this with a UNION ALL operation, but that query logic can get complex.
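To make that distinction concrete, here is a rough sketch using the hypothetical table from above: the first query is the point-in-time lookup that temporal tables handle well, and the second is the per-key history you currently stitch together yourself.

```sql
-- Point-in-time: how did this row look at a specific moment?
SELECT CustomerID, CustomerName, Region
FROM dbo.Customer
    FOR SYSTEM_TIME AS OF '2025-03-01T00:00:00'
WHERE CustomerID = 42;

-- Per-key history: every version of one customer, current and historical,
-- stitched together from the base table and its history table.
SELECT CustomerID, CustomerName, Region, ValidFrom, ValidTo
FROM dbo.Customer
WHERE CustomerID = 42
UNION ALL
SELECT CustomerID, CustomerName, Region, ValidFrom, ValidTo
FROM dbo.Customer_History
WHERE CustomerID = 42
ORDER BY ValidFrom;
```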

Comments closed

Estimating SQL Server Backup Sizes

Rebecca Lewis gives us a rule of thumb:

How big will my backup file be? Very good question. And of course, it depends… especially if you’re using compression.

I was working on something today that required a backup beforehand for safety measures, and the server had limited storage. So, I needed to estimate the backup file size beforehand to confirm whether I had enough space for the bak file.

Click through for the script. It does, of course, include some simplifications and your actual numbers can turn out a bit different, but for a quick and dirty estimate of disk space needed, this isn’t bad at all.
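This is not Rebecca’s script, but as a rough illustration of the general idea: a full backup only copies allocated pages, so summing used pages (8 KB each) gives a ballpark upper bound for an uncompressed backup, and compression will usually shrink that number considerably.

```sql
-- Quick overview of reserved vs. unallocated space in the current database.
EXEC sp_spaceused;

-- Rough upper bound for an uncompressed full backup:
-- total used pages across all partitions, at 8 KB per page.
SELECT SUM(used_page_count) * 8 / 1024.0 AS used_space_mb
FROM sys.dm_db_partition_stats;
```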

Comments closed

Why Use XACT_ABORT?

Josephine Bush answers a team member’s question:

This came up one day at my work when a developer was using it. I hadn’t used it before and thought I’d better check it out. It’s off by default in SQL Server, but why would you use it?

Click through for the tl;dr version, followed by a more thorough explanation. I wouldn’t set XACT_ABORT on by default, but there have been plenty of instances in which I’ve wanted to handle a lengthier series of operations as an all-or-nothing job, and this is a great way to do it.
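As a minimal sketch of that all-or-nothing pattern (the table here is hypothetical): with XACT_ABORT on, a runtime error aborts and rolls back the entire transaction rather than leaving it partially applied.

```sql
SET XACT_ABORT ON;

BEGIN TRANSACTION;

    UPDATE dbo.Account
    SET Balance = Balance - 100
    WHERE AccountID = 1;

    -- If this statement fails at runtime (constraint violation, conversion
    -- error, deadlock victim, etc.), XACT_ABORT rolls back the whole
    -- transaction, including the update above.
    UPDATE dbo.Account
    SET Balance = Balance + 100
    WHERE AccountID = 2;

COMMIT TRANSACTION;
```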

Comments closed

Creating a Microsoft Fabric Capacity

Boniface Muchendu builds out some capacity:

To begin, we need to head over to the Azure portal. You might wonder why we are starting here. Well, Microsoft Fabric is now an Azure resource, which means all initial setups must be done in the Azure environment.

Click through for step-by-step instructions. Microsoft has also been really good about letting people create (and re-create and re-create) trial capacities, so if you’re just futzing about with the product to get an idea of what it can do, see if you can use that rather than shelling out the cash.

Comments closed

Feeding Language Models Bad Advice

Louis Davidson begins an experiment:

So, I got this idea to test out a few LLMs (ChatGPT and the web and Office versions of Copilot, at the very least) and see how they handle a load of bad advice. So I put out a question on X, asking:

“A request! Send me your most realistic, but worst, SQL Server management advice. I want to test (and write an article) about using AI to fact check writing.”

And if there’s anything this community is good at, it’s providing bad advice for purposes of lampooning.

We are going to need to wait a week to see Louis’s results, but you can check out some of the terrible advice a variety of X users proffered.

Comments closed

Snowflake Query Tags in Power BI

Chris Webb takes some of the shine off of things:

Since the November 2024 Power BI release blog post announced that queries sent to Snowflake by Power BI include a query tag, I’ve had a lot of questions from people who couldn’t see this happening or wanted to know what the query tags contained, so in this blog I thought I would outline the current status.

It turns out that the query tag isn’t as far along as the blog post indicated, and there are some pretty big limitations in the cases in which there actually is tagging.

Comments closed

What’s New in R 4.5.0

Russ Hyde checks out the changes:

R 4.5.0 (“How About a Twenty-Six”) was released on 11th April, 2025. Here we summarise some of the interesting changes that have been introduced. In previous blog posts we have discussed the new features introduced in R 4.4.0 and earlier versions (see the links at the end of this post).

The full changelog can be found at the r-release ‘NEWS’ page and if you want to keep up to date with developments in base R, have a look at the r-devel ‘NEWS’ page.

There are some nice bits of functionality on the list, so check it out.

Comments closed

Customizing Spark Settings in Microsoft Fabric Workspaces

Nikola Ilic doesn’t accept the default:

In this article, I’ll walk you through how to go from out-of-the-box default Spark configurations to a fine-tuned setup that suits your specific workloads and requirements, as well as getting you ready for the DP-700 exam.

Spark is an extremely powerful engine, but like any powerful tool, it runs best when you tune it. So, don’t always settle for default. Get dynamic—and get Spark working the way you need it to.

Click through for the explanation of functionality.

Comments closed

Backups Aren’t Enough

Kevin Hill lays out a common but very important argument:

Many IT leaders and system admins think, “We have full backups every night. We’re covered.” But when the time comes to restore, they discover:

· The backup file is corrupt.
· The storage location is inaccessible.
· The restore process takes way longer than expected.
· The recovery model wasn’t configured properly.
· The point-in-time restore doesn’t actually bring back the data they need.

At that point, it’s not a “backup strategy.” It’s a data loss incident.

The solution is to test those backups, and Kevin provides some guidance on how, as well as additional important parts of the story.
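Kevin covers the specifics, but as a hedged sketch of the bare minimum: verify that the file is readable, then prove it with an actual test restore. The paths and logical file names below are placeholders, not from Kevin’s post.

```sql
-- Confirm the backup file is complete and readable (page checksums included,
-- assuming the backup was taken WITH CHECKSUM).
RESTORE VERIFYONLY
FROM DISK = N'X:\Backups\MyDatabase.bak'
WITH CHECKSUM;

-- The only real proof is a restore to a scratch database.
RESTORE DATABASE MyDatabase_RestoreTest
FROM DISK = N'X:\Backups\MyDatabase.bak'
WITH MOVE 'MyDatabase' TO N'X:\RestoreTest\MyDatabase_RestoreTest.mdf',
     MOVE 'MyDatabase_log' TO N'X:\RestoreTest\MyDatabase_RestoreTest_log.ldf',
     STATS = 10;
```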

Comments closed

Microsoft Fabric Extensions for VS Code

Sunitha Muthukrishna announces a new trio of VS Code extensions:

Microsoft Fabric is changing how we handle data engineering and data science. To make things easier, Microsoft added some cool extensions for Visual Studio Code (VS Code) that help you manage Fabric artifacts and build analytical applications.

By adding these Microsoft Fabric extensions to VS Code, developers can quickly create Fabric solutions and manage their data setups right from their coding environments. Here, we’ll look at these extensions and show why they’re useful.

Click through for notes on the three extensions that are available. Note that two of them are still in preview.

Comments closed