
Curated SQL Posts

SQL Server 2019 and Row Mode Memory Grant Feedback

Yvonne Vanslageren explains a solid feature in SQL Server 2019:

When SQL Server runs a query, it needs memory for operations like sorting and joining data. It also relies on memory during query compilation to hold intermediate plans while the Query Optimizer finds the best execution strategy. In parallel processing scenarios, the memory requirement grows even further.

SQL Server manages this by pre-allocating memory for each query through the SQL Server Operating System (SQLOS). This process ensures that no single query can monopolize the server’s memory.

Read on to learn more about memory grants, the problems you can run into with them, and one way SQL Server 2019 reduces the risk of bad memory grant estimates.
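
If you want to watch grants on your own instance while queries run, the sys.dm_exec_query_memory_grants DMV is one place to look. A quick sketch:

    -- Compare what each active query asked for against what it used.
    -- A wide gap between granted_memory_kb and max_used_memory_kb is the
    -- kind of misestimate that memory grant feedback adjusts for.
    SELECT session_id,
           requested_memory_kb,
           granted_memory_kb,
           ideal_memory_kb,
           max_used_memory_kb
    FROM sys.dm_exec_query_memory_grants
    ORDER BY granted_memory_kb DESC;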

Comments closed

The Contents of a Database Page in SQL Server

Simon Frazer takes a peek:

In SQL Server, data is stored in tables. Behind the scenes, however, these tables are divided into 8-kilobyte (8 KB) units called pages. Each page is 8,192 bytes in size, and this is a fixed value that cannot be changed.

Out of the 8,192 bytes, the first 96 bytes are reserved for the page header, which stores metadata about the page. This leaves 8,096 bytes for storing rows of data. However, a single row can only use up to 8,060 bytes, as 36 bytes are reserved for additional metadata, such as the slot array, versioning tags, and forwarded record pointers.

Read on for more information about the fundamental unit of storage in SQL Server.
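
If you'd like to poke at a page yourself, SQL Server 2019 and later expose the page header via sys.dm_db_page_info(). A rough sketch, where dbo.SomeTable and the file/page numbers are placeholders you'd swap for your own:

    -- Locate the page holding a given row...
    SELECT TOP (1) pl.file_id, pl.page_id, pl.slot_id
    FROM dbo.SomeTable
    CROSS APPLY sys.fn_PhysLocCracker(%%physloc%%) AS pl;

    -- ...then feed that file_id and page_id into the page-header DMF:
    SELECT page_type_desc, page_level, slot_count, free_bytes
    FROM sys.dm_db_page_info(DB_ID(), 1, 312, 'DETAILED');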

Comments closed

Repairing a SQL Server Instance

Vlad Drumea fixes an installation problem:

I’ve needed to run the repair process in the following cases:

  • An instance or some of its components become corrupted or missing after OS patching.
    One of the most common examples I’ve run into is SQL Server Configuration Manager being gone from the machine.
    Not to be confused with database corruption.
  • A failed or cancelled SQL Server instance upgrade or patching.
  • SQL Server failing to uninstall.
    From what I’ve seen, this tends to be a side-effect of the first scenario in this list going undetected.

Read on to see what you need to have and how to perform the task.
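
For reference, the repair itself runs from the folder containing your installation media, and the documented command-line form looks something like this (replace MSSQLSERVER with your instance name; /q suppresses the UI):

    setup.exe /q /ACTION=Repair /INSTANCENAME=MSSQLSERVER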

Comments closed

Three Incremental Load Patterns with Azure Data Factory

Temidayo Omoniyi likes a good pattern:

This article is divided into three major sections—each showing the different abilities and use cases of performing incremental load with Azure Data Factory. This process can also be done in an Azure Synapse Pipeline and Fabric Pipeline.

The document contains the following:

Section 1: Copy Data Based on Last Modified Date or Latest File

Section 2: Incremental Copy Using Dataflow

Section 3: Incremental Copy Using Lookup and Stored Procedure Activities

Click through for each of these three patterns, with plenty of screenshots and step-by-step instructions.
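
The Lookup plus Stored Procedure pattern in particular tends to lean on a small watermark table in the target database. A minimal sketch, with hypothetical names:

    -- Hypothetical watermark table the Lookup activity reads at the start of a run.
    CREATE TABLE dbo.Watermark
    (
        TableName      sysname      NOT NULL PRIMARY KEY,
        WatermarkValue datetime2(3) NOT NULL
    );

    -- Stored procedure the pipeline calls after a successful copy,
    -- advancing the watermark to the newest value it just loaded.
    CREATE OR ALTER PROCEDURE dbo.usp_UpdateWatermark
        @TableName sysname,
        @NewValue  datetime2(3)
    AS
    BEGIN
        UPDATE dbo.Watermark
        SET WatermarkValue = @NewValue
        WHERE TableName = @TableName;
    END;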

Comments closed

SSIS Deprecating Microsoft Connector for Oracle

Debbi Lyons has an announcement:

In July 2025, Microsoft will discontinue support for the Microsoft Connector for Oracle in SQL Server Integration Services (SSIS). This blog provides essential details to help customers prepare for this change in advance.

The Microsoft Connector for Oracle enables data export from and import into Oracle databases within an SSIS package. This feature, available in Enterprise editions of SQL Server 2019 and 2022, will remain functional for the lifecycle of the SQL Server product. However, support for this feature will officially end on July 4, 2025. With the deprecation, future product releases will provide no further bug fixes. Additionally, it will not be supported in SQL Server 2025 and onward.

See, and people have told us there hasn’t been anything happening in SSIS since 2016!

The alternative of using ADO.NET reminds me of when Microsoft tried to take OLE DB out of Integration Services and got the pushback that no, we really don’t want to move from a fast component to a slow component. I would expect much less pushback on this one, simply because I doubt many people are using SSIS to ferry around data in Oracle.

Comments closed

Handling Excel Files in OneLake via File Explorer

Kristyna Ferris needs to get a file:

Hey data friends! This blog is to discuss an edge case I’ve run into in Microsoft Fabric. I won’t go into all the context, but the goal was to have an Excel file accessible to Microsoft Fabric without OneDrive, SharePoint, or an on-premises data gateway. We also didn’t want a CSV, because we wanted multiple tabs and structured tables with formulas, which won’t save properly in CSV files.

So how did we do it? OneLake!

Read on to learn how.

Comments closed

Using the OUTPUT Clause

Erik Darling has a new video. Erik mentions the best use case of this being archival tables, but I’ll add one more: if you’re using a queue table (ignoring how good or bad of an idea that is), have multiple processes operating on it, and want to reduce the likelihood of two processes picking up the same value, you can perform the equivalent of popping off a queue: delete the first element(s) from the queue table and output them into a temp table. From there, you can operate on that data at your leisure, and the next process will grab some other batch. And if everything goes mildly wrong, re-insert the data into the queue and let some other sucker try it. I’ve used this a few times for data warehousing processes and it works out pretty well.
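
Here’s a minimal sketch of that queue-pop pattern, with dbo.WorkQueue and its columns as hypothetical stand-ins:

    -- DELETE removes the rows and OUTPUT hands them to this session in one
    -- atomic statement; READPAST lets competing consumers skip locked rows
    -- instead of blocking on them. TOP without ORDER BY grabs an arbitrary
    -- batch, which is fine for queue work.
    CREATE TABLE #Batch (QueueId bigint NOT NULL, Payload nvarchar(max) NULL);

    DELETE TOP (10) q
    OUTPUT deleted.QueueId, deleted.Payload
    INTO #Batch (QueueId, Payload)
    FROM dbo.WorkQueue AS q WITH (ROWLOCK, READPAST);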

The only thing I’m unsure about is how he figured out that I’m CommonTableExpressionLover11357.

1 Comment

Streaming Data to Azure Event Hub via Mockaroo and Kafka API

Jasleen Kaur Wahi generates some data:

In a recent project, I faced the need to generate randomized data for transmission to the Azure Event Hub. This hub is a key component of Microsoft Azure, used for real-time data ingestion and processing.

First, let’s take a look at how I created this random data. I wanted to come up with a way to make data that looks like what we see in the real world, but without using any real information from users. This made-up data was really important for a bunch of things, like checking if our software works well.

Read on to see how Mockaroo works and the end result. Creating tests for streaming services like Event Hubs is a challenge, so this is an interesting approach to the task.

Comments closed