Considerations when Deleting Lots of Data

Kevin Feasel

2019-05-16

T-SQL

Ed Elliott takes us through things to think about before deleting a few million rows from a table:

Fragmentation
Fragmentation occurs when we delete from pages and there is still data surrounding our data. If we have 100 rows and delete every odd row, we would have perfect fragmentation, in that we have doubled the amount of space the data we need takes up. If we instead delete rows 1-49, even though we remove roughly the same number of rows, we don't have any fragmentation, because the remaining data sits in one continuous block. Knowing how the data is stored on disk and how it will be deleted (is it the first x records, or every xth record?) is vital, so that we know whether, after the delete, we should also reorganise the indexes to remove the deleted records.
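
To make that actionable, here is a minimal T-SQL sketch of the post-delete check; dbo.Orders and IX_Orders_OrderDate are hypothetical names standing in for your own table and index:

    -- Measure leaf-level fragmentation for the indexes on dbo.Orders.
    SELECT
        OBJECT_NAME(ips.object_id)        AS table_name,
        i.name                            AS index_name,
        ips.avg_fragmentation_in_percent,
        ips.page_count
    FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID(N'dbo.Orders'), NULL, NULL, 'LIMITED') AS ips
    JOIN sys.indexes AS i
        ON i.object_id = ips.object_id
       AND i.index_id = ips.index_id;

    -- A common rule of thumb: REORGANIZE between roughly 5% and 30% fragmentation,
    -- REBUILD above that. REORGANIZE runs online and compacts pages in place.
    ALTER INDEX IX_Orders_OrderDate ON dbo.Orders REORGANIZE;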

Ed has quality insights here, so check it out.

Related Posts

Simple Query Zen

Erik Darling wants you to simplify your queries: See, when a query is big and complicated to you, there’s a pretty good chance you’re gonna get a big and complicated query plan, because it’s big and complicated to the optimizer, too. This isn’t to say the optimizer is dumb or bad or ugly; it’s […]


Finding Broken Code in SQL Server

Pamela Mooney shows us how we can find broken code on our SQL Server instances: Before we approached our last major SQL Server upgrade, I was curious about what might break.  Yes, I had used the DEA to check our code against deprecated or discontinued code.  But I am talking about code that might not […]

