Press "Enter" to skip to content

Author: Kevin Feasel

Log Shipping

Richie Lee has a long blog post on log shipping:

Despite the development of AlwaysOn in recent releases of SQL Server, log shipping is still a great way to set up a copy of databases to be used for reporting. One of the main reasons it is great is because, unlike AlwaysOn, it is available in less expensive editions like Standard and Web from SQL Server 2008 onwards. Sure, in 2016 AlwaysOn will be available in Standard, but in a greatly deprecated form, and you cannot read from the secondary. So it will be good for DR, but not for reporting (as an aside, it still might be easier to set up log shipping for DR than AlwaysOn Basic because you need to set up a failover cluster; read through the “how to set up Standard Edition Availability Groups” post here.) However, you do need to be careful when setting up log shipping across different editions of SQL Server: whilst you can log ship from Enterprise to Standard/Web, if the database uses any Enterprise features then you’ll need to log ship to an Enterprise edition of SQL Server. And because you’ll be using the database for reporting, you’ll need to get it licensed.

Log shipping is a venerable disaster recovery technique and it behooves database administrators to know of its existence.


Table Sampling

Ginger Grant shows a couple of techniques for sampling from tables:

The random sample that TABLESAMPLE provides is based on the number of data pages, not the number of records. If you want the number of rows to be specifically limited you will need to use Top(n) as well. I’ve written all my samples based upon AdventureWorksDW so you can run them for yourself later. I’ve listed the variety of ways to call TABLESAMPLE and shown the number of records returned.

TABLESAMPLE is useful for spelunking, but is somewhat limited otherwise.
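
For reference, here is a minimal sketch of the two calling styles Ginger compares, run against the FactInternetSales table in AdventureWorksDW (the sample is page-based, so the row counts returned will vary between runs):

-- Page-based sample: the row count varies from run to run
SELECT *
FROM dbo.FactInternetSales TABLESAMPLE (10 PERCENT);

-- Combine with TOP to cap the number of rows returned
SELECT TOP (1000) *
FROM dbo.FactInternetSales TABLESAMPLE (10 PERCENT);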


Data Compression

Corey Beck on data compression:

Before we jump right into enabling either row or page compression, we can actually estimate the savings of each to determine which will provide us with the most savings on storage.  Since page compression includes row compression, we will start with row compression and the estimated savings.

EXEC sp_estimate_data_compression_savings
'Person', 'Person', NULL, NULL, 'row';

In practice, data compression is extremely valuable and in most circumstances, the benefits outweigh the costs.  In certain workloads, you might even see CPU usage go down.
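
As a follow-on sketch (not from Corey’s post), here is the matching estimate for page compression and the rebuild that actually enables it on the AdventureWorks Person.Person table:

EXEC sp_estimate_data_compression_savings
'Person', 'Person', NULL, NULL, 'page';

-- If the estimate looks worthwhile, enable it with a rebuild
ALTER TABLE Person.Person REBUILD WITH (DATA_COMPRESSION = PAGE);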


Watch Named, Nested Transactions

Gail Shaw finishes her outstanding series on transactions:

The error was thrown by the ROLLBACK statement. As such, the transaction is still open, the locks are held and the transaction log space can’t be reused. Unless the application that called this was checking for open transactions, that transaction could potentially be left open for quite some time, causing blocking and/or the transaction log to grow.

It’s not just that someone in the future might call the code from another stored proc within a transaction; it’s also that the code might be called from an application which started a transaction, or from SSIS, which started a transaction. It’s very hard to ensure that code is never called from within an existing transaction.

Read the whole thing.
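
To make the risk concrete, here is one common defensive pattern (a sketch with a hypothetical procedure name, not Gail’s code): the procedure only commits or rolls back a transaction it started itself, and re-throws the error so the caller can deal with its own transaction.

CREATE PROCEDURE dbo.DoSomeWork  -- hypothetical name
AS
BEGIN
    DECLARE @StartedTran bit = 0;

    BEGIN TRY
        IF @@TRANCOUNT = 0
        BEGIN
            BEGIN TRANSACTION;
            SET @StartedTran = 1;
        END

        -- ... data modifications go here ...

        IF @StartedTran = 1
            COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        -- Only roll back what this procedure started; the caller owns its own transaction
        IF @StartedTran = 1 AND @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;

        THROW;
    END CATCH
END;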


Set SQL Server Startup Parameters With PowerShell

Mike Fal has a function for managing SQL Server startup parameters:

Looking back at the previous blog post, changing the startup parameters through the SMO is pretty easy with the ManagedComputer class. In some ways, it is too easy. As Shawn calls out, you could easily overwrite the full string and remove the startup locations for your master database (breaking your instance in the process). This is where tool building can be such an aid, because by wrapping the change code in a function, we can build some safety mechanisms to protect us (or others) from doing harm when trying to make this sort of change. The function I wrote is not terribly long, but I’ll spare you the whole thing by letting you view it on GitHub. We’ll use our time better by going over how I constructed it while focusing on some of my tool building principles.

Thanks to Mike for making that available to the community.


Dynamic Pivoting

Michael Bourgon has a pivot script:

Say you have records from your monitor that happens to Track EventLogs.
We’ll call it EventLog_Tracking for argument’s sake (hint hint http://thebakingdba.blogspot.com/2015/05/powershell-eventlogtracking-capturing.html) and want to look at trends over time.

Dynamic pivoting is possible (I have an example en passant in a tally table presentation I created a couple years back), but it’s not the easiest thing in the world.
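
For flavor, here is a minimal dynamic PIVOT sketch. The EventLog_Tracking table comes from Michael’s post, but the column names (LogDate, EventLogName, EventCount) are hypothetical placeholders:

DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- Build the column list from the distinct values we want to pivot on
SELECT @cols = STUFF((
    SELECT ',' + QUOTENAME(EventLogName)
    FROM dbo.EventLog_Tracking
    GROUP BY EventLogName
    ORDER BY EventLogName
    FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 1, '');

-- Splice the column list into the pivot query and run it
SET @sql = N'SELECT LogDate, ' + @cols + N'
FROM (SELECT LogDate, EventLogName, EventCount FROM dbo.EventLog_Tracking) AS src
PIVOT (SUM(EventCount) FOR EventLogName IN (' + @cols + N')) AS p
ORDER BY LogDate;';

EXEC sys.sp_executesql @sql;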


Keep .Net Framework Up To Date

Allan Hirt with a public service announcement:

Microsoft recently published an official .NET team blog post reiterating that .NET Framework versions 4, 4.5, and 4.5.1 will no longer receive security updates, support, or hotfixes as of January 12, 2016. This was first announced back in August of 2014, so it’s not like this is new news, but I can say from experience virtually no one is talking about it. MS’ new post talks more about the upgrade path. To sum it up, you need to install .NET Framework 4.5.2, 4.6, or 4.6.1 to be considered supported when it comes to your .NET Framework version. Security is a real issue for many, and those responsible may not know that your version of .NET Framework could be a possible attack vector. Is your security team aware of this impending problem? How will this affect your version matrices (you do have those, right?)?

This is a cross-cutting concern, and I know a majority of database administrators aren’t directly responsible for .Net Framework patches, but work with whoever is responsible and keep them up to date.


Mass Table Renaming

Monica Rathbun needed to rename a table in a lot of scripts.  Here’s how she did it:

The quick answer that I came up with is to script out all of the stored procedures into a single query window.  This can be done easily through the GUI.  Once that is complete, I can easily do a “Find & Replace” on the table name and we’re done!

Ideally, those stored procedures are in source control already and you do your find-and-replace there, but sometimes you can’t control that.
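
If you want to know which procedures are affected before scripting them out, a quick (if imperfect) check against the system catalog helps; OldTableName is a placeholder here:

-- Find modules whose definitions mention the table (can return false positives)
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS SchemaName,
       OBJECT_NAME(m.object_id) AS ObjectName
FROM sys.sql_modules AS m
WHERE m.definition LIKE '%OldTableName%';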


Beware Statistics Corruption

Robert Davis shares a story of statistics corruption causing certain queries to fail:

I suspected that there was some difference between the queries that failed and the ones that were successful in SSMS. I ran the query they gave me, and I got the same error. I got disconnected and no further error info was returned. I also verified that the same query was successful on the other two tables mentioned. No errors on the other tables.

I wanted to know what error was causing the connection to be terminated, so I checked the SQL log and discovered that every time it failed, it was generating a stack dump. Before I was done investigating, it had generated 21 stack dumps. The key user-usable error info in the log was:

* Exception Address = 00007FF93BCF7E08 Module(sqlmin+00000000001E7E08)
* Exception Code = c0000005 EXCEPTION_ACCESS_VIOLATION
* Access Violation occurred reading address 0000000000000000

It turns out that CHECKDB & CHECKTABLE do not look at statistics.  If you find yourself in this situation, it’s not a bad idea to see if this is the cause.
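
If you do suspect a particular statistics object, one way to poke at it is to inspect it and then rebuild it. The object names below are placeholders, and rebuilding (or dropping and recreating) the statistic is a reasonable first attempt rather than a guaranteed fix:

-- Inspect the suspect statistics object (placeholder names)
DBCC SHOW_STATISTICS ('dbo.SomeTable', SomeStatistic);

-- Rebuild it from a full scan
UPDATE STATISTICS dbo.SomeTable SomeStatistic WITH FULLSCAN;

-- Or drop it so it can be recreated
DROP STATISTICS dbo.SomeTable.SomeStatistic;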
