Press "Enter" to skip to content

Month: July 2017

TempDB In Azure IaaS

Jim Donahoe shows how to use the temporary SSD on an Azure VM for SQL Server’s tempdb:

Remember, this disk is as the title of this section says…TEMPORARY! Do NOT put ANYTHING on this drive you cannot afford to lose. Don’t say nobody warned you either, because the drive itself contains a nice little txt file warning you; here is the EXACT text:

WARNING : THIS IS A TEMPORARY DISK.

Any data stored on this drive is SUBJECT TO LOSS and THERE IS NO WAY TO RECOVER IT.

Please do not use this disk for storing any personal or application data.

It’s good to see what you need to do to get this working.  I’ve found it easier just to set up a permanent SSD, but if you’re on a tight budget, this can save you some cash.
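For reference, here’s roughly what the move looks like, assuming the default logical file names (tempdev and templog) and a D:\TempDB folder of my own choosing; the change only takes effect after a restart:

-- Point tempdb's files at the temporary D: drive (applies at the next service restart).
-- Assumes D:\TempDB exists before SQL Server starts back up.
ALTER DATABASE tempdb
    MODIFY FILE (NAME = tempdev, FILENAME = 'D:\TempDB\tempdb.mdf');
ALTER DATABASE tempdb
    MODIFY FILE (NAME = templog, FILENAME = 'D:\TempDB\templog.ldf');
GO

Because the temporary drive comes back empty after a redeploy, you also need something (a startup task, for example) to recreate that folder before the service starts, or SQL Server won’t come up.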


Downgrading SQL Server

Jonathan Kehayias is going the opposite direction of the Jeffersons:

At some point in your career working with SQL Server, you will run into a situation where the wrong edition of SQL Server has been installed on a server and will need to change the edition for licensing reasons.  Whether it is Enterprise Edition where Standard Edition should have been installed, Enterprise Edition where Developer Edition should have been used, or my favorite, Evaluation Edition where the 180 day trial has expired and Enterprise Edition isn’t going to be used, the only route available for downgrading the edition is to uninstall and reinstall SQL Server entirely.  SQL Server Setup makes upgrading editions a piece of cake with SKUUPGRADE as a command line option for going from Standard/Developer/Evaluation to Enterprise, but anything else requires a full uninstall and reinstall to change the SKU/Edition and then restore all of the system and user databases to the new instance, which typically means a lot of work.  I hate having to restore system databases and avoid having to do it if possible, so here is how I do this process and minimize the work required:

No matter what, you are going to have to do an uninstall and reinstall of the SQL Server instance to downgrade the SKU.  However, you can save yourself some time and the headache of trying to restore the system databases if you are careful about what you do.  I have done plenty of SKU downgrades in the past and the easiest way to do it, and I am not saying this is the Microsoft-supported way but that it works if done correctly, is to:

Jonathan has an 11-point checklist that’s well worth checking out.  Though here’s hoping all your editions are Enterprise…
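As a quick aside (this isn’t part of Jonathan’s checklist), you can confirm which edition an instance is actually running, before and after the reinstall, with SERVERPROPERTY:

-- Verify the instance's edition, version, and patch level.
SELECT
    SERVERPROPERTY('Edition') AS Edition,
    SERVERPROPERTY('ProductVersion') AS ProductVersion,
    SERVERPROPERTY('ProductLevel') AS ProductLevel;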


R Model Compression

I have a post showing off some of the value of compressing R models:

So right now, we’re burning roughly 200K per model.  My stated goal is to be able to store several years’ worth of data for 10 million products.  Let’s say that I need 10 million products in ProductModel and 1 billion rows in ProductModelHistory.  That means that we’d end up with 1.86 TB of data in the ProductModel table and 186 TB in ProductModelHistory.  This seems…excessive.

As a result, I decided to try using the COMPRESS() function in SQL Server 2016.  The COMPRESS function simply uses GZip compression.  Yeah, there are compression algorithms which tend to be more compact (e.g., bz2 or 7z), but GZip is relatively CPU efficient and I can wrap my SQL statements with COMPRESS() and DECOMPRESS() and not have to change any calling code; I just need to update the two stored procedures I use to insert and then retrieve product models.

Most of the time, it’s not a big deal.  But once you start talking hundreds of gigabytes or, in my case, a couple hundred terabytes, it’s definitely worth compressing this data.
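For reference, the pattern looks something like this; the column layout and parameter names are my own simplification, but COMPRESS() and DECOMPRESS() are the real SQL Server 2016 functions:

-- Simplified version of the table holding serialized R models as GZip-compressed binary.
CREATE TABLE dbo.ProductModel
(
    ProductID INT NOT NULL,
    Model VARBINARY(MAX) NOT NULL
);
GO

-- On the way in, wrap the serialized model in COMPRESS().
DECLARE @ProductID INT = 1;
DECLARE @SerializedModel VARBINARY(MAX) = 0x0102030405;  -- stand-in for a real serialized model
INSERT INTO dbo.ProductModel (ProductID, Model)
VALUES (@ProductID, COMPRESS(@SerializedModel));

-- On the way out, DECOMPRESS() hands the calling code the original bytes.
SELECT ProductID, DECOMPRESS(Model) AS Model
FROM dbo.ProductModel
WHERE ProductID = @ProductID;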


Disk Usage By Table Report

Kenneth Fisher shows off my favorite built-in SSMS report:

Every now and again you need to know how big a table is. Or several tables. Or all of the tables. Number of rows is frequently handy when you’re going to create a new index or otherwise modify the table. The amount of space used by the indexes can be helpful in deciding how much space you need to do a re-index. The tables with the most unused space are nice to know about if you have a problem with ever-growing heaps.

In the past, my go-to solution here was sp_spaceused. It’s a really handy procedure.

USE AdventureWorks2014;
GO
EXEC sp_spaceused 'Person.Person';
GO

Great information, but it has a few problems. You can only run it for one table at a time (sp_msforeachtable is a workaround, if undocumented), the file sizes aren’t consistent (sometimes KB, sometimes MB or even GB), and it only returns the name of the object but not the schema. So if the same table name exists under multiple schemas, it can get tricky.

Read on for how to access and use this report.
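If you’d rather have the same kind of numbers in query form (consistent units, schema included, every table at once), a rough equivalent against sys.dm_db_partition_stats looks like this; it’s a sketch, not the report’s exact query:

-- Rows, reserved space, and used space for every table, in MB.
SELECT
    s.name AS SchemaName,
    t.name AS TableName,
    SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS NumRows,
    SUM(ps.reserved_page_count) * 8 / 1024.0 AS ReservedMB,
    SUM(ps.used_page_count) * 8 / 1024.0 AS UsedMB
FROM sys.dm_db_partition_stats ps
    INNER JOIN sys.tables t
        ON ps.object_id = t.object_id
    INNER JOIN sys.schemas s
        ON t.schema_id = s.schema_id
GROUP BY s.name, t.name
ORDER BY ReservedMB DESC;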


Assigning Tags In Azure

Melissa Coates shows how to assign tags to resources in Azure using Powershell:

You can assign tags for resource groups, as well as individual resources which support Azure Resource Manager. The individual resources do not automatically inherit tags from the resource group parent. A maximum of 15 key/value pairs can be assigned (though you could store concatenated values or embedded JSON in a single tag value as a workaround). You may want to assign tags at just the resource group level, and use custom queries to “inherit” at the resource level. Alternatively, you may want to assign tags to the individual resources directly, particularly if you want to see them clearly on the standard “download usage” report of billing.

Since the key/value pairs are just free-form text, watch out for uniformity issues. To improve consistency, you can utilize policies to require tags and/or apply defaults if you’d like (for example, you might want to enforce a “Created By” tag). Tags can be set in the ARM template when you initially deploy a resource (which is best so that no billing occurs without proper tagging), or afterwards to existing resources via the portal, PowerShell, or CLI.

Melissa also shows how to query those tags using Powershell.


Understanding Azure Billing

Denny Cherry explains some of the oddities behind Azure billing:

Some things stand out to me when looking at this bill. First is the service that is being billed. It doesn’t say how many DWs we’re being billed for (more on that later).

Something else odd is that the SQL DW shows being billed for 1220 hours. Now in theory, the maximum number of hours any single service can be billed for in a single month is 744, as that’s 24 (hours) * 31 (days), but here we’re being billed for 1220 hours. Now we know for a fact that this SQL DW is powered off at times, so how are we using this resource for more than 31 days in a month?

The last oddity is the rate. $1.46 an hour. This points us to the reason for all of these strange numbers. That’s the rate to run a SQL DW at DW100 for one hour.

Click through for Denny’s explanation.


When CHECKDB Doesn’t Catch Corruption

Paul Randal explains one condition when DBCC CHECKDB misses corruption:

An interesting situation was discussed online recently which prompted me to write this post. A fellow MVP was seeing periodic corruption messages in his error log, but DBCC CHECKDB on all databases didn’t find any corruptions. A subsequent restart of the instance caused the problem to go away.

My diagnosis? Memory corruption. Something had corrupted a page in memory – maybe it was bad memory chips or a memory scribbler (something that writes into SQL Server’s buffer pool, like a poorly-written extended stored procedure), or maybe a SQL Server bug. Whatever it was, restarting the instance wiped the buffer pool clean, removing the corrupt page.

So why didn’t DBCC CHECKDB encounter the corrupt page?

Read on for the rest of the story.


Dealing With Newlines In Reports

Shane O’Neill shows how to deal with newlines in data:

I have mentioned before that we can use CHAR(10) and CHAR(13) for new lines and carriage returns in SQL Server, so I’ll leave it as an exercise to the reader to create a table with these “troublesome” bits of information in it (plus, if you came here from Google, I assume you already have a table with them in it).

For me, I’ve just created a single table dbo.NewLineNotes that has a single entry with a new line in it.

Read on for more.
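To make that concrete, here’s a rough version of Shane’s setup; the column names are my guess, and the REPLACE() at the end is just one common way to strip the CHAR(13) and CHAR(10) before the text reaches a report, not necessarily Shane’s approach:

-- A table with a note containing an embedded carriage return + line feed.
CREATE TABLE dbo.NewLineNotes
(
    NoteID INT IDENTITY(1, 1) NOT NULL,
    Note VARCHAR(8000) NOT NULL
);

INSERT INTO dbo.NewLineNotes (Note)
VALUES ('Line one' + CHAR(13) + CHAR(10) + 'Line two');

-- Strip the carriage returns and line feeds before the text hits the report.
SELECT REPLACE(REPLACE(Note, CHAR(13), ' '), CHAR(10), ' ') AS CleanNote
FROM dbo.NewLineNotes;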


2016 Best Practices By Default

Wayne Sheffield shows how SQL Server 2016 helps you follow best practices:

tempdb

Ahh, tempdb. In SQL Server, that database does so much… it’s where temporary tables and table variables are stored. It’s where row version activity is stored. It’s where running queries will spill out temporary workspace needed for the query. And on, and on, and on. On busy systems, having this database running optimally is of vital importance. And there are several best practices that need to be observed here.

Read on for several tips.
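If you want to see where an existing instance stands on the usual tempdb file recommendations (number of data files, equal sizes, fixed autogrowth), one quick way is to query sys.master_files; a sketch:

-- Review tempdb's data and log files: count, size, and growth settings.
SELECT
    name,
    type_desc,
    size * 8 / 1024 AS SizeMB,
    CASE WHEN is_percent_growth = 1
         THEN CAST(growth AS VARCHAR(10)) + '%'
         ELSE CAST(growth * 8 / 1024 AS VARCHAR(10)) + ' MB'
    END AS Growth
FROM sys.master_files
WHERE database_id = DB_ID('tempdb');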


Powershell Pattern Matching

Klaas Vandenberghe shows some of the pattern matching and regular expression functionality within Powershell:

This is like grep if you ever encountered that. The Select-String cmdlet finds a -pattern in text.
It has an -allmatches switch to … guess what 🙂

Get-ChildItem D:\Myscripts -File -Recurse *.sql | Select-String -Pattern "drop\s+(database|table|login)"

will get us all of our SQL scripts in which we drop a database, table, or login.
The returned MatchInfo object holds the matching parts of the script text, and also the name of the file and the line number where the match was found.

Get-ChildItem | Select-String is my most frequently used Powershell pipeline.
