
Category: Administration

Handling Overly Large Log Files

Kevin Hill shows how to recover from a scenario with an unexpectedly large SQL Server transaction log file:

Step 2: Verify if the log is full or “empty”

Verify if the log file is actually full or not.  If you are backing up and the file still grew to a ridiculous size…it may have just been a one-time thing and you can deal with that easily.  Right-click the database, go to Reports, Standard Reports, Disk Usage.  This will give you two pie charts.  Left is the data file, right is the log.  If the log shows almost or completely full AND the huge size, you need to back up.  If the log file is huge and mostly empty, you simply need to shrink it to an acceptable size.

Great read for a junior-level DBA.
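
If you’d rather check from a query window than from the report, sys.dm_db_log_space_usage (SQL Server 2012 and later) returns the same information.  A minimal sketch, with the database name, backup path, and logical file name as placeholders:

    -- How full is the current database's log?
    SELECT total_log_size_in_bytes / 1048576.0 AS log_size_mb,
           used_log_space_in_percent           AS log_used_pct
    FROM sys.dm_db_log_space_usage;

    -- If the log is genuinely full, back it up first (placeholder names)
    BACKUP LOG [YourDatabase] TO DISK = N'X:\Backups\YourDatabase_log.trn';

    -- If the log is huge but mostly empty, shrink it to a sane size
    DBCC SHRINKFILE (YourDatabase_log, 4096);  -- logical file name, target size in MB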


Interrogating A Stack Dump

Kendra Little looks at a SQL Server stack dump:

In the video, I show an example of a stack dump caused by running DBCC PAGE with format style 3 against a table with a filtered index in SQL Server 2014.

It looks like this bug is fixed in SQL Server 2016, at least by SP1.

Sample code to reproduce this against the AdventureWorks2012 database (which I had restored to SQL Server 2014) is in my gist here.

Click through to watch the video.
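
For reference, DBCC PAGE is undocumented; the general pattern (the database name and page number below are placeholders, not her exact repro) looks like this:

    DBCC TRACEON (3604);                          -- send DBCC PAGE output to the client
    DBCC PAGE ('AdventureWorks2012', 1, 256, 3);  -- database, file_id, page_id, print option
    DBCC TRACEOFF (3604);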


Google Compute Engine Whitepapers

Brent Ozar Unlimited has a couple of whitepapers out about working with SQL Server in Google Compute Engine.  First, Brent and Tara Kizer create an Availability Group:

In this white paper we built with Google, we’ll show you:

  • How to build your first Availability Group in Google Compute Engine

  • How to test your work with four failure simulations

  • How to tell whether your databases will work well in GCE

Erik Darling also has a whitepaper on performance tuning:

Relax. Have a drink. In this white paper we built with Google, we’ll show you:

  • How to measure your current SQL Server using data you’ve already got

  • How to size a SQL Server in Google Compute Engine to perform similarly

  • After migration to GCE, how to measure your server’s bottleneck

  • How to tweak your SQL Server based on the performance metrics you’re seeing

If you’re looking at GCE as a potential migratory spot, you’ve got some extra reading material.
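
If you want a head start on the “measure your current SQL Server” step, wait statistics are a common baseline.  A minimal sketch, filtering out a few benign wait types:

    -- Top waits since the last service restart
    SELECT TOP (10)
           wait_type,
           wait_time_ms / 1000.0 AS wait_time_sec,
           waiting_tasks_count
    FROM sys.dm_os_wait_stats
    WHERE wait_type NOT IN (N'LAZYWRITER_SLEEP', N'SLEEP_TASK',
                            N'XE_TIMER_EVENT', N'BROKER_TASK_STOP')
    ORDER BY wait_time_ms DESC;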


Finding Physical Row Location

Wayne Sheffield shows how to find the physical location of a row in SQL Server:

Acquiring the physical location of a row

SQL Server 2008 introduced a new virtual system column: “%%physloc%%”. “%%physloc%%” returns the file_id, page_id and slot_id information for the current row, in a binary format. Thankfully, SQL Server also includes a couple of functions to split this binary data into a more useful format. Unfortunately, Microsoft has not documented either the column or the functions.

Read on for two functions you can use to format this data more nicely, as well as a short rewrite Wayne did to improve the performance of one of them.
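
If you want to try it yourself, the usual pattern pairs the virtual column with sys.fn_PhysLocFormatter, or with the sys.fn_PhysLocCracker table-valued function.  The table name below is a placeholder, and since all of this is undocumented, don’t build production code on it:

    -- (file:page:slot) for each row
    SELECT sys.fn_PhysLocFormatter(%%physloc%%) AS physical_location, *
    FROM dbo.YourTable;

    -- Or split it into separate columns
    SELECT plc.file_id, plc.page_id, plc.slot_id, t.*
    FROM dbo.YourTable AS t
    CROSS APPLY sys.fn_PhysLocCracker(%%physloc%%) AS plc;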


Using Desired State Configuration To Install SQL Server

Chris Lumnah shows how to use PowerShell Desired State Configuration to automate installation and configuration of a new SQL Server instance:

So the installation of SQL Server is now fairly straightforward. The wizard does a nice job of guiding you along the way. 2016 even includes best practice suggestions for tempdb and instant file initialization. Along the way, Microsoft has given us ways to automate the installation of SQL Server. You can sysprep an instance, but this does not really automate the installation. It just helps create a template of an instance. At the end of the day, you still need to do things manually. You can also use a configuration file to assist here. This is a great step forward, but it does not allow for all of the things you need to do to configure a SQL Server.

PowerShell does. Desired State Configuration (DSC) is functionality built into PowerShell that allows for the installation and configuration of a SQL Server.

Chris includes his script as well as a link for more information on DSC in case you aren’t familiar with the concept.


Copying Azure SQL Databases Between Subscriptions

Arun Sirpal shows that it’s pretty easy to copy an Azure SQL Database from one subscription to another:

If you ever need to move a copy of a SQL database in Azure across servers, then here is a quick, easy way.

So let’s say you need to take a copy of database called [Rack] within Subscription A that is on server ABCSQL1 and name it database [NewRack] within subscription B on server called RBARSQL1 (The SQL Servers are in totally different data centers too).

Read on for the answer.
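
Without spoiling the subscription-specific details, the T-SQL half of the operation is the documented CREATE DATABASE … AS COPY OF syntax, run while connected to the destination server.  A sketch using the post’s hypothetical names:

    -- Run on the destination server (RBARSQL1), connected to master
    CREATE DATABASE [NewRack] AS COPY OF [ABCSQL1].[Rack];

    -- The copy runs asynchronously; poll its progress
    SELECT name, state_desc FROM sys.databases WHERE name = N'NewRack';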


Understanding Database Role Permissions

Jason Brimhall shows what happens when you make a user a member of every database role at the same time:

A fundamental component of SQL Server is the security layer. A principal player in security in SQL Server comes via principals. In a previous article, I outlined the different flavors of principals while focusing primarily on the users and logins. You can brush up on that article here. While I touched lightly, in that article, on the concept of roles, I will expound on the roles a bit more here – but primarily in the scope of the effects on user permissions due to membership in various default roles.

Let’s reset back to the driving issue in the introduction. Frequently, I see what I would call a gross misunderstanding of permissions by way of how people assign permissions and role membership within SQL Server. The assignment of role membership does not stop with database roles. Rather, it is usually combined with a misconfiguration of the server role memberships as well. This misunderstanding can really be broken down into one of the following errors:

  • The belief that a login cannot access a database unless added specifically to the database.

  • The belief that a login must be added to every database role.

  • The belief that a login must be added to the sysadmin role to access resources in a database.

Worth reading.  Spoilers:  database roles are not like Voltron; they don’t get stronger when you put them all together.
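
To see that concretely, here is a small sketch (the user and table names are invented): pairing db_datareader with db_denydatareader yields less access, not more, because the DENY wins:

    CREATE USER RoleTest WITHOUT LOGIN;
    ALTER ROLE db_datareader     ADD MEMBER RoleTest;
    ALTER ROLE db_denydatareader ADD MEMBER RoleTest;  -- the DENY trumps the GRANT above

    EXECUTE AS USER = 'RoleTest';
    SELECT TOP (1) * FROM dbo.SomeTable;  -- fails: SELECT permission denied
    REVERT;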


Figuring Out Cost Threshold For Parallelism

Grant Fritchey uses R to help him decide on a good cost threshold for parallelism value:

With the Standard Deviation in hand, and a quick rule of thumb that says 95% of all values are going to be within two standard deviations of the data set, I can determine that a value of 16 on my Cost Threshold for Parallelism is going to cover most cases, and will ensure that only a small percentage of queries go parallel on my system, but that those which do go parallel are actually costly queries, not some that just fall outside the default value of 5.

I’ve made a couple of assumptions that are not completely held up by the data. Using the two, or even three, standard deviations to cover just enough of the data isn’t actually supported in this case because I don’t have a normal distribution of data. In fact, the distribution here is quite heavily skewed to one end of the chart. There’s also no data on the frequency of these calls. You may want to add that into your plans for setting your Cost Threshold.

This is a nice start.  If you’re looking for a more experimental analysis, you could try A/B testing (particularly if you have a good sample workload), where you track whatever pertinent counters you need (e.g., query runtime, whether it went parallel, CPU and disk usage) under different cost threshold regimes and do a comparative analysis.
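
If you want to pull the same cost distribution Grant fed into R, one common approach is to read the estimated subtree cost out of the plan cache.  A sketch (this can be expensive on busy servers, so sample with care):

    WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
    SELECT TOP (500)
           qp.query_plan.value('(//StmtSimple/@StatementSubTreeCost)[1]', 'float') AS estimated_cost,
           cp.usecounts
    FROM sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
    WHERE cp.cacheobjtype = N'Compiled Plan'
    ORDER BY estimated_cost DESC;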


Standard Edition Max Server Memory Changes In 2016 SP1

Randolph West details the changes in max server memory for Standard Edition in SQL Server 2016 SP1:

The memory limit of 128GB RAM applies only to the buffer pool (the 8KB data pages that are read from disk into memory — in other words, the database itself).

For servers containing more than 128GB of physical RAM, and running SQL Server 2016 with Service Pack 1 or higher, we now have options.

Randolph has a couple good clarifications on memory limits outside the buffer pool, making this worth the read.
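
For reference, the knob itself hasn’t changed; max server memory is still set via sp_configure.  The 131072 below is simply the Standard Edition buffer pool cap of 128GB expressed in MB:

    EXEC sys.sp_configure N'show advanced options', 1;
    RECONFIGURE;
    EXEC sys.sp_configure N'max server memory (MB)', 131072;
    RECONFIGURE;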


Target Server Memory Vs Total Memory

Arun Sirpal compares the Target Server Memory and Total Memory Perfmon counters:

Target Server Memory (KB) is the amount of memory that SQL Server is willing (i.e., has the potential) to allocate to the buffer pool under its current load. Total Server Memory (KB) is what SQL Server currently has allocated.

Using SQL Server 2014 Developer Edition (64-bit), my machine has 12GB of RAM and maximum server memory is currently set to 8GB; for the purpose of this post, I have set minimum server memory to 1GB (Lock Pages in Memory has not been set).

Read on for a nice description laden with Perfmon pictures.
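
You can also read both counters from inside SQL Server via sys.dm_os_performance_counters, which exposes the same values Perfmon shows:

    SELECT counter_name, cntr_value / 1024 AS memory_mb
    FROM sys.dm_os_performance_counters
    WHERE object_name LIKE N'%Memory Manager%'
      AND counter_name IN (N'Target Server Memory (KB)', N'Total Server Memory (KB)');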
