Press "Enter" to skip to content

Category: Administration

Fixing Power Settings With T-SQL

Randolph West shows how to use T-SQL and xp_cmdshell to switch a server’s power settings from Balanced to High Performance:

Windows has the same setting. It’s in the Power Options under Control Panel, and for all servers, no matter what, it should be set to High Performance.

Here’s a free T-SQL script I wrote that will check for you what the power settings are. We don’t always have desktop access to a server when we are checking diagnostics, but it’s good to know if performance problems can be addressed by a really simple fix that doesn’t require the vehicle to be at rest.

(The script also respects your settings, so if you had xp_cmdshell disabled, it’ll turn it off again when it’s done.)

Click through for the script.
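
If you just want a feel for the moving parts, here is a minimal sketch of the general technique (checking the active plan with powercfg via xp_cmdshell and putting the configuration back afterwards); it is not Randolph's script, which handles the details more carefully:

-- Remember the current xp_cmdshell setting so it can be restored afterwards.
DECLARE @xp_was_enabled INT =
    (SELECT CONVERT(INT, value_in_use)
     FROM sys.configurations
     WHERE name = N'xp_cmdshell');

IF @xp_was_enabled = 0
BEGIN
    -- (A production script would also remember the original 'show advanced options' value.)
    EXEC sys.sp_configure N'show advanced options', 1;
    RECONFIGURE;
    EXEC sys.sp_configure N'xp_cmdshell', 1;
    RECONFIGURE;
END;

-- Ask Windows for the active power plan.
DECLARE @powercfg TABLE (line NVARCHAR(255) NULL);
INSERT INTO @powercfg (line)
EXEC master.dbo.xp_cmdshell N'powercfg /getactivescheme';

SELECT line AS active_power_plan
FROM @powercfg
WHERE line LIKE N'%Power Scheme GUID%';

-- Put xp_cmdshell back the way it was found.
IF @xp_was_enabled = 0
BEGIN
    EXEC sys.sp_configure N'xp_cmdshell', 0;
    RECONFIGURE;
    EXEC sys.sp_configure N'show advanced options', 0;
    RECONFIGURE;
END;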

Finding Last DBCC Command Runs

Andrew Kelly has a script to find the last time somebody ran a DBCC command like DBCC FREEPROCCACHE:

Let me explain a few things about the script. I am getting the path of the current trace file and placing it into a variable. The current file name will almost certainly have a suffix of _nn just before the .trc extension. If I were to run the script as-is, I would only be reading the current log file and not the other 4 that preceded it. If all you care about is the current log file, then fine, but most will want to search all the existing log files. One way to do this is to simply replace the current file name with just log.trc and use default as the 2nd parameter, as I did above in the fn_trace_gettable function. The default parameter value tells the function to read all files from that one onward. Even though log.trc doesn’t actually exist, the function knows how to handle it and reads all of the existing trace files in order.

So if the string that we search on (here we use ‘dbcc free%’) is in any of the files, it will return the matching rows. You may have to adjust the wildcards and such, but I think you get the idea. Again, remember that the data is transient, so always look at the StartTime column in the logs to ensure you know which date and time range you are looking at. You can do something like this, but I will leave that up to you.

SELECT MIN(StartTime) AS [Begin], MAX(StartTime) AS [End]  FROM ::fn_trace_gettable(@Path,default)

A word of caution: I never bothered to see just how resource-intensive this function is. While I don’t expect any issues with normal use, it is not something you want to be searching on every second. Be sensible and you should have no problems.

Click through for more details, including the script Andy uses to do this search.
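
For a rough sense of what such a search looks like (a sketch of the approach described in the excerpt, not Andy's actual script), you can build the log.trc path from sys.traces and pass default as the second parameter so every rollover file gets read:

-- Find the default trace's directory and point at log.trc so all rollover files are read.
DECLARE @Path NVARCHAR(260);

SELECT @Path = REVERSE(SUBSTRING(REVERSE(path), CHARINDEX(N'\', REVERSE(path)), 260)) + N'log.trc'
FROM sys.traces
WHERE is_default = 1;

-- Return any DBCC FREE* commands captured by the default trace.
SELECT StartTime, LoginName, HostName, TextData
FROM sys.fn_trace_gettable(@Path, DEFAULT)
WHERE TextData LIKE N'dbcc free%'
ORDER BY StartTime;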

Using Event Notifications To E-Mail Deadlock Graphs

Dave Mason captures details whenever a deadlock occurs and uses Event Notifications to e-mail them to himself:

As noted, there are other ways to handle deadlocks in SQL Server. The approach presented here may have some drawbacks compared to others. There is an authorization issue for msdb.dbo.sp_send_dbmail that will need to be addressed for logins without elevated permissions. Additionally, you might get hit with an unexpected deluge of emails. (The first time I got deadlock alerts, there were more than 500 of them waiting for me in my Inbox.) Lastly, there’s the XML issue: it’s not everyone’s cup of tea. On the plus side, I really like the proactive nature: an event occurs, I get an email. I think most would agree it’s better to know something (bad) happened before the customers start calling. The automated generation of Deadlock Graph (*.xdl) files is convenient. And event notifications have been available since SQL Server 2005. As far as I know, the feature is available in all editions, including Express Edition.

Click through for all of the code Dave used to set this up.
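
The core plumbing is only a handful of statements. Here is a hedged sketch (the object names are mine, and Dave's solution also includes the activation procedure that reads the queue and calls sp_send_dbmail):

-- msdb already has Service Broker enabled, which makes it a convenient home.
USE msdb;
GO

-- Queue and service that will receive the deadlock event messages.
CREATE QUEUE dbo.DeadlockQueue;
GO

CREATE SERVICE DeadlockService
    ON QUEUE dbo.DeadlockQueue
    ([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);
GO

-- Server-level event notification that posts each deadlock graph to the service.
CREATE EVENT NOTIFICATION DeadlockNotification
    ON SERVER
    FOR DEADLOCK_GRAPH
    TO SERVICE 'DeadlockService', 'current database';
GO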

Winnowing Down WhoIsActive Data

Kendra Little shows how to use temporary objects to pare down the results of sp_whoisactive for later storage:

I used the @schema parameter to have sp_WhoIsActive generate the schema for the table itself. Full instructions on doing this by Adam are here.

Since I care about tempdb in the case of this example, I used @output_column_list to specify that those columns should come first, followed by the rest of the columns.

I also elected to set @get_plans to 1 to get query execution plans if they’re available. That’s not free, and they can take up a lot of room, but they can contain a lot of helpful info.

This is a very useful guide. Also read the linked documentation for sp_whoisactive; there’s a huge amount of goodness in that one procedure.
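
Pieced together, the pattern looks something like this (a sketch with illustrative parameter values and a made-up table name, not Kendra's exact code):

-- Have sp_WhoIsActive generate the table definition for the chosen parameters.
DECLARE @schema VARCHAR(MAX);

EXEC dbo.sp_WhoIsActive
    @get_plans = 1,
    @output_column_list = '[start_time][tempdb%][%]',
    @return_schema = 1,
    @schema = @schema OUTPUT;

-- Create the logging table from the generated definition.
SET @schema = REPLACE(@schema, '<table_name>', 'dbo.WhoIsActiveLog');
EXEC (@schema);

-- Log a snapshot into the table, tempdb columns first, plans included.
EXEC dbo.sp_WhoIsActive
    @get_plans = 1,
    @output_column_list = '[start_time][tempdb%][%]',
    @destination_table = 'dbo.WhoIsActiveLog';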

Indirect Checkpoint And Non-Yielding Scheduler Problems

Parikshit Savjani has a post describing an issue you might experience with indirect checkpoints in SQL Server 2012 and later:

One of the scenarios where skewed distribution of dirty pages in the DPList is common is tempdb. Starting with SQL Server 2016, indirect checkpoint is turned ON by default with target_recovery_time set to 60 for the model database. Since the tempdb database is derived from model during startup, it inherits this property from the model database and has indirect checkpoint enabled by default. As a result of the skewed DPList distribution in tempdb, depending on the workload, you may experience excessive spinlock contention and exponential backoffs on the DPList in tempdb. In scenarios when the DPList has grown very long, the recovery writer may produce a non-yielding scheduler dump as it iterates through the long list (20k-30k entries) and tries to acquire the spinlock, waiting with exponential backoff if the spinlock is taken by multiple IOC routines for removal of pages.

This is worth taking a close read.
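
If you want to see where you stand, the target recovery time is visible per database; and if you decide, after reading the full post, that falling back to automatic checkpoints on tempdb is the right trade-off for your workload, it is a one-line change. A quick hedged sketch:

-- A nonzero target_recovery_time_in_seconds means indirect checkpoint is enabled.
SELECT name, target_recovery_time_in_seconds
FROM sys.databases;

-- Revert tempdb to classic automatic checkpoints (only if appropriate for your workload).
ALTER DATABASE tempdb SET TARGET_RECOVERY_TIME = 0 SECONDS;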

Bacpacing In Azure

Derik Hammer shows how to use a bacpac file to deploy an existing database to Azure SQL Database:

The recommended method for working with Azure is always PowerShell. The Azure portal and SSMS are tools there for your convenience but they do not scale well. If you have multiple databases to migrate, potentially from multiple servers, using PowerShell will be much more efficient. Scripting your Azure work makes it repeatable and works towards the Infrastructure as Code concept.

In this demonstration, the following steps will be used.

  1. Export the bacpac file to a local directory with sqlpackage.exe.

  2. Copy the bacpac to Azure Blob Storage with AzCopy.exe.

  3. Use the PowerShell AzureRM module and cmdlets to create an Azure SQL Database from the bacpac file.

Derik shows the point-and-click way as well as the PowerShell way.

Self-Analysis Of SQL Server Dump Files

Arun Sirpal walks through the SQL Server Diagnostics preview:

Notice the region to upload – If you are using a work machine I would suggest getting authorisation. The great thing here is that this is GDPR compliant.

Once ready, hit the upload button; it goes through three phases: upload, analysis, and a recommendation.

It sends your dump files to an external service, which is important enough to point out.  If you want more details on the product, Rony Chatterjee has a FAQ.

PoshBot

Warren Frame shows off ChatOps with PoshBot:

We’re going to cover the basics to get up and running with PoshBot:

  • Create a Slack bot
  • Create a PoshBot configuration
  • Run PoshBot as a service
  • Write a PoshBot plugin
  • Use PoshBot

This might seem like a lot of work, but the configuration and service are a one-time thing – writing plugins is just like writing PowerShell functions and modules!

One of my mad scientist co-workers has put together a similar bot and it tells our DBA team how servers are doing.  It’s quite useful for system reconnaissance, particularly when all you have is a phone and a trouble ticket.

Wiggle Room Files

David Klee shows one tactic for dealing with running out of disk space, in this case on an ESXi host:

Sometimes this task is harder than it sounds. If your SAN is out of space, or the SAN management tools are out of your control, you could be stuck.

But… follow a simple trick to give yourself that last little bit of wiggle room in the event that a snapshot fills a datastore.

Add a large text file to the root of the datastore that you can delete if you need headroom! I know it sounds too simple… but it’s simple and effective.

Filed under “old but good.”

Performance Comparison: Comparing Column Differences

Shane O’Neill has a column difference showdown:

The original post for this topic garnered the attention of a commenter who pointed out that the same result could be gathered using a couple of UNION ALLs and those lovely set-based EXCEPT and INTERSECT keywords.

I personally think that both options work and whatever you feel comfortable with, use that.

It did play on my mind, though: what would the performance differences be? What would the difference in STATISTICS IO, TIME be? What would the difference in Execution Plans be? Would there even be any difference between the two, or are they the same thing? How come it’s always the things I tell myself not to forget that I end up forgetting?

This may not be the most important thing to test, but it does show you a technique.
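
If you want to run the comparison yourself, the set-based side of the showdown looks something like the following (table and column names are placeholders, not Shane's test setup), with STATISTICS IO and TIME turned on so you can measure it against whatever alternative you're testing:

SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- Rows present in one table but not the other, checked in both directions.
(SELECT col1, col2, col3 FROM dbo.TableA
 EXCEPT
 SELECT col1, col2, col3 FROM dbo.TableB)
UNION ALL
(SELECT col1, col2, col3 FROM dbo.TableB
 EXCEPT
 SELECT col1, col2, col3 FROM dbo.TableA);

-- Rows that match exactly in both tables.
SELECT col1, col2, col3 FROM dbo.TableA
INTERSECT
SELECT col1, col2, col3 FROM dbo.TableB;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;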
