Press "Enter" to skip to content

Category: T-SQL Tuesday

Community Tooling for SQL Server

John McCormack reviews several community tools for SQL Server:

Statistics Parser is a web page which allows you to paste in the STATISTICS TIME, IO output from the SSMS messages tab, and it formats it into neat tables showing how much I/O happens for each table. It is immediately easier to read, and you get a handy % column on the right-hand side showing you which tables are being read the most. I find this really useful for query tuning because it lets me know where my biggest pain points are. For complex queries which touch a lot of tables, it just makes it easy to see at a glance where you should initially focus your attention. It also shows worktable and workfile tables, which serves as a handy hint that tempdb is in play.

John lists a half-dozen tools but really drills in on Statistics Parser.
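
If you want to grab that output without copying from the messages tab by hand, here's a minimal sketch using dbatools' Invoke-DbaQuery; the instance, database, and query are placeholders, and -MessagesToOutput is what surfaces the messages-tab text that Statistics Parser expects.

```powershell
# Sketch: capture STATISTICS TIME, IO output for a query so the text can be
# pasted into Statistics Parser. Assumes dbatools is installed; the instance,
# database, and query below are placeholders.
Import-Module dbatools

$query = @"
SET STATISTICS TIME, IO ON;
SELECT p.Name, SUM(sod.OrderQty) AS TotalQty
FROM Sales.SalesOrderDetail AS sod
    JOIN Production.Product AS p ON p.ProductID = sod.ProductID
GROUP BY p.Name;
"@

# -MessagesToOutput emits the messages-tab text (logical reads, CPU time, etc.)
# alongside any result rows; keep only the string messages and save them.
Invoke-DbaQuery -SqlInstance 'localhost' -Database 'AdventureWorks' -Query $query -MessagesToOutput |
    Where-Object { $_ -is [string] } |
    Set-Content -Path '.\statistics-io.txt'
```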

T-SQL Tuesday 134 Roundup

James McGillivray summarizes the results from T-SQL Tuesday #134:

When I volunteered to host a T-SQL Tuesday, I had a very different topic in mind. However, the incredible events of the last year, and in particular the immense pressure that my wife faced at work, made me realise how important it is to have ways to take breaks, both mental and physical. And while we were away in December, both of us recharging, I thought it would make a good topic for this event. It was wonderful to see the response from the #sqlfamily to my invitation, and by my count 29 different people contributed to the blog party.

I’ve tried to group posts with similar themes in this summary, and since some posts fall into multiple categories, I may mention a single post more than once. Links on names point to Twitter handles, links on descriptions point to the respective blog posts.

Click through for a rather large roundup.

T-SQL Tuesday 130 Roundup: Automate Your Stress Away

Elizabeth Noble recaps T-SQL Tuesday #130:

I’m really grateful for all the bloggers who took part this month, especially since automation has been a topic discussed before. However, it’s hard for me to get too much of my day-to-day work automated, and I was really looking forward to these topics so that I could learn new tasks I could automate myself. With that said, let’s see all the wonderful ideas people contributed this month. And if you’re like me, you’re going to want to put some of this automation in place as soon as possible.

Click through for 24 entries.

GetAllTheErrorLogs: Combining Multiple Log Sources

Aaron Bertrand has a new project:

There’s a whole lot of grunt work in there that shouldn’t have to be done by a person. I don’t think you could automate the whole thing, because it is hard to predict exactly what events will be interesting and not, but I think 90% is achievable.

A colleague mentioned that they want to build something that would help, but even when that happens, it would end up in proprietary code that only helps us. I saw Drew Furgiuele’s post on Building a Better Get-SQLErrorLog, and that gave me some ideas for what I would build. After reaching out to Drew, I created a GitHub repository with a working name of GetAllTheErrorLogs. Its elevator pitch is a simple sentence:

PowerShell to assemble a timeline combining salient events from Windows Event Log, Failover Cluster log, and SQL Server errorlog.

Click through for the details as well as Aaron’s current progress.
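
As a rough illustration of the timeline idea (this is not Aaron's code), you could merge Windows event log entries and SQL Server errorlog entries into one time-sorted list with Get-WinEvent and dbatools' Get-DbaErrorLog; the instance name and time window below are placeholders.

```powershell
# Sketch of the timeline idea (not the GetAllTheErrorLogs code): pull recent
# Windows System-log errors and SQL Server errorlog entries, normalize them to
# a common shape, and sort by time. Instance and window are placeholders.
Import-Module dbatools

$instance = 'SQLPROD01'
$since    = (Get-Date).AddHours(-4)

# Critical and Error events from the Windows System log
$winEvents = Get-WinEvent -FilterHashtable @{ LogName = 'System'; Level = 1, 2; StartTime = $since } |
    Select-Object @{n = 'Time'; e = { $_.TimeCreated }},
                  @{n = 'Source'; e = { 'WindowsEventLog' }},
                  @{n = 'Text'; e = { $_.Message }}

# SQL Server errorlog entries over the same window
$sqlEvents = Get-DbaErrorLog -SqlInstance $instance -After $since |
    Select-Object @{n = 'Time'; e = { $_.LogDate }},
                  @{n = 'Source'; e = { 'SQLErrorLog' }},
                  @{n = 'Text'; e = { $_.Text }}

# One combined timeline, oldest first
@($winEvents) + @($sqlEvents) | Sort-Object Time | Format-Table -AutoSize
```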

Automating Database Restorations

Hugo Kornelis walks us through automated restoration of database backups:

Now I’ve been to quite a few conferences. And I’ve heard a lot of DBAs talk about best practices. One of the constants in those talks is: automate your restores. So I felt confident that a quick internet search would surely be enough to find me an existing script for restoring a database. Sure, I’d need to modify it to restore to a test database, but that should be minimal effort.

To my surprise, I was unable to find a script. Is scripting this so easy that every DBA can do it with their eyes closed, and nobody feels a need to share it? Is it so hard that they all decided they’re sitting on gold and will only share it for big money? Or were my internet search skills simply severely lacking?

Anyway, bottom line is that I had to do it myself. And I’ll share the result in this post.

Click through for the script and a detailed explanation of how it works.
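
Hugo builds his solution in T-SQL; for comparison, here's a hedged sketch of the same idea using dbatools' Restore-DbaDatabase, with the instance, backup path, and database names as placeholders.

```powershell
# Sketch: restore the latest backups of a production database as a test copy.
# This is an alternative to the T-SQL approach in Hugo's post, not his script.
# Instance, path, and database names are placeholders.
Import-Module dbatools

$restore = @{
    SqlInstance         = 'SQLTEST01'
    Path                = '\\backupserver\sql\Production\AdventureWorks'
    DatabaseName        = 'AdventureWorks_Test'
    ReplaceDbNameInFile = $true   # rename data/log files to match the new name
    WithReplace         = $true   # overwrite the test copy if it already exists
}
Restore-DbaDatabase @restore
```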

Automation Tips for ETL

Richard Swinbank shares some tips around automating ETL processes:

I spent a good part of my career in the UK National Health Service (NHS). We often talk about data engineering problems in terms of handling large volumes of data, but the challenge in the NHS was frequently complexity – even a small acute hospital might have a dozen or more separate systems, running on various DBMS backends, with many tables of interest for management or central reporting.

You might need to extract data from hundreds of (mostly small) tables to populate your data warehouse – my challenge was to make this as quick and easy to implement as possible.

Click through for the tips.
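
To give a flavor of the metadata-driven approach (this is a sketch under my own assumptions, not Richard's code), you could drive the extracts from a table list with dbatools' Copy-DbaDbTableData; all the names here are placeholders.

```powershell
# Sketch of metadata-driven extraction (not Richard's code): loop over a table
# list instead of writing one package per table. All names are placeholders.
Import-Module dbatools

# In practice this list would come from a metadata table, not be hard-coded
$tables = @('dbo.Patient', 'dbo.Admission', 'dbo.WardStay')

foreach ($table in $tables) {
    $copy = @{
        SqlInstance         = 'SOURCE01'
        Database            = 'HospitalPAS'
        Destination         = 'DWSERVER01'
        DestinationDatabase = 'StagingDB'
        Table               = $table
        DestinationTable    = $table
        AutoCreateTable     = $true   # create the staging table on first run
        Truncate            = $true   # truncate-and-load each extract
    }
    Copy-DbaDbTableData @copy
}
```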

Automate sp_whoisactive Runs

John McCormack shows how to save the output from sp_whoisactive for later research:

When I took over a fairly troublesome SQL Server, I was always being asked what was running when the SQL Server was slow. That was fine when it was during the day and I could look. But often I would be asked, why was the server slow at 9pm last Friday? Or some other time. It’s not always the easiest thing to do as SQL Server doesn’t natively keep this information for you.

A quick but effective way to automate this is to download and schedule Adam Machanic’s sp_whoisactive stored procedure. I run this every 10 minutes but I have seen it scheduled as often as every 30-60 seconds. It depends on your system and your needs.

This is in place where I work, and I’ve set it up when consulting for companies without the budget for fancy tools. I really appreciate that Adam Machanic made it a simple option to insert results into a table. Also, John has a step to delete older data, which is critical.
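
A minimal sketch of the capture-and-purge step, assuming sp_WhoIsActive is already installed (the logging table and retention window are placeholders; John schedules his version as an Agent job):

```powershell
# Sketch of the capture-and-purge step (John's version runs as an Agent job).
# Assumes sp_WhoIsActive is installed; table name and retention are placeholders.
Import-Module dbatools

$capture = @"
-- Let sp_WhoIsActive generate the logging table on first run
IF OBJECT_ID('dbo.WhoIsActiveLog') IS NULL
BEGIN
    DECLARE @schema nvarchar(max);
    EXEC dbo.sp_WhoIsActive @return_schema = 1, @schema = @schema OUTPUT;
    SET @schema = REPLACE(@schema, '<table_name>', 'dbo.WhoIsActiveLog');
    EXEC (@schema);
END;

-- Append the current snapshot
EXEC dbo.sp_WhoIsActive @destination_table = 'dbo.WhoIsActiveLog';

-- John's cleanup idea: keep only a rolling window of history
DELETE FROM dbo.WhoIsActiveLog
WHERE collection_time < DATEADD(DAY, -14, GETDATE());
"@

Invoke-DbaQuery -SqlInstance 'SQLPROD01' -Database 'DBAUtility' -Query $capture
```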

Automating dbachecks Runs

Jess Pomfret automates runs of dbachecks via scheduled task:

To automate the running of our daily checks, we’ll first create a PowerShell script and then schedule it using Task Scheduler. If you have other enterprise scheduling tools available, you could easily use those instead to invoke the PowerShell script.

The script for my example, shown below, is pretty simple. I have a section to define where the data will be stored (the ability to save dbachecks result information straight into a database was introduced with dbachecks 2.0 and so I would highly recommend updating if you’re on an earlier version).

Click through for the full explanation and code.
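
The general shape might look like the sketch below (not Jess's exact script); the instances, check, and schedule are placeholders, and Convert-DbcResult plus Write-DbcTable are, as I recall, the dbachecks 2.0 cmdlets for saving results to a database.

```powershell
# Sketch of a daily-checks script plus its Task Scheduler registration (not
# Jess's exact code). Instances, check, database, and schedule are placeholders.
Import-Module dbachecks

# Run one check quietly and stream the results into a results table
# (saving to a database arrived in dbachecks 2.0, per Jess's post)
Invoke-DbcCheck -SqlInstance 'SQLPROD01', 'SQLPROD02' -Check LastFullBackup -Show None -PassThru |
    Convert-DbcResult -Label 'DailyChecks' |
    Write-DbcTable -SqlInstance 'DBAUTIL01' -Database 'dbachecks'

# Register the script (hypothetical path) to run every morning at 06:00
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\Scripts\Invoke-DailyChecks.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName 'dbachecks-daily' -Action $action -Trigger $trigger
```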

Automating a Permissions Audit with dbatools

Garry Bargsley walks us through the process of a permissions audit made easier:

My post for T-SQL Tuesday #130 is going to be how I automated my company’s SQL Server permissions audit process. Even though this only occurs every quarter, it is something that I, as the DBA, never look forward to and the business partners never really enjoy. In an effort to streamline the process from start to finish, I built automation using PowerShell, dbatools, and ImportExcel. The process now builds the permissions audit file, extracts individual users from Active Directory, and e-mails the information to the appropriate team(s) for review.

Read on for details about the process.
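
As a rough sketch of the moving parts (not Garry's actual process): dbatools gathers the permissions, ImportExcel writes the workbook, and Send-MailMessage routes it for review; the server names, paths, and addresses are placeholders.

```powershell
# Sketch of the moving parts (not Garry's actual process): gather permissions
# with dbatools, write an Excel audit file with ImportExcel, and mail it out.
# Server names, file paths, and e-mail addresses are placeholders.
Import-Module dbatools, ImportExcel

$auditFile = "C:\Audit\PermissionsAudit_$(Get-Date -Format 'yyyyMMdd').xlsx"

# One worksheet per instance of user permission details
foreach ($instance in 'SQLPROD01', 'SQLPROD02') {
    Get-DbaUserPermission -SqlInstance $instance |
        Export-Excel -Path $auditFile -WorksheetName $instance -AutoSize
}

# Send the workbook to the reviewing team(s)
$mail = @{
    From        = 'dba@example.com'
    To          = 'review-team@example.com'
    Subject     = 'Quarterly SQL Server permissions audit'
    Attachments = $auditFile
    SmtpServer  = 'smtp.example.com'
}
Send-MailMessage @mail
```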
