Press "Enter" to skip to content

Category: T-SQL

The Joy Of OUTPUT

Mark Wilkinson shows off some fun stuff you can do with the OUTPUT clause:

A common command in the Linux world is the tee command. What tee allows you to do is pipe the output of a command to a file as well as the console. This same functionality can be implemented using multiple OUTPUT clauses in a T-SQL statement. In this example we are going to update a few hundred records. When the update statement is run, not only will it update the MyGuid table but it will update a log table and also return the result of the update. This is accomplished by using two OUTPUT clauses.

Check it out.  I don’t use OUTPUT that often, but it can be quite useful when in a pinch or if you want to prevent scanning a table twice.
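
As a rough illustration of the dual-OUTPUT pattern the quote describes (the column names and log table here are hypothetical, not Mark's actual schema):

```sql
-- Hypothetical schema: dbo.MyGuid(Id, Value) plus a dbo.MyGuidLog audit table.
UPDATE dbo.MyGuid
SET    Value = NEWID()
OUTPUT inserted.Id, deleted.Value, inserted.Value, SYSUTCDATETIME()
  INTO dbo.MyGuidLog (Id, OldValue, NewValue, ModifiedDate)   -- "tee" the change into a log table...
OUTPUT inserted.Id, inserted.Value;                           -- ...and also return it to the caller
```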

Comments closed

Filtering Alert Noise With A Leaky Bucket Algorithm

I have a post implementing a leaky bucket algorithm in T-SQL:

Now that we have a table, we want to do something with it.  The most naive solution would be to fire off an alert every time a row gets added to this table.  The problem with this solution is that it can flood our inbox with useless information.  Suppose the developers push out an API change that breaks everything.  The first 500 response will be important information.  The second one might be important because it’s confirmatory.  The six hundredth alert will not help.  And heaven help you if you’ve got this alert tied to your PagerDuty account…

So let’s only alert if we get to a certain threshold—we’ll call it 5 messages.  Fewer than 5 and we can assume it’s a blip or just somebody doing something weird.  The next part of the problem is, 5 messages per what?  We don’t want to alert for every fifth error message.  Let’s say we get one error message a year, so there was one in 2014, one in 2015, one in 2016, one in 2017, and now one in 2018.  If we simply set our threshold to report on every fifth error, we’ll get an alert in 2018, but most likely, the prior years’ errors are long gone by now, so that’s not helpful.

Read on for the solution.  I’ve been quite happy with the solution in practice, as it has cut down the number of spurious alert e-mails to practically nil.
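
The full leaky-bucket implementation is in the post; as a very rough sketch of the thresholding idea (the table name, window, and threshold below are placeholders):

```sql
DECLARE @Threshold     int = 5,    -- minimum number of errors before we alert
        @WindowMinutes int = 30;   -- how far back to look

IF (SELECT COUNT(*)
    FROM dbo.ErrorLog                                              -- placeholder error-capture table
    WHERE LogDate >= DATEADD(MINUTE, -@WindowMinutes, SYSUTCDATETIME())
   ) >= @Threshold
BEGIN
    PRINT 'Threshold exceeded; send the alert here (e.g., msdb.dbo.sp_send_dbmail).';
END;
```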

Comments closed

Generating Passwords In T-SQL And PowerShell

Dave Mason shares a couple methods for generating good passwords:

There’s really nothing special there. On line 7 I specify how many characters long I want the password to be. I can run the code as many times as needed, or put it in a UDF or a loop if I want to get fancy.

Recently, though, I had the need to generate passwords outside of a T-SQL environment. I immediately went looking in the .NET Framework, and none to my surprise, I found something: the function System.Web.Security.Membership.GeneratePassword(). I did some initial testing in C#, then decided to proceed with a PowerShell script.

Click through for the scripts.
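
Dave's actual scripts are in the post; as an unofficial, T-SQL-only sketch, CRYPT_GEN_RANDOM can drive a simple generator (the character set and length below are arbitrary choices):

```sql
DECLARE @length   int = 16,
        @chars    varchar(100) = 'abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23456789!@#$%',
        @password varchar(100) = '';

-- Pick one random character from the allowed set per iteration.
WHILE LEN(@password) < @length
    SET @password += SUBSTRING(@chars,
                               (CAST(CRYPT_GEN_RANDOM(2) AS int) % LEN(@chars)) + 1,
                               1);

SELECT @password AS GeneratedPassword;
```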

Comments closed

Using DATEADD Instead Of DATEDIFF

Michael J. Swart points out a bit of trickery with DATEDIFF:

I assumed that the DATEDIFF expression I wrote worked this way: subtract the two dates to get a timespan value and then return the number of seconds (rounded somehow) in that timespan.

But that’s not how it works. The docs for DATEDIFF say:

“Returns the count (signed integer) of the specified datepart boundaries crossed between the specified startdate and enddate.”

There’s no rounding involved. It just counts the ticks on the clock that are heard during a given timespan.

Read the whole thing.
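
To see the boundary-counting behavior, and the DATEADD-style comparison the title alludes to, here's a quick illustration (the dates are just example values):

```sql
DECLARE @start datetime2 = '2024-01-01 00:00:00.999',
        @end   datetime2 = '2024-01-01 00:00:01.000';

SELECT DATEDIFF(SECOND, @start, @end)            AS BoundariesCrossed,   -- 1, even though only 1ms elapsed
       CASE WHEN @end >= DATEADD(SECOND, 1, @start)
            THEN 1 ELSE 0 END                    AS FullSecondElapsed;   -- 0: a full second has not passed
```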

Comments closed

Creating Dynamic Pivot Tables

Ben Richardson shows how to use dynamic SQL to create pivot tables with arbitrary numbers of pivot elements:

The headings of the columns are the individual values inside the city column. We specified these values inside the pivot operator in our query.

The most tedious part of creating pivot tables is specifying the values for the column headings manually. This is the part that is most prone to errors, particularly if the data in your online data source changes. We cannot be sure that the values we specified in the pivot operator will remain in the database the next time we create this pivot table.

For instance, in our script, we specified London, Liverpool, Leeds and Manchester as values for the headings of our pivot table. These values existed in the City column of the student table. What if somehow one or more of these values are deleted or updated? In such cases, null will be returned.

A better approach would be to create a dynamic query that will return a full set of values from the column from which you are trying to generate your pivot table.

Click through to see how to build this.
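
As a rough sketch of that approach (assuming the post's dbo.student table with city and id columns, and SQL Server 2017+ for STRING_AGG):

```sql
DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- Build the column list from whatever city values currently exist.
SELECT @cols = STRING_AGG(CAST(QUOTENAME(city) AS nvarchar(max)), N', ')
FROM (SELECT DISTINCT city FROM dbo.student) AS c;

SET @sql = N'
SELECT *
FROM (SELECT city, id FROM dbo.student) AS src
PIVOT (COUNT(id) FOR city IN (' + @cols + N')) AS p;';

EXEC sys.sp_executesql @sql;
```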

Comments closed

Matrix Transposition In T-SQL

Phil Factor has some fun transposing a matrix using T-SQL:

What I’m doing is simply converting the table into its JSON form, and then using this to create a table using the multi-row VALUES syntax, which paradoxically allows expressions. The expression I’m using is JSON_Value, which allows me to effectively dictate the source within the table, via that JSON Path expression, and the destination. As it is an expression, I can do all sorts of manipulation as well as a transpose. I could, if I wanted, (in SQL 2017) provide that path parameter as a variable. This sort of technique can be used for several other reporting purposes, and it is well worth experimenting with because it is so versatile.

That is not at all what I would have thought up; very interesting approach.  I’d probably just be lazy and shell out to R Services.
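
A stripped-down sketch of the idea (a made-up 2x3 matrix, not Phil's code): serialize the table with FOR JSON, then address each cell by JSON path inside a VALUES constructor, swapping rows and columns as you go.

```sql
DECLARE @json nvarchar(max) =
    (SELECT c1, c2, c3
     FROM (VALUES (1, 2, 3), (4, 5, 6)) AS m (c1, c2, c3)
     FOR JSON PATH);
-- @json is now: [{"c1":1,"c2":2,"c3":3},{"c1":4,"c2":5,"c3":6}]

SELECT *
FROM (VALUES
        (JSON_VALUE(@json, '$[0].c1'), JSON_VALUE(@json, '$[1].c1')),
        (JSON_VALUE(@json, '$[0].c2'), JSON_VALUE(@json, '$[1].c2')),
        (JSON_VALUE(@json, '$[0].c3'), JSON_VALUE(@json, '$[1].c3'))
     ) AS transposed (r1, r2);
```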

Comments closed

Joins And Parentheses

Shane O’Neill walks through different ways of grouping tables in a SQL query:

Asker: that’d be awesome if i can inner join two other tables instead of the table mentioned after FROM keyword
Me: …wait, what?
A: He’s asking
t1 left join t12
t1 left join t13
t12 inner join t13
M: em…it’s possible but it’s…iffy
A:  i wanna learn it.
do your magic

I’ve seen this in action before, but rewrote the queries not to do this.  The problem is that as the query gets more complicated, it becomes much harder to diagram things mentally.  I don’t think I’ve seen a use yet that I couldn’t rewrite to be simpler.
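
The grouping being asked about looks something like this (t1, t12, t13 and the join keys are placeholders):

```sql
-- t12 and t13 are inner joined first; that combined unit is then left joined to t1.
SELECT *
FROM t1
     LEFT JOIN (t12
                INNER JOIN t13
                    ON t13.t12_id = t12.id)
         ON t12.t1_id = t1.id;
```

The parentheses are actually optional in T-SQL; the placement of the ON clauses alone determines the grouping, which is a big part of why this style is so hard to read at a glance.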

Comments closed

Casting And Conversion Defaults

Greg Low is a bit disappointed with TRY_CAST and TRY_CONVERT:

Surprised? I’d have to say that I was. Now as my buddy Adam Machanic pointed out, it’s not the fault of TRY_CAST and TRY_CONVERT because they just TRY to do a CAST and a CONVERT. And it’s the original functions that have the bizarre behavior.

Can’t say that I love this because it means that I can’t use these functions for their purpose, except for decimal. So that then left me wondering which types had this behavior.

Check it out.  One way to get around this default behavior could be to use NULLIF, so TRY_CAST(NULLIF(@InputVar, '') AS INT).
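
A quick way to see the default Greg describes, along with the NULLIF workaround:

```sql
SELECT TRY_CAST('' AS int)              AS EmptyStringCast,    -- 0, not NULL
       TRY_CAST(NULLIF('', '') AS int)  AS NullIfWorkaround;   -- NULL
```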

Comments closed

Updating Statistics On System Tables

Dan Guzman shows that you can update statistics on system tables:

The solution in many cases is to simply update statistics on the underlying system table indexes used by the problem DMV queries. This can be done selectively by identifying the system table indexes referenced in the seek and scan operators of the problem query's execution plan and then executing UPDATE STATISTICS on each index. However, the task is somewhat tedious.

Alternatively, one can simply update stats on all the system tables. Below is a script that generates and executes DDL to update stats on all non-empty system tables, making quick work of this.

Click through for the script, as well as an interesting note if you try to use constructs like @sql = @sql + N'some string' in your code.
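
Dan's script is more thorough (it limits itself to non-empty tables); as a bare-bones sketch of the idea on SQL Server 2017+, where STRING_AGG sidesteps the variable-concatenation quirk mentioned above:

```sql
DECLARE @sql nvarchar(max);

-- Generate one UPDATE STATISTICS statement per system base table.
SELECT @sql = STRING_AGG(
           CAST(N'UPDATE STATISTICS '
                + QUOTENAME(SCHEMA_NAME(o.schema_id)) + N'.'
                + QUOTENAME(o.name) + N';' AS nvarchar(max)),
           CHAR(13) + CHAR(10))
FROM sys.objects AS o
WHERE o.type = 'S';   -- 'S' = system base table

EXEC sys.sp_executesql @sql;
```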

Comments closed

Don’t Forget NOCOUNT

Lonny Niederstadt shows just how expensive printing out result counts can be:

OK.  Now Aaron Bertrand has a post from February 2016…
Performance Surprises and Assumptions : SET NOCOUNT ON
https://sqlperformance.com/2016/02/t-sql-queries/nocount

In that blog post the potential performance benefit of NOCOUNT ON was elusive.  And, to be honest, this round of my tests was NOT to learn about NOCOUNT but other stuff.  Just happened to stumble on NOCOUNT when I was sifting through stuff after some early testing.  But with these results in hand, maybe NOCOUNT has a few more performance surprises for all of us 🙂

For a single run, printing out counts isn’t that expensive, but when doing a lot of work, it can add up.
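
For reference, the setting itself is a one-liner at the top of a procedure or script (the table names below are placeholders):

```sql
CREATE OR ALTER PROCEDURE dbo.LoadStuff
AS
BEGIN
    SET NOCOUNT ON;   -- suppress the "N rows affected" message for each statement

    INSERT INTO dbo.TargetTable (Id)
    SELECT Id
    FROM dbo.SourceTable;
END;
```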

Comments closed