In my previous post I showed an elegant way to extract case-sensitive data from an XML document, but what if we’d like to use that method to extract all case-sensitive data?
Read on to see how.
A Fine Slice Of SQL Server
In this video, I show how we can use the APPLY operator to reshape datasets, allowing us to unpivot tables and also calculate the greatest and least values for a row.
If you look closely at the scripts, you’ll see 08 and 10. In the source control repo, I also have a script 09 that covers splitting strings. Using APPLY to split strings has always been a bit of a niche case, but prior to SQL Server 2016’s introduction of STRING_SPLIT() and SQL Server 2022’s improvement of the function, I could make the case that it sometimes made sense to know how to split strings via APPLY. Today, not so much, which is why I tossed that demo from the video.
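The unpivoting technique looks something like this. This is my own sketch with a hypothetical table and column names, not the exact demo from the video:

```sql
-- Hypothetical table: one row per product with four quarterly sales columns.
-- CROSS APPLY (VALUES ...) turns those four columns into four rows.
SELECT
    s.ProductID,
    q.QuarterName,
    q.SalesAmount
FROM dbo.QuarterlySales AS s
    CROSS APPLY (VALUES
        ('Q1', s.Q1Sales),
        ('Q2', s.Q2Sales),
        ('Q3', s.Q3Sales),
        ('Q4', s.Q4Sales)
    ) AS q(QuarterName, SalesAmount);

-- The same pattern gives per-row greatest and least values:
SELECT
    s.ProductID,
    MAX(q.SalesAmount) AS GreatestQuarter,
    MIN(q.SalesAmount) AS LeastQuarter
FROM dbo.QuarterlySales AS s
    CROSS APPLY (VALUES
        (s.Q1Sales), (s.Q2Sales), (s.Q3Sales), (s.Q4Sales)
    ) AS q(SalesAmount)
GROUP BY s.ProductID;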
In this video, I show how we can use the APPLY operator to remove redundancy in the SELECT clause and simplify complex calculations, all with zero performance impact.
This is, as I’ve said in the past, my favorite use case for the APPLY operator. As I’ve become older and (even) more crotchety, I’ve sided more and more with “easy to read” versus “runs faster” for code. And when you get “easy to read” with no impact on “runs faster,” I’m all in.
The accounting scenario I show may be a fairly extreme case, but I’d bet that queries similar to this abound in almost any company. A big part of why complex reporting queries are so complex comes from repetition of calculations.
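The general pattern looks something like the following. This is a sketch with made-up column names, not the accounting demo itself:

```sql
-- Instead of repeating an expression everywhere it is needed:
--   SELECT o.Price * o.Quantity * (1 - o.Discount) AS NetAmount,
--          o.Price * o.Quantity * (1 - o.Discount) * o.TaxRate AS TaxAmount, ...
-- define it once in a CROSS APPLY and reference it by name.
SELECT
    o.OrderID,
    calc.NetAmount,
    calc.NetAmount * o.TaxRate AS TaxAmount,
    calc.NetAmount * (1 + o.TaxRate) AS TotalDue
FROM dbo.OrderLines AS o
    CROSS APPLY (VALUES
        (o.Price * o.Quantity * (1 - o.Discount))
    ) AS calc(NetAmount);
```

The optimizer inlines the expression, so this is purely a readability win: the plan is the same as if you had repeated the calculation in each column.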
Erik Darling talks about one of my favorite T-SQL features.
As usual, Erik leaves me hanging with respect to the lack of description or snippet I can use as a graf for enticing my wonderful audience to watch his video. Thus, I have to come up with my own. Erik’s video is actually a really good companion piece to my video that also dropped this week, as we both cover the same general concept.
In this video, I show how we can use the APPLY operator to find records associated with a “parent” table. We look at getting the latest record, as well as any arbitrary number of records. We also spend time digging into under what circumstances the APPLY operator out-performs other common options.
During the outro segment, I dig into under what circumstances APPLY is the best tool for the job, so stick around for the end.
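The core pattern is an APPLY with TOP. This is my own sketch with a hypothetical schema, not the exact script from the video:

```sql
-- Latest order per customer. OUTER APPLY keeps customers with no orders;
-- use CROSS APPLY to filter them out instead.
SELECT
    c.CustomerID,
    c.CustomerName,
    o.OrderDate,
    o.TotalDue
FROM dbo.Customer AS c
    OUTER APPLY (
        SELECT TOP (1) ord.OrderDate, ord.TotalDue
        FROM dbo.OrderHeader AS ord
        WHERE ord.CustomerID = c.CustomerID
        ORDER BY ord.OrderDate DESC
    ) AS o;
-- Change TOP (1) to TOP (3) to get the latest three orders per customer.
```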
Louis Davidson gives it the college try:
When I was perusing my LinkedIn feed the other day, I came across this thread about using SELECT *. In one of the replies, Aaron Cutshall noted that: “Another real performance killer is SELECT DISTINCT, especially when combined with UNION. I have a whole list of commonly used hidden performance killers!”

Which started my brain thinking… What does happen when you use these together? And when you use UNION on a set with non-distinct rows, what happens? So for the next few hours I started writing.
Read on for Louis’s findings.
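A quick way to see the interaction for yourself (my sketch, not Louis’s scripts): UNION already deduplicates the combined result, so adding DISTINCT to each branch buys nothing and can force extra sort or aggregate work.

```sql
-- UNION removes duplicates across the combined set, so the result here
-- is the same with or without DISTINCT on each branch.
SELECT val FROM (VALUES (1), (1), (2)) AS a(val)
UNION
SELECT val FROM (VALUES (2), (3), (3)) AS b(val);
-- Three rows: 1, 2, 3.

-- When duplicates should survive, UNION ALL skips the dedupe entirely:
SELECT val FROM (VALUES (1), (1), (2)) AS a(val)
UNION ALL
SELECT val FROM (VALUES (2), (3), (3)) AS b(val);
-- All six rows come back.
```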
Anthony Nocentino cuts out the middleman:
In this post, I’ll walk you through a T-SQL script that creates application-consistent snapshots on Pure Storage FlashArray, all from within SQL Server, no external tooling. SQL Server 2025 introduces a powerful new feature: the sp_invoke_external_rest_endpoint stored procedure. This enhancement makes calling REST APIs directly from T-SQL easier than ever. Combining this new capability with Pure Storage’s API allows us to orchestrate snapshot operations seamlessly, with no external tools or scripts required.
Click through for the process. I know that sp_invoke_external_rest_endpoint will be controversial for DBAs. That’s why I think it’s good to have examples of how it can be useful before the knee-jerk reaction of “this is automatically bad” takes over.
In this video, I show how we can use the APPLY operator to invoke ad hoc functions. That leads to a powerful use case: pre-aggregating data.
Every once in a while, this tip will save a considerable amount of CPU time and database effort.
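The pre-aggregation idea, sketched with a hypothetical schema of my own:

```sql
-- Aggregate child rows once per customer inside the APPLY, rather than
-- joining all order rows and aggregating the whole result set at the end.
SELECT
    c.CustomerID,
    c.CustomerName,
    agg.OrderCount,
    agg.TotalSpend
FROM dbo.Customer AS c
    CROSS APPLY (
        SELECT COUNT(*) AS OrderCount, SUM(o.TotalDue) AS TotalSpend
        FROM dbo.OrderHeader AS o
        WHERE o.CustomerID = c.CustomerID
    ) AS agg;
```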
Jeffry Schwartz looks for exceptions:
Given the complexity, I realized that validating all intermediate and final result sets was essential to ensure that tuning changes did not alter any report results. To support this validation, I saved interim and final result sets into tables for direct comparison.
For these comparisons, the EXCEPT and INTERSECT operators proved invaluable.
Click through for the full story. I’ve always liked using these set operations in ETL jobs because they automatically know how to handle NULL, so this approach is more robust than rigging your own comparisons.
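The comparison idiom is worth knowing. A sketch with hypothetical table names, not Jeffry’s actual validation scripts:

```sql
-- Two result sets match when EXCEPT returns zero rows in both directions.
-- Unlike col_a = col_b predicates, EXCEPT treats NULLs in the same column
-- position as equal, so no ISNULL() scaffolding is needed.
SELECT * FROM dbo.ResultsBefore
EXCEPT
SELECT * FROM dbo.ResultsAfter;   -- rows lost by the tuning change

SELECT * FROM dbo.ResultsAfter
EXCEPT
SELECT * FROM dbo.ResultsBefore;  -- rows introduced by the tuning change
```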
Dennes Torres tries out the JSON data type in Azure SQL Database:
Before this new field type, JSON data was typically stored in varchar(max) columns. There are many features to use with JSON values stored in varchar(max) columns and variables, but storing JSON as regular strings is still limited.

The built-in JSON type expands the possibilities. Using an actual JSON column, it becomes easier to build constraints related to JSON columns, for example.
Dennes also spends a lot of the article covering the JSON_ARRAYAGG() and JSON_OBJECTAGG() functions.
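To give a flavor of both pieces, here is a sketch with a hypothetical schema of my own, not Dennes’s examples:

```sql
-- A native json column with a CHECK constraint on its contents.
CREATE TABLE dbo.DeviceTelemetry
(
    DeviceID INT  NOT NULL,
    Reading  JSON NOT NULL,
    CONSTRAINT CK_Reading_HasTemp
        CHECK (JSON_VALUE(Reading, '$.temperature') IS NOT NULL)
);

-- JSON_ARRAYAGG() builds a JSON array from a column of values;
-- JSON_OBJECTAGG() builds a JSON object from key : value pairs.
SELECT JSON_ARRAYAGG(p.ProductName)               AS Products,
       JSON_OBJECTAGG(p.ProductName : p.ListPrice) AS PriceMap
FROM dbo.Product AS p;
```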