This has come up a few times recently. I find it rather fascinating, and I can never seem to remember how to do it properly… so in other words, it’s a perfect subject for a blog post.
Basically, you can use VALUES to create a table within a query. I’ve seen it done in a number of places. Mostly when demoing something and you want some data, but don’t want to actually create a table. I’ve also seen it used to create a numbers table, create test data, etc. Really, any case where you want a short list of values but don’t want to create an actual (or even temp) table to store them.
Click through for examples on how to construct and use this virtual table as a quick replacement for creating a temporary table or table variable.
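As a minimal sketch of the technique (table and column names are made up), a VALUES row constructor can be aliased as a derived table and queried like any other table:

```sql
-- A "table" built entirely from the VALUES clause; no real or temp table needed.
SELECT v.Id, v.Name
FROM (VALUES
        (1, 'Alpha'),
        (2, 'Beta'),
        (3, 'Gamma')
     ) AS v (Id, Name);
```

This also works as the source of an INSERT or as one side of a JOIN, which is what makes it handy for quick test data.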
Like other changes to TempDB, the new memory optimization requires a restart of the SQL Server service in order to take effect. Once the service is restarted, system tables in TempDB are memory-optimized (it should be that way for RTM, but in CTP 3.0 it may be that not all system tables have been converted to Hekaton). You can reverse this setting with the following command, again followed by a restart of the SQL Server service:
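Based on the feature being described (SQL Server 2019’s memory-optimized tempdb metadata), the commands to turn it on and off look like this:

```sql
-- Enable memory-optimized tempdb metadata (takes effect after a service restart).
ALTER SERVER CONFIGURATION SET MEMORY_OPTIMIZED TEMPDB_METADATA = ON;

-- Reverse the setting (again followed by a service restart).
ALTER SERVER CONFIGURATION SET MEMORY_OPTIMIZED TEMPDB_METADATA = OFF;
```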
There are a couple of important notes that Ned gives us around accessing these metadata tables in scripts.
A while back I learned that it’s possible to create temporary stored procedures in SQL Server.
I never put that knowledge into practice however because I struggled to think of a good use case for when a temporary stored procedure would be preferable to a permanent stored procedure.
Not long ago I encountered a scenario where using a temporary stored procedure was the perfect solution to my problem.
Those scenarios are rare but Bert did hit one of them.
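For anyone who hasn’t seen one before, here is a sketch of a local temporary stored procedure (the procedure name and the dbo.Orders table are illustrative):

```sql
-- A local temporary procedure: visible only to the current session
-- and dropped automatically when that session ends.
CREATE PROCEDURE #GetOrderCount
    @CustomerId int
AS
BEGIN
    SELECT COUNT(*) AS OrderCount
    FROM dbo.Orders                -- dbo.Orders is an assumed example table
    WHERE CustomerId = @CustomerId;
END;
GO

EXEC #GetOrderCount @CustomerId = 42;
```

A double-hash name (##GetOrderCount) would make it a global temporary procedure, visible to all sessions.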
TL;DR – Update to the latest CU, create multiple tempdb files, if you’re on SQL 2014 or earlier enable TF 1117 and 1118, if you’re on SQL 2016 enable TF 3427.
And now it’s time for everyone’s favorite SQL Server topic – tempdb! In this article, I’d like to cover some recent changes that you may not be aware of that can help alleviate some common performance issues for systems that have a very heavy tempdb workload. We’re going to cover three different scenarios here:
There’s some good information in here so don’t just say tl;dr.
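The TL;DR steps above might be sketched like this (file path, size, and growth values are illustrative, not recommendations):

```sql
-- Spread tempdb allocation contention by adding data files
-- (repeat per file; match sizes and growth settings across files).
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = 'T:\tempdb2.ndf',
          SIZE = 8GB, FILEGROWTH = 512MB);

-- SQL Server 2014 and earlier: TF 1118 forces uniform extent allocations,
-- TF 1117 grows all files in a filegroup together. On SQL Server 2016+
-- these behaviors are already the default for tempdb.
DBCC TRACEON (1117, 1118, -1);
-- Note: DBCC TRACEON does not survive a restart; for a permanent setting,
-- add -T1117 and -T1118 (or -T3427 on 2016) as startup parameters.
```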
That’s what it looks like when your system is under heavy contention due to a lot of queries partying in table variables: for every second on the clock, this system’s cores rack up a combined 48 seconds waiting on PAGELATCH_UP. That’s awesome.
What about temp tables instead, you ask? Same symptoms: PAGELATCH_UP.
Read the whole thing.
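If you want to check whether your own system shows this symptom, a quick look at the accumulated wait statistics is one place to start:

```sql
-- Heavy PAGELATCH_UP / PAGELATCH_EX waits on tempdb pages are the
-- classic signature of allocation contention from temp objects.
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type LIKE 'PAGELATCH%'
ORDER BY wait_time_ms DESC;
```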
Inserts and other modifications to table variables can’t be parallelized. This is a product limitation, and the XML warns us about it.
The select could go parallel if the cardinality estimate were more accurate. This could potentially be addressed with a recompile hint, or with Trace Flag 2453.
Click through to see an example of what Erik means.
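A hypothetical shape of the problem and the recompile workaround (dbo.Numbers and dbo.Orders are assumed example tables):

```sql
DECLARE @t TABLE (id int PRIMARY KEY);

INSERT INTO @t (id)            -- inserts into a table variable never go parallel
SELECT n FROM dbo.Numbers;

-- Without help, the optimizer assumes @t contains one row. OPTION (RECOMPILE)
-- lets it see the actual row count at compile time, which can unlock a
-- parallel plan for the SELECT; Trace Flag 2453 has a similar effect globally.
SELECT t.id, o.Total
FROM @t AS t
JOIN dbo.Orders AS o ON o.Id = t.id
OPTION (RECOMPILE);
```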
When people think about Heaps and the problems they can cause, they don’t often think of temp tables and table variables as Heaps.
Of course, without a clustered index, any table is a Heap.
This isn’t an argument for or against indexing temp tables, but while working with a client we came across something strange!
sp_BlitzFirst was reporting hundreds of thousands of Forwarded Records, but there were no Heaps in user databases.
When we dug in closer to what queries were doing, we found lots of places where temp tables had an insert/update pattern.
Click through for a demonstration and an explanation of why this can be trouble.
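A sketch of the insert/update pattern in question (table and source names are made up): rows are inserted narrow, then a later update widens a variable-length column so the row no longer fits on its page. In a heap, the row moves and leaves a forwarding pointer behind.

```sql
CREATE TABLE #Orders (Id int, Notes varchar(8000));  -- no clustered index: a heap

INSERT INTO #Orders (Id, Notes)
SELECT n, NULL                                       -- insert with NULL/narrow values
FROM dbo.Numbers;                                    -- assumed numbers table

UPDATE #Orders
SET Notes = REPLICATE('x', 2000);                    -- widening update => forwarded records

-- Inspect the damage:
SELECT forwarded_record_count
FROM sys.dm_db_index_physical_stats(
        DB_ID('tempdb'), OBJECT_ID('tempdb..#Orders'), 0, NULL, 'DETAILED');
```

Creating the temp table with a clustered index avoids the heap, and with it the forwarding-pointer behavior.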
While temp tables are a good option for in-flight data transformation, there are some unique challenges that arise when using temp tables in SSIS.
SQL Server Integration Services uses tight metadata binding for data flow operations. This means that when you connect to a relational database, flat file, or other structure in an SSIS data flow, the SSIS design-time and runtime tools will check those data connections to validate that they exist and that the metadata has not changed. This tight binding is by design, to avoid potential runtime issues arising from unexpected changes to the source or destination metadata.
Because of this metadata validation process, temp tables present a challenge to the SSIS data flow. Since temp tables exist only for the duration of the session(s) using them, it is likely that one of these tables created in a previous step in an SSIS package may not be present when validation needs to occur. During the design of the package (or even worse, when you execute the deployed package in a scheduled process), you could find yourself staring at an “object not found” error message.
It’s good to have alternatives, though there are times when you really just need a temp table.
You should be able to create a #temp in every session. That’s the idea, right? It’s one of the things that differentiates a global temp table from a local temp table. But there can be some difficulties with that.
If you are working with reusable code that uses temp tables (a stored procedure, for example), sometimes you need to create a constraint. The thing about constraints is that their names must be unique, just like the names of tables, stored procedures, and so on: a given constraint name can only be used once. You can’t have two tables with the same constraint name. In fact, you can’t even have a constraint whose name matches that of a table or stored procedure.
There’s some solid advice in this post.
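A quick illustration of the collision (names here are hypothetical): all sessions create their temp tables in tempdb, so a named constraint from one session blocks the same name in another.

```sql
-- Two sessions running this concurrently will collide: the second one
-- fails because constraint name PK_t already exists in tempdb.
CREATE TABLE #t
(
    Id int CONSTRAINT PK_t PRIMARY KEY
);

-- Safer: leave the constraint unnamed and let SQL Server generate
-- a unique name per session.
CREATE TABLE #t2
(
    Id int PRIMARY KEY
);
```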
I was recently reviewing a newly created T-SQL stored procedure. This procedure was verifying temporary table existence with the following code:
IF EXISTS (SELECT * FROM tempdb..sysobjects WHERE name LIKE '#fg%') DROP TABLE #fg
Seeing this takes me back to one of my favorite presentations, where I compare Temporary Tables and Table Variables. In this, I go over several of the methods that I have seen for how code found on the internet actually does this task… and I show why they are all doing it wrong.
Read on to understand why this isn’t the correct answer.
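For reference, the usual correct pattern: the LIKE-based check above can match another session’s copy of the table (temp table names in tempdb carry session-specific suffixes), while OBJECT_ID resolves the temp table for the current session only.

```sql
-- Session-safe existence check.
IF OBJECT_ID('tempdb..#fg') IS NOT NULL
    DROP TABLE #fg;

-- On SQL Server 2016 and later, this does the same in one statement.
DROP TABLE IF EXISTS #fg;
```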