To test, I created a new SSIS project named “DeploymentTest1” and added three simple SSIS packages. I right-clicked Package3.dtsx and clicked Deploy Package. The Integration Services Deployment Wizard started, as expected. Since I was deploying to a pristine Catalog, I created a new Catalog Folder named “Deployment”. There were no projects in my catalog, and I was curious how the Deployment Wizard would handle this.
Andy notes that this works, but you might want to stick to the tried-and-true method of deploying entire projects and naming your catalog project the same as your SSDT project.
When you use the Azure portal to create a SQL Database, the pricing options break down into three service tiers: Basic, Standard, and Premium. Here are those three tiers with their high-availability (HA) and disaster recovery (DR) options:
Considering the price point, Microsoft offers some pretty good HA & DR capabilities for Azure SQL Databases.
When a stored procedure that accesses an external temp table is compiled, SQL Server has no guarantee that the next time the procedure is called, it will be called from the same connection. If it is called from a different connection, the temp table it accesses might contain significantly more (or less) data, making a different execution plan preferable.
A simple way to deal with this situation is to force a recompilation every time a procedure that works with external temporary tables is executed. SQL Server does not go that route. Instead, SQL Server caches the procedure once for each connection. That can save a significant amount of CPU resources when the procedure in question is called within a loop.
Try to avoid using external temp tables. There are some cases in which they're a very useful construct, but they come with the plan-caching caveats described above.
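As a sketch of the trade-off (the table and procedure names here are hypothetical, not from the post), an "external" temp table is one the caller creates before invoking the procedure:

```sql
-- Hypothetical sketch: the caller creates the temp table, so to the
-- procedure below it is an "external" temp table.
CREATE TABLE #Orders (OrderID int PRIMARY KEY, Amount money);
GO
CREATE PROCEDURE dbo.SummarizeOrders
AS
BEGIN
    -- SQL Server caches one plan per connection for this procedure.
    -- Adding OPTION (RECOMPILE) to the statement (or WITH RECOMPILE on
    -- the procedure) would force a fresh plan on every call instead --
    -- safer when row counts vary wildly between connections, but costly
    -- if the procedure is called inside a loop.
    SELECT COUNT(*) AS OrderCount, SUM(Amount) AS Total
    FROM #Orders;
END;
```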
Michelle’s code uses INSERT…EXEC to populate a temporary table with the VLF info, and the addition of this extra column breaks the original script. Glenn’s versions of the scripts handle this issue easily since they are version-specific – in the SQL 2012/2014/2016 versions of the script, the temp table declaration is modified to include the extra RecoveryUnitID column, which allows the rest of the script to function as designed.
My problem: I wanted a single version of the script that could be used across versions 2005 and up, and the extra column made that tricky. At first I tried adding an IF…ELSE block to the start of the script to handle the differing CREATE TABLE statements:
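That first attempt might have looked something like this sketch (table name and column list assumed, not taken from the post). It runs into trouble because SQL Server rejects a second CREATE TABLE statement for the same temp table name within one batch, even though only one branch can ever execute:

```sql
-- Hypothetical reconstruction of the version-branching attempt.
-- DBCC LOGINFO gained a RecoveryUnitID column in SQL Server 2012 (v11).
DECLARE @MajorVersion int;
SET @MajorVersion =
    CAST(PARSENAME(CAST(SERVERPROPERTY('ProductVersion') AS varchar(20)), 4) AS int);

IF @MajorVersion >= 11
    CREATE TABLE #VLFInfo (
        RecoveryUnitID int,
        FileID int, FileSize bigint, StartOffset bigint,
        FSeqNo int, [Status] int, Parity tinyint, CreateLSN numeric(25, 0));
ELSE
    CREATE TABLE #VLFInfo (
        FileID int, FileSize bigint, StartOffset bigint,
        FSeqNo int, [Status] int, Parity tinyint, CreateLSN numeric(25, 0));
-- Error: the duplicate CREATE TABLE for #VLFInfo is rejected even
-- though the two branches are mutually exclusive.
```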
This is a good example of working around a problem rather than simply giving up.
The OPEN statement is missing from sys.dm_exec_query_stats. I want to demonstrate that.
There are a few workarounds, and Michael even provides us a handy table, so read the whole thing.
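Assuming this refers to cursor OPEN statements, a quick way to see the gap for yourself (the cursor name here is made up) is to run a cursor and then search the plan cache for its statements:

```sql
-- Run a trivial cursor so its statements have a chance to land in cache.
DECLARE demo_cursor CURSOR FOR
    SELECT name FROM sys.objects;
OPEN demo_cursor;            -- the work of populating the cursor happens here
FETCH NEXT FROM demo_cursor;
CLOSE demo_cursor;
DEALLOCATE demo_cursor;

-- Look for the statements above in the query stats DMV; the OPEN
-- typically does not appear as its own row.
SELECT
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text,
    qs.execution_count
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%demo_cursor%';
```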
Why would you want to do this? In my case I have a PowerShell function that accepts an input object, converts it to a DataSet (a .NET representation of tables and the links between them), and bulk copies the data into SQL Server. However, because the input objects can vary slightly, with additional tables or additional columns, I use the flexibility of SMO to create the tables and columns on the fly.
There’s a lot of power in programmatically defining objects. I’m not sure it should be your default approach, but there are times when it comes in handy.