Press "Enter" to skip to content


Single-Package Deployments In SSIS 2016

Andy Leonard looks at what happens when your first SSIS project deployment is a single-package deployment:

To test, I created a new SSIS project named “DeploymentTest1” and added three simple SSIS packages. I right-clicked Package3.dtsx and clicked Deploy Package. The Integration Services Deployment Wizard started, as expected. Since I was deploying to a pristine Catalog, I created a new Catalog Folder named “Deployment”. There were no projects in my catalog, and I was curious how the Deployment Wizard would handle this.

Andy notes that this works, but you might want to stick to the tried-and-true method of deploying entire projects and naming your catalog project the same as your SSDT project.
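
If you try this yourself, the SSISDB catalog views are a quick way to confirm what the wizard actually registered. A minimal sketch, assuming Andy's folder and project names (swap in your own):

-- List what landed in the catalog after the single-package deployment.
SELECT f.name AS folder_name,
       pr.name AS project_name,
       pk.name AS package_name,
       pr.last_deployed_time
FROM SSISDB.catalog.folders AS f
    INNER JOIN SSISDB.catalog.projects AS pr
        ON pr.folder_id = f.folder_id
    INNER JOIN SSISDB.catalog.packages AS pk
        ON pk.project_id = pr.project_id
WHERE f.name = N'Deployment';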


HA/DR With Azure SQL Database

James Serra looks into high availability and disaster recovery options for Azure SQL Database:

When you use the Azure portal to create a SQL Database, the various plans under the pricing tier include three service tiers: Basic, Standard, and Premium.  Here are those three plans with their high-availability (HA) and disaster recovery (DR) options:

Considering the price point, Microsoft offers some pretty good HA & DR capabilities for Azure SQL Databases.
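
As a concrete example of the Premium-tier DR option, active geo-replication can be configured in T-SQL as well as through the portal. A minimal sketch with placeholder database and server names; this runs against the master database on the primary server:

-- Add a readable geo-replicated secondary for MyDatabase.
-- "MyDatabase" and "partnerserver" are placeholders for your own names.
ALTER DATABASE MyDatabase
    ADD SECONDARY ON SERVER partnerserver
    WITH (ALLOW_CONNECTIONS = ALL);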


External Temp Tables And Plan Cache Bloat

Sebastian Meine warns us about using external temporary tables in stored procedures:

When a stored procedure that accesses an external temp table is compiled, SQL Server has no guarantee that the next time this stored procedure is called it is called from the same connection. However, if it is called from a different connection, the accessed temp table might contain significantly more (or less) data, making a different execution plan preferable.

A simple way to deal with this situation is to force a recompilation every time a procedure that works with external temporary tables is executed. SQL Server is not going that route. Instead, SQL Server caches the procedure once for each connection. That can save a significant amount of CPU resources when the procedure in question is called within a loop.

Try to avoid using external temp tables. There are some cases in which they're a very useful construct, but the per-connection plan caching Sebastian describes means they can quietly bloat your plan cache.
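
To make the pattern concrete, here is a minimal sketch of an external temp table; the table and procedure names are made up. If you would rather spend the CPU than risk a stale per-connection plan, WITH RECOMPILE is the blunt instrument:

-- The caller creates the temp table; the procedure merely references it,
-- which makes #Work an "external" temp table from the procedure's viewpoint.
CREATE TABLE #Work (Id int NOT NULL, Payload varchar(100) NULL);
GO

CREATE PROCEDURE dbo.ProcessWork
AS
BEGIN
    -- SQL Server caches a separate plan for this procedure per connection.
    SELECT Id, Payload
    FROM #Work;
END;
GO

INSERT INTO #Work (Id, Payload) VALUES (1, 'test');

EXEC dbo.ProcessWork;                  -- uses the per-connection cached plan
EXEC dbo.ProcessWork WITH RECOMPILE;   -- forces a fresh plan for this call only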


VLFs And Temp Tables

Andy Galbraith was trying to update a script which counts VLFs and ran into problems defining his temp table:

Michelle’s code uses INSERT…EXEC to populate a temporary table with the VLF info, and the addition of this extra column breaks the original script.  Glenn’s versions of the scripts handle this issue easily since they are version-specific – in the SQL 2012/2014/2016 versions of the script, the temp table declaration is modified to include the extra RecoveryUnitID column, which allows the rest of the script to function as designed.

My problem is that I wanted a version of the script that could be used across versions 2005+, and this presented a challenge. At first I tried to add an IF…ELSE block to the start of the script to handle the differing CREATE TABLE statements:

This is a good example of working around a problem rather than simply giving up.
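
To illustrate the underlying issue, here is one way to skin that cat; this is a sketch rather than Andy's actual fix (his post has the real workaround). Declaring the superset of columns once and naming the target columns on the INSERT avoids having two conflicting CREATE TABLE statements in the same batch:

-- Branch on engine version; SQL Server 2012+ adds RecoveryUnitID as the
-- first column of DBCC LOGINFO output.
DECLARE @MajorVersion int;
SET @MajorVersion = CAST(PARSENAME(CAST(SERVERPROPERTY('ProductVersion') AS varchar(32)), 4) AS int);

CREATE TABLE #VLFInfo (
    RecoveryUnitID int NULL,   -- only returned by SQL Server 2012 and later
    FileID int,
    FileSize bigint,
    StartOffset bigint,
    FSeqNo bigint,
    [Status] int,
    Parity int,
    CreateLSN numeric(38, 0));

IF @MajorVersion >= 11   -- 2012+ returns eight columns
    INSERT INTO #VLFInfo
    EXEC ('DBCC LOGINFO');
ELSE                     -- 2005/2008/2008 R2 return seven
    INSERT INTO #VLFInfo (FileID, FileSize, StartOffset, FSeqNo, [Status], Parity, CreateLSN)
    EXEC ('DBCC LOGINFO');

SELECT COUNT(*) AS VLFCount FROM #VLFInfo;
DROP TABLE #VLFInfo;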


Temporal Tables Via SMO

Cody Konior shows how to create temporal tables in SQL Server 2016 using PowerShell and Server Management Objects:

Why would you want to do this? In my case I have a PowerShell function to accept an input object, convert it to a DataSet (a .NET representation of tables and the links between them), and bulk copy the data into SQL Server. However because the input objects can vary slightly with additional tables or additional columns I use the flexibility of SMO to create the tables and columns on the fly.

There’s a lot of power in programmatically defining objects.  I’m not sure that’s a good default scenario, but there are times in which it can come in handy.
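
For reference, the system-versioned table SMO builds up corresponds to T-SQL like the following; the table and column names here are invented:

-- A minimal system-versioned (temporal) table in SQL Server 2016.
CREATE TABLE dbo.Employee (
    EmployeeID int NOT NULL PRIMARY KEY CLUSTERED,
    FullName nvarchar(100) NOT NULL,
    ValidFrom datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.EmployeeHistory));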


T-SQL Tuesday Roundup

Robert Davis hosted T-SQL Tuesday, and he has a nice roundup of posts:

Thanks to everybody that participated in this month’s T-SQL Tuesday. A big thanks to everyone who wrote a participating blog post, and a really huge thanks to everyone who read the posts shared by this month’s participants. If you follow one of the links on this round-up page, I will kindly ask that you leave a comment on a blog post that you read if it teaches you something, gives you a new perspective, or makes you think. A blog post is just a bunch of words until somebody reads it, and blog writers love it when they hear that their post resonated with someone on some level.

Thanks to Robert for hosting T-SQL Tuesday #74.


Trust Relationship Failed

Chrissy LeMaire ran into a trust issue on one node of a cluster:

You know what’s scary as hell? When one node of an important cluster loses its trust relationship with the domain and you see the error “the trust relationship between this workstation and the primary domain failed”. That happened to me late last year with one of my SQL Server 2008 R2 nodes. The scary part was that I just didn’t know what to expect. The fix could be simple, or it could require a node rebuild.

Trust no one.


Setting Up A Scrub Server

Tom Norman talks about setting up a data scrubbing server:

With that said, when I set up a Scrub server, it is in a very secure area where data access is very, very limited. For example, in my current company, the server is in a separate domain from Production and QA/Dev. Only DBAs are allowed to access this server. If you have multiple DBAs at your location, you may even want to limit which DBAs have access to this server. Our goal is to automate the entire scrubbing process so no one has to access the data, including copying backup files from Production and pushing scrubbed backups to a shared location for QA/Dev to retrieve.

Scrub servers are a way of stripping personally identifiable or sensitive information from production data so developers can safely use the data in lower, less secure environments.
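
The scrubbing step itself usually boils down to UPDATE statements which overwrite sensitive columns with synthetic values. A toy sketch against a hypothetical Customer table:

-- Replace PII with deterministic synthetic values before the backup
-- leaves the secure domain. Table and column names are made up.
UPDATE dbo.Customer
SET LastName = CONCAT('Customer', CustomerID),
    Email = CONCAT('customer', CustomerID, '@example.com'),
    Phone = '555-0100';   -- fictional-number range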
