Category: Deployment

Deploying SSDT Scripts

Richie Lee has concerns with database deployments:

At any rate, the script is generated and maybe reviewed… so then what? In SSDT there is no way to create and deploy a script in one step; they are two distinct steps. And even if they were one step, this would still not resolve the issue that troubles me. So what is this issue?

The issue is that by creating a script and then running the deploy, you cannot be sure that the database is in the exact same state it was in when the initial script was generated. If you don’t already know, SSDT runs a deploy entirely in memory, so as mentioned there is no script created. You have to explicitly create the script as part of the process. Or, if you have already created one, you have to re-create the script.

I’m on the fence here.  In simpler environments, I think Richie has a good point.  But in a complex environment, I wouldn’t even use auto-generated deployment scripts; when you’re changing hundreds of database objects (including adding and modifying columns, backfilling data, modifying indexes, etc.), that automated deployment script is almost guaranteed to fail.  And if it does fail, it could leave you in a state of irreparable harm.
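
To see how distinct the two steps are, here’s a minimal sketch using SqlPackage.exe (the install path, server, and file names are placeholders): the Script action writes a script for review, while the separate Publish action re-compares source and target at deploy time, so the actions it takes can differ from the reviewed script if the target has drifted in between.

```powershell
# Assumed DacFx install path; adjust for your version.
$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\130\DAC\bin\SqlPackage.exe"

# Step 1: generate a deployment script for review. No changes are applied.
& $sqlPackage /Action:Script /SourceFile:".\MyDatabase.dacpac" `
    /TargetServerName:"PRODSQL01" /TargetDatabaseName:"MyDatabase" `
    /OutputPath:".\deploy_MyDatabase.sql"

# Step 2: deploy. DacFx compares source and target again, in memory;
# nothing ties this run to the script generated in step 1.
& $sqlPackage /Action:Publish /SourceFile:".\MyDatabase.dacpac" `
    /TargetServerName:"PRODSQL01" /TargetDatabaseName:"MyDatabase"
```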

Deploying SSIS Packages In VS 2015

Neil Gelder notes that you can deploy SSIS packages to different versions of SQL Server from Visual Studio 2015:

For years I’ve dreamt of having one set of tools for developing SSIS packages! Not a lot to ask, really, and a great step towards this from Microsoft was decoupling the development IDE from the main SQL Server install to produce the standalone SSDT (SQL Server Data Tools).

But like most people, I work in an environment which has legacy versions of SQL Server in production; equally, like most tech folk (giddy kids wanting new toys), I always try to use the most current and exciting version of VS. This, however, poses a problem when developing for SSIS: for example, if you developed an SSIS package in VS 2013, you’d not be able to deploy it correctly to a SQL Server 2012 Integration Services catalog. In the past this resulted in having two IDEs installed: SSDT 2012 (VS shell) for any 2012 catalog development, and VS 2013 for other work.

I had one person mention during a talk I gave that this isn’t foolproof, but my experience (limited to SQL Server 2012 and 2014) was that deployment worked fine.  As always, test before making changes.

Dacpac Deployment Models

Ed Elliott discusses publish profiles as part of dacpac deployment scenarios:

For a while I meandered between the two approaches, until the SSDT team announced that they had released a NuGet package with the DacFx in it, and I decided that I would move over to that, as it meant that I no longer had to check the dlls into source control – which in itself is a big win. I also decided to fix the extensions thing, and so figured out a (slightly hacky) way to get the DacFx dlls in the NuGet package to behave like sqlpackage and allow a sub-directory to be used to load dlls – I fixed that using this PowerShell module that wraps a .NET dll (https://the.agilesql.club/blogs/Ed-Elliott/DacFxed-Nugetized-DacFx-Power…). Now I had the problem of not having to check in dlls, and still being able to load contributors without having to install into Program Files, sorted – BUT I still had the problem of lots of command-line args, which I was sharing in PowerShell scripts and passing in some custom bits like server/db names, etc.

I’m not very familiar with dacpacs, so this was an interesting read for me.
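
As a rough sketch of where Ed lands (the paths here are hypothetical, and the dll comes from the DacFx NuGet package he mentions), a publish profile collapses those command-line arguments into a single XML file that DacFx can load directly:

```powershell
# Load DacFx from the NuGet package's lib folder (path is an assumption).
Add-Type -Path ".\packages\Microsoft.SqlServer.DacFx\lib\Microsoft.SqlServer.Dac.dll"

# The .publish.xml profile carries the target connection string, database
# name, and deploy options, so they no longer live in per-script arguments.
$dacProfile = [Microsoft.SqlServer.Dac.DacProfile]::Load(".\MyDatabase.publish.xml")
$dacPackage = [Microsoft.SqlServer.Dac.DacPackage]::Load(".\MyDatabase.dacpac")

$dacServices = New-Object Microsoft.SqlServer.Dac.DacServices($dacProfile.TargetConnectionString)
$dacServices.Deploy($dacPackage, $dacProfile.TargetDatabaseName, $true, $dacProfile.DeployOptions, $null)
```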

Deployment As Process

Ginger Grant argues in favor of deployment processes:

All sorts of things can happen when one person writes and deploys. I know someone who worked in the IT department for a large cell phone company. At the time, working there meant free phone service. One of the devs was a heavy user of the free phone service, and so was his large extended family. His job was to maintain the billing code. After several questionable incidents at work, HR got involved and he was perp-walked out of the building. Due to the circumstances surrounding his departure, his cell phone accounts were checked to ensure that, from that point on, he would get a bill. Although his account showed a number of active phones, his balance was always zero. The code in source control was checked, and there was nothing in it which provided a reason why his bill was zero. Upon further investigation, my friend noticed the version number in production did not match the version number in source control. The code in source control was compiled, and a huge balance appeared for the former employee. If someone else had deployed the code in source control, this chicanery would not have been possible.

This is an interesting read, so go check it out.

Continuous Integration

James Anderson is starting a series on continuous integration:

I had been using source control for years, but it always felt like a tick-box exercise that I was doing because I had to. I had never used it to review old versions to see where code went wrong, or to quickly roll back changes if I decided I no longer wanted to go in a certain direction with the code. I never felt like I was getting anything back from using source control. Sometimes it takes a problem to arise for you to see the value of a solution.

In 2015 I started to inherit the code base for our internal maintenance database, the UtilityDB. This database is used to store performance metrics and to manage tasks such as index maintenance and backups. This database is installed on all of our instances.

This first post is an introduction to the series, and it looks like he’ll cover some heady topics.

Deploying Dacpacs

Richie Lee shows two different methods of automating Dacpac deployment:

DacFx, or to give it its full title, the Data-tier Application Framework, “is a component which provides application lifecycle services for database development and management for Microsoft SQL Server and Microsoft Azure SQL Databases”. Essentially, it is another method we can use to manage our Dacpacs. However, instead of using the external process SQLPackage and initiating it via cmdline, you can use C# or PowerShell to manage Dacpacs. In fact, SQLPackage uses “Microsoft.SqlServer.Dac.dll” itself. You can verify this by going and deleting the dll and trying to run sqlpackage via command line… or you can just take my word for it.

Read on for the PowerShell script Richie uses.
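
As a bare-bones sketch of the DacFx route (the dll path, connection string, and database name are all placeholders), loading Microsoft.SqlServer.Dac.dll and calling DacServices.Deploy gets the same result as a SqlPackage publish:

```powershell
# The same dll SqlPackage.exe relies on; path is an assumption.
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\130\DAC\bin\Microsoft.SqlServer.Dac.dll"

$dacServices = New-Object Microsoft.SqlServer.Dac.DacServices("Server=localhost;Integrated Security=True")
$dacPackage  = [Microsoft.SqlServer.Dac.DacPackage]::Load(".\MyDatabase.dacpac")

# Echo the progress messages SqlPackage.exe would normally print.
$dacServices.add_Message({ param($sender, $e) Write-Host $e.Message })

# upgradeExisting = $true; default deploy options; no cancellation token.
$dacServices.Deploy($dacPackage, "MyDatabase", $true, $null, $null)
```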

Compile SQL Project Files Using MSBuild

Richie Lee shows how to use MSBuild to deploy a SQL Server database project:

What is happening is that, when a project is built in Visual Studio, some targets external to the whole process that were installed as part of Visual Studio are used as part of the build. Obviously, when we run MSBuild via cmdline, we are not setting the parameter “VisualStudioVersion”, because this is pretty much a “headless” build. So the sqlproj file handles this by setting the parameter to 11.0, which is pretty horrible. Given that projects are created in Visual Studio, I would’ve felt that the version a project was created in would be baked in as the default, as opposed to just some random version number. At the very least, a warning that a default version number has been set would come in useful, especially as this is 2016 now and the sqlproj files default to 2010 targets.

Fortunately, Richie has the answer to this issue.
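
In short, the workaround is to pin VisualStudioVersion yourself whenever a .sqlproj is built headlessly; a minimal sketch, assuming a VS 2015 install:

```powershell
# MSBuild path for Visual Studio 2015; adjust for your install.
$msbuild = "C:\Program Files (x86)\MSBuild\14.0\Bin\MSBuild.exe"

# Without this property, a headless build of a .sqlproj silently falls back
# to VisualStudioVersion=11.0 and picks up the wrong SSDT targets.
& $msbuild ".\MyDatabase.sqlproj" /t:Build /p:VisualStudioVersion=14.0
```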

Automating SSAS Deployments

Matt Smith has introduced SQL Server Analysis Services deployments to Octopus Deploy:

The only thing missing was SSAS. After watching Chris Webb’s video tutorial – Cube Deployment, Processing and Admin on Project Botticelli – I decided I had to use Microsoft.AnalysisServices.Deployment.exe. After a bit of scripting and testing, I managed to write a PowerShell script that updates the XML config files for the deployment – it sets the ProcessingOption to ‘DoNotProcess’. It updates the data source – where the cube will refresh the data from. The script isn’t perfect. For starters, what if you have more than one data source? Also, what if you’re not using SQL Server 2014? Still, the great thing about open source is that others can update it. Anyone can improve it; it’s not reliant on me having free time. So hopefully, by the time we move to SQL 2016, someone will have already updated it to work with SQL 2016.

A big part of product maturation is automated deployment.  Good on Matt for introducing that to the community.
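
To give a feel for the moving parts (the exe location and file names are placeholders; Matt’s actual script is linked in his post), the approach amounts to editing the deployment wizard’s XML answer files and then running the deployment utility silently:

```powershell
# The deployment utility ships with Management Studio; path is an assumption.
$deployExe = "C:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe"
$optionsFile = ".\bin\MyCube.deploymentoptions"

# Set ProcessingOption to DoNotProcess so the deploy doesn't also process the cube.
[xml]$options = Get-Content $optionsFile
$options.DeploymentOptions.ProcessingOption = "DoNotProcess"
$options.Save((Resolve-Path $optionsFile).Path)

# /s runs silently, logging to the given file; the sibling .configsettings
# file is where the data source connection would be rewritten.
& $deployExe ".\bin\MyCube.asdatabase" /s:".\ssas_deploy.log"
```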

Think About Installation

Dave Mason implores us to think of the installation:

Every T-SQL command in your SQL script(s) has the potential to fail. It’s important to catch and handle T-SQL errors so that they don’t cause the entire installation to fail. This will require a lot of defensive, resilient, fault-tolerant coding on your part. Here’s an example for creating the database. Note the emphasis on permissions, which I touched on in another post.

This is important advice if you send installation scripts to customers (even if you’re using a packager to generate an install EXE).
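
Dave’s example is T-SQL; as a complementary sketch (not Dave’s code; script names and server are placeholders, and it requires the SQL Server PowerShell module), an installer wrapper that halts on the first failed script keeps one error from cascading through the rest of the install:

```powershell
# Run the installation scripts in order; stop at the first failure rather
# than letting later scripts run against a half-built database.
$scripts = @(".\01_create_database.sql", ".\02_schema.sql", ".\03_seed_data.sql")

foreach ($script in $scripts) {
    try {
        # -ErrorAction Stop promotes SQL errors to terminating exceptions.
        Invoke-Sqlcmd -ServerInstance "localhost" -InputFile $script -ErrorAction Stop
    }
    catch {
        Write-Error "Installation halted: $script failed. $_"
        break
    }
}
```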
