
Category: Deployment

Creating a SQL Server 2022 dacpac

Kevin Chant gets an upgrade:

In this post I want to cover how you can create a dacpac for SQL Server 2022 databases using sqlpackage, so that you keep the new SQL Server 2022 compatibility level when you deploy new databases.

Just to clarify, a dacpac file is a special type of file which contains details about SQL Server database objects, and which you can use to deploy database updates to other SQL Server databases.

Read on for initial thoughts, a post-upgrade experience, and more.
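For reference, here is a minimal sketch of the kind of sqlpackage calls involved (server, database, and file names are placeholders, and the exact options Kevin uses may differ). Extract captures the source database's model, including its compatibility level, in the dacpac:

```powershell
# Extract a dacpac from an existing SQL Server 2022 database
sqlpackage /Action:Extract `
    /SourceServerName:"Dev2022Server" `
    /SourceDatabaseName:"MyDatabase" `
    /TargetFile:"C:\dacpacs\MyDatabase.dacpac"

# Publish the dacpac to create or update a database elsewhere
sqlpackage /Action:Publish `
    /SourceFile:"C:\dacpacs\MyDatabase.dacpac" `
    /TargetServerName:"AnotherServer" `
    /TargetDatabaseName:"MyNewDatabase"
```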


Microsoft.Build.Sql for Database Projects

Drew Skwiers-Koballa announces a new way of handling database projects:

Declarative development creates an environment where developers can focus on creating database objects while relying on the support of tooling, locally and in deployment pipelines, to manage applying the differential changes calculated against the current state of the target database. Developers create objects such as tables or stored procedures by writing their definitions with CREATE statements in scripts that live in source control, just as if they were source code for any component of an application. Existing functionality for SQL projects in Visual Studio, Azure Data Studio, and VS Code provides developers with declarative development capabilities; however, the existing SQL project file format has a few limitations. With Microsoft.Build.Sql and SDK-style SQL projects, we look forward to unlocking new scenarios for your development practices.

It does sound interesting.
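As a rough illustration of the new format, an SDK-style SQL project file can be this small (the SDK version below is a placeholder; check the current Microsoft.Build.Sql release):

```xml
<?xml version="1.0" encoding="utf-8"?>
<Project Sdk="Microsoft.Build.Sql/0.1.9-preview">
  <PropertyGroup>
    <Name>MyDatabaseProject</Name>
    <!-- Target platform: SQL Server 2022 (compatibility level 160) -->
    <DSP>Microsoft.Data.Tools.Schema.Sql.Sql160DatabaseSchemaProvider</DSP>
  </PropertyGroup>
  <!-- .sql files under the project folder are picked up by default,
       so there is no per-file item bookkeeping -->
</Project>
```

Building such a project (for example with dotnet build) produces a dacpac you can deploy with sqlpackage.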


Logic Apps: Source Control and Deployment

Koen Verbeeck has a two-parter. First up is storing Logic App code in source control:

At a data warehouse project I’m using a couple of Logic Apps to do some lightweight data movements. For example: reading a SharePoint list and dumping the contents into a SQL Server table, or reading CSV files from a OneDrive directory and putting them in Blob storage. Some of those things can be done in Azure Data Factory as well, but it’s easier and cheaper to do them with Logic Apps.

Logic Apps are essentially JSON code behind the scenes, so they should be included into the source control system of your choice (for the remainder of the blog post we’re going to assume this is git).
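To give a feel for what actually lands in git, a Logic App workflow definition is JSON along these lines (heavily trimmed, with hypothetical trigger and action names):

```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {},
    "triggers": {
      "Daily_run": {
        "type": "Recurrence",
        "recurrence": { "frequency": "Day", "interval": 1 }
      }
    },
    "actions": {
      "Get_SharePoint_items": {
        "type": "ApiConnection",
        "inputs": { "host": { "connection": { "name": "@parameters('$connections')['sharepointonline']['connectionId']" } } }
      }
    },
    "outputs": {}
  }
}
```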

The second post covers deployment:

It’s easy to duplicate an Azure Logic App in a resource group, but unfortunately you cannot duplicate a Logic App between environments (you might try to copy-paste the JSON, though). So unless you want to hand craft every Logic App yourself on each of your environments, you need a way to automatically deploy your Logic Apps. It’s easier, faster, and less error-prone than any manual method.
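As a hedged sketch of that automation, if the Logic App has been exported as an ARM template, deploying it to another environment can be a single CLI call (resource-group and file names are placeholders):

```bash
# Deploy the Logic App template to the target environment,
# supplying environment-specific values via a parameters file
az deployment group create \
  --resource-group rg-dataplatform-test \
  --template-file logicapp-template.json \
  --parameters @logicapp-parameters-test.json
```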

Check out both posts.


Database Code Reviews: a Process

Kenneth Fisher reviews some code:

I’ll be honest, ever since I did a SQL Homework about doing code reviews I’ve wanted to do a blog post about them. Recently Emily Krager (TikTok | Twitter) did a TikTok about code review suggestions, which seemed like a good excuse for me to do this. If you don’t follow her, I recommend it: she does a great job of combining humor and technology and is just a lot of fun to listen to. Here is her list, as best I was able to transcribe it.

Click through for Kenneth’s thoughts on the topic.


Automating Power BI Data Model Metadata Extraction

Gerhard Brueckl avoids manual processes:

In the past I have been working on a lot of different Power BI projects and it has always been (and still is) a pain when it comes to the deployment of changes across multiple tiers (e.g. Dev/Test/Prod). The main problem here is that a file generated in Power BI Desktop (.pbix) is basically a binary file and the metadata of the actual data model (BIM) cannot be easily extracted or derived. This causes a lot of problems upstream when you want to automate the deployment using CI/CD pipelines. Here are some common approaches to tackle these issues:

Click through to see several bad to palatable options and then check out Gerhard’s solution, which is significantly better. CI/CD is a huge pain point for Power BI developers but people like Gerhard are doing what they can to help.
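Gerhard's post has the details of his solution; as one illustration of the general idea (not necessarily his approach), the open-source pbi-tools CLI can serialize a .pbix file's model and report metadata into individual files that diff nicely in git:

```powershell
# Extract the contents of a .pbix, including the data model definition,
# into a folder structure suitable for source control
pbi-tools extract .\MyReport.pbix
```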


Temporal Tables and Azure DevOps Deployments

Rayis Imayev notes a problem with Azure DevOps deployments:

Here is one thing that still doesn’t work well when you try to alter an existing temporal table and run this change through the [SqlAzureDacpacDeployment@1] DevOps task, whether this change is to add a new column or modify existing attributes within the table. Your deployment will fail with the “This deployment may encounter errors during execution because changes to … are blocked by …’s dependency in the target database” error message.

Read on to see what causes this problem and what we can do to work around it.
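For background, outside of dacpac deployments the usual pattern for changing a system-versioned table looks like this (table and column names are hypothetical; see Rayis's post for the actual DevOps-side fix):

```sql
-- Temporarily detach the history table so the schema change can proceed
ALTER TABLE dbo.Customer SET (SYSTEM_VERSIONING = OFF);

-- Apply the matching change to both the current and the history table
ALTER TABLE dbo.Customer ADD LoyaltyTier TINYINT NULL;
ALTER TABLE dbo.CustomerHistory ADD LoyaltyTier TINYINT NULL;

-- Re-attach the same history table
ALTER TABLE dbo.Customer SET (SYSTEM_VERSIONING = ON
    (HISTORY_TABLE = dbo.CustomerHistory, DATA_CONSISTENCY_CHECK = ON));
```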


Automating Pipeline Migration to Synapse via Azure DevOps

Kevin Chant deploys some Synapse pipelines:

In this post I want to cover how you can automate a pipeline migration to a Synapse workspace using Azure DevOps, as a follow-up to a previous post I did about one way to copy an Azure Data Factory pipeline to Synapse Studio.

Even though that post is good, it deserves a follow-up showing an automated way of doing it. I wanted to show that it can be done more gracefully.

And we all want to be graceful, right?
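As a rough sketch of one way to script such a migration in an Azure DevOps pipeline (service connection, workspace, and file names are placeholders, and this is not necessarily Kevin's method):

```yaml
steps:
  - task: AzureCLI@2
    displayName: 'Deploy pipeline definition to Synapse'
    inputs:
      azureSubscription: 'my-service-connection'   # hypothetical service connection
      scriptType: 'pscore'
      scriptLocation: 'inlineScript'
      inlineScript: |
        az synapse pipeline create `
          --workspace-name my-synapse-workspace `
          --name CopyFromADF `
          --file '@pipelines/CopyFromADF.json'
```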


SSIS Framework File Community Edition

Andy Leonard has an announcement:

The very first data integration / data engineering framework I ever wrote was for Data Transformation Services, or DTS. The DTS framework had one job: manage connections. I don’t recall all the details, but I remember DTS included a task that allowed packages to retrieve settings from INI files. INI files are key-value files, so I simply added entries with identical keys and different values – values that matched connection strings for each lifecycle tier – and placed each version of the INI file in the same location on every server in the lifecycle.

The next framework I wrote was for SSIS. I stored metadata in tables – including connections metadata – and created a concept I called an SSIS Application. An SSIS application is, according to my definition, a “collection of SSIS packages that execute in a pre-determined order.”

The SSIS Framework File Community Edition is very similar to this first framework, except for the connections management.
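To make the INI trick Andy describes concrete, each server in the lifecycle might hold a file like this at the same path, with identical keys but environment-specific values (the names here are hypothetical):

```ini
; C:\Framework\Connections.ini as it appears on the Dev server
[Connections]
WarehouseConnection=Provider=SQLOLEDB.1;Data Source=DevSQL01;Initial Catalog=DW;Integrated Security=SSPI
```

The Test and Prod copies carry the same WarehouseConnection key pointing at their own instances, so a package resolves the right connection wherever it runs.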

Click through to learn more about the SSIS Framework File Community Edition and check it out.


Using Azure DevOps to Deploy Python Functions to Azure Function Apps

Rayis Imayev has a trick question for us:

Can I create a CI/CD pipeline to deploy a Python function to an Azure Function App using a Windows self-hosted Azure DevOps agent?

My short answer to this question is Yes and No. Yes, you can use a Windows self-hosted Azure DevOps agent to deploy a Python function to a Linux-based Azure Function App; and, no, you can’t use a Windows self-hosted Azure DevOps agent to build Python code, since that requires collecting/compiling/building all Python-dependent libraries on a Linux OS platform.

Click through for the full answer.
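A minimal sketch of the "Yes" half (app and service connection names are placeholders, and this is not necessarily Rayis's pipeline): build on a Microsoft-hosted Linux image, then deploy the resulting package.

```yaml
pool:
  vmImage: 'ubuntu-latest'   # Python dependencies must be built on Linux

steps:
  # Restore dependencies into the folder layout Azure Functions expects
  - bash: pip install --target=".python_packages/lib/site-packages" -r requirements.txt

  - task: ArchiveFiles@2
    inputs:
      rootFolderOrFile: '$(System.DefaultWorkingDirectory)'
      includeRootFolder: false
      archiveFile: '$(Build.ArtifactStagingDirectory)/functionapp.zip'

  - task: AzureFunctionApp@1
    inputs:
      azureSubscription: 'my-service-connection'   # hypothetical
      appType: 'functionAppLinux'
      appName: 'my-python-function-app'            # hypothetical
      package: '$(Build.ArtifactStagingDirectory)/functionapp.zip'
```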


Building an SSMS Database Solution

Andy Leonard has a four-parter for us on database solutions in SQL Server Management Studio. Part one provides an introduction:

I like Microsoft Visual Studio a lot. I know some members of the team that developed Visual Studio, and they are scary-smart individuals who have forgotten more about developing software than I will ever know.

For some reason, I am not fond of SQL Server projects in Visual Studio. I believe the reason is that I am not familiar with the template. Please note I used the word fond intentionally. It’s an emotion. In this case, it’s all about me. I believe my emotion would change if I took the time to learn more about the Visual Studio SQL Server project template.

I continue to attempt to learn VS database projects. In the meantime, I prefer SQL Server Management Studio solutions.

Part two shows how to add a new query:

One solution is to add instrumentation to T-SQL scripts. I personally like to write T-SQL scripts that are idempotent (a fancy way to describe “re-executable with the same results”). One way to write idempotent T-SQL (sketched after this list) is:

1. First check for the current state

2. Provide feedback (instrumentation) on the status

3. Provide more feedback on actions driven by the status (yep, more instrumentation)
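A minimal sketch of that pattern, using a hypothetical dbo.Widget table rather than anything from Andy's posts:

```sql
IF NOT EXISTS (SELECT 1
               FROM sys.tables AS t
               INNER JOIN sys.schemas AS s ON t.[schema_id] = s.[schema_id]
               WHERE s.[name] = N'dbo' AND t.[name] = N'Widget')
BEGIN
    PRINT 'Creating dbo.Widget...';
    CREATE TABLE dbo.Widget
    (
        WidgetId INT IDENTITY(1, 1) CONSTRAINT PK_Widget PRIMARY KEY,
        WidgetName NVARCHAR(100) NOT NULL
    );
    PRINT 'dbo.Widget created.';
END
ELSE
    PRINT 'dbo.Widget already exists.';
```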

Part three includes tables and views in the mix:

Click the “New Query” button in SSMS and add the following T-SQL:

Part four includes stored procedures:

Note the DDL to manage stored procedures is very similar to the DDL for managing views.

If all goes according to plan, the first execution of the s.i DDL T-SQL statement should generate the following messages:

Andy also shows how to use SQLCMD to create a proper deployment script.
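As a hedged illustration of that last piece (file names are hypothetical), a deployment script run in SQLCMD mode can stitch the individual scripts together in order with the :r command:

```sql
-- Deploy.sql: execute in SQLCMD mode
:r .\01-Schemas.sql
:r .\02-Tables.sql
:r .\03-Views.sql
:r .\04-StoredProcedures.sql
```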
