
Category: Deployment

Executing GitHub Actions via CLI

Kevin Chant uses the GitHub CLI:

In this post I want to share some advice about using GitHub CLI with GitHub Actions for Data Platform deployments, because I showed this at SQLDay last week.

For those who were not aware, there is a GitHub CLI you can use from the command line. You can download GitHub CLI from here.

Anyway, GitHub CLI was recently updated to support commands for GitHub Actions. GitHub Actions is the CI/CD mechanism that is now available in GitHub, which I have covered in a few other posts, including the one you can find by clicking here.
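
For a flavour of what this looks like in practice, here is a minimal sketch of driving Actions from the shell with gh; the workflow file name is a placeholder rather than anything from Kevin's demo:

```sh
# Authenticate once, then work with Actions from the command line
gh auth login

# List the workflows defined in the current repository
gh workflow list

# Trigger a workflow_dispatch run of a (hypothetical) deployment workflow
gh workflow run deploy-database.yml --ref main

# Check recent runs and stream the latest one until it finishes
gh run list --workflow=deploy-database.yml
gh run watch
```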

Click through to learn more.

Comments closed

Using Database Projects for Declarative Database Development

Haroon Ashraf explains the principles behind database projects and demonstrates their use:

This article is all about declarative database development using Azure Data Studio for both beginners and professionals who are new to it.

Additionally, some professional tips related to the topic are shared. Reading this article should also give you a fair understanding of the importance of declarative database development over its counterparts.

Conceptually, I love it. Focusing on the end state is easier to understand. The problem I run into is that the tooling for generating change scripts is not great. It works for trivial database sizes, but as soon as you start talking dozens or hundreds of gigabytes of data, database projects have a tendency to make rather drastic changes which require rebuilding the table, when they could (with a bit of human smarts) perform an action which is much less disruptive. So in the end, you still end up needing to create change scripts.
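
To make the contrast concrete, here is a small, hypothetical example (the table and column names are mine, not from the article): the declarative model only states the desired end state, while a hand-written change script describes the transition, which can be far gentler than the table rebuild a comparison tool might generate.

```sql
-- Declarative model: the database project stores only the desired end state.
CREATE TABLE dbo.Customer
(
    CustomerID   INT           NOT NULL PRIMARY KEY,
    CustomerName NVARCHAR(200) NOT NULL,
    CreatedDate  DATETIME2(0)  NOT NULL   -- column newly added to the model
);

-- Imperative alternative: a hand-written change script states the transition,
-- often less disruptively than a generated script that rebuilds the table.
ALTER TABLE dbo.Customer
    ADD CreatedDate DATETIME2(0) NOT NULL
        CONSTRAINT DF_Customer_CreatedDate DEFAULT SYSUTCDATETIME();
```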

Comments closed

Deploying from One Source to Multiple SQL Servers with GitHub Actions

Kevin Chant demystifies GitHub Actions:

In this post I want to share how to deploy from one source to multiple SQL Server database types using GitHub Actions, because I did a demo of it at Data Saturday Redmond last weekend.

By the end of this post, you will know more about how to do this using GitHub Actions. If you are used to Azure DevOps, you will find this an interesting comparison.

Previously I did a post about how you can do this using Azure DevOps. You can read that post in detail here. Later in this post I also mention an older post here a couple of times, so it's worth keeping that open.
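
As a rough idea of the shape such a pipeline can take (a hedged sketch, not Kevin's actual workflow), a job matrix can fan a previously built dacpac out to several targets; the secret names, artifact name, and dacpac file below are placeholders, and sqlpackage is assumed to be available on the runner:

```yaml
name: deploy-to-multiple-sql-servers

on:
  workflow_dispatch:

jobs:
  deploy:
    runs-on: windows-latest
    strategy:
      matrix:
        # Each entry names a repository secret holding a connection string
        target: [AZURE_SQL_CONNECTION, MANAGED_INSTANCE_CONNECTION, ON_PREM_CONNECTION]
    steps:
      - name: Download the dacpac produced by an earlier build job
        uses: actions/download-artifact@v4
        with:
          name: database-dacpac

      - name: Publish the dacpac to this matrix target
        run: >
          sqlpackage /Action:Publish
          /SourceFile:"MyDatabase.dacpac"
          /TargetConnectionString:"${{ secrets[matrix.target] }}"
```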

Read on to learn how.

Comments closed

Granular Deployment of Power BI Changes with ALM Toolkit

Gilbert Quevauvilliers shows off an interesting scenario:

In this blog post I am going to demonstrate how to make a granular deployment where I will create a new column in my City table, and only deploy those changes.

What this means is that by deploying only the column change to my PPU dataset, I am only updating the column in the table.

This saves me from the following tasks I previously had to do:

– Time taken to refresh the PBIX file so that the data is up to date.
– Re-uploading my PBIX.
– If configured, re-creating the incremental refresh.
– Time and effort to upload and wait for dataset refresh.
– Quick updates to my dataset.

I will not have to worry about saving my PBIX file and, if configured, re-creating the incremental refresh. This saves me a lot of time and effort.

Click through to see those steps in action.

Comments closed

Error Messages on SSDT Database Project Deployments

Chris Johnson has some advice if you’re hitting an error when deploying a SQL Server Data Tools database project:

Today I’d like to talk about three error messages you might see when deploying an SSDT database project, either through Visual Studio or via a dacpac and script. I’m going to focus here on what you see from inside of Visual Studio, but you will see similar errors returned when you deploy using a script and the reasons behind them will be the same.

Read on for Chris’s findings. These errors definitely aren’t a complete survey of possible messages, but they do hit some of the less obvious cases.

Comments closed

Deploying Azure Data Services via Terraform

Chris Adkin has two additional parts of a series. Part 3 shows us how to deploy a virtual machine on VMware:

To do this you require an Ubuntu virtual machine; I've tested this with Ubuntu 18.04 LTS and I will get around to testing it with Ubuntu 20.10 at some stage. If for example the virtual machine was created with a user called azuser, the deployment server should also have an azuser account under which all Terraform commands are executed.

Part 4 takes those VMs and sets up a Kubernetes cluster across them:

Whatever you do when deploying a Kubernetes cluster, somewhere along the line you have to use kubeadm. There is a wealth of material available in blog posts and on the internet in general in which people roll their own scripts using kubeadm. I often suspect that many of these efforts are the result of Kelsey Hightower's Kubernetes the Hard Way. In this post we are emphatically going to do things the easy way, […]
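
For orientation only (a generic sketch, not the scripts from Chris's series), the bare kubeadm flow that any of these approaches ends up wrapping looks roughly like this; the addresses, token, and network manifest are placeholders:

```sh
# On the control-plane node: initialise the cluster
sudo kubeadm init --pod-network-cidr=10.244.0.0/16

# Make kubectl usable for the current user (as suggested by kubeadm's output)
mkdir -p "$HOME/.kube"
sudo cp /etc/kubernetes/admin.conf "$HOME/.kube/config"
sudo chown "$(id -u):$(id -g)" "$HOME/.kube/config"

# Install a pod network add-on of your choice
kubectl apply -f <pod-network-manifest.yaml>

# On each worker node: join using the token printed by kubeadm init
sudo kubeadm join <control-plane-ip>:6443 --token <token> \
    --discovery-token-ca-cert-hash sha256:<hash>
```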

And now we’re caught up on the series…for the moment, at least.

Comments closed

Deploying Azure Data Services via Terraform

Chris Adkin has started a series on deploying Azure Arc enabled Data Services. Part 1 serves as an introduction:

One of the most significant things to change the landscape for Azure data professionals will be the general release of Azure Arc enabled Data Services. To provide an expedient means of experiencing all that Azure Arc has to offer, Microsoft has come up with Jumpstart – a collection of GitHub repos for deploying Arc in different scenarios. Last Christmas I had a few vacation days and took the opportunity to try out Jumpstart for Azure Arc enabled data services on AWS. AWS was my choice because it made a certain amount of sense to try out Azure Managed SQL Server instances and Postgres Hyperscale on a cloud on which they are not natively available. After all, the whole point of Azure Arc enabled Data Services is to bring Azure to you on your terms if for any reason you cannot use the Azure cloud.

Part 2 gives us an introduction to Terraform:

Before diving into what the various Terraform modules that make up the Arc-PX-VMware-Faststart repo do, I'm going to provide an introduction to Terraform in this blog post. Terraform comes from HashiCorp; it is a tool that works on the principle of infrastructure-as-code. Resources are specified in what are called configuration files using HashiCorp Configuration Language (HCL) in a declarative manner, i.e. you state what you want and, to the best of its ability, Terraform attempts to create those resources for you. 'Providers' are used to create resources for particular types of entity; for example, you might use the local file, helm (the Kubernetes package manager), Azure, or VMware providers, and so on. Using providers requires plugins, most of which are provided by HashiCorp, but third parties can write their own plugins as well.
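
As a minimal, hypothetical illustration of those ideas (not taken from the Arc-PX-VMware-Faststart repo), a configuration file pins a provider and declares the resources you want; terraform init downloads the provider plugin, terraform plan shows what would change, and terraform apply makes it so.

```hcl
terraform {
  required_providers {
    local = {
      source  = "hashicorp/local"
      version = "~> 2.0"
    }
  }
}

# A provider handles one family of resources; here, files on the local machine.
# You declare the end state and Terraform works out how to get there.
resource "local_file" "example" {
  filename = "${path.module}/hello.txt"
  content  = "managed by terraform"
}
```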

Check out the first two posts in what promises to be an interesting series.

2 Comments

Using Terraform to Tag Created Date

John Martin has an interesting use case for tagging in Terraform:

One of the key properties missing from Azure resources, in my opinion anyway, is a CreatedDate. This can largely be overcome with Azure Policy, but what if you don't have access to create one that applies a timestamp tag at resource creation?

It is possible to use Terraform to tag the resource and set the value for when the resource is created. A little more work needs to go into it to ensure that, once it is set, Terraform does not overwrite it on subsequent deployments. But it is achievable and brings this under your control if needed.
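
One common way to implement this (a sketch of the general pattern, not necessarily John's exact code) is to stamp the tag with timestamp() and then tell Terraform to ignore just that tag on later runs:

```hcl
provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "example" {
  name     = "rg-createddate-demo"   # hypothetical resource group
  location = "westeurope"

  tags = {
    Environment = "demo"
    CreatedDate = timestamp()   # evaluated at apply time
  }

  lifecycle {
    # Keep the value written on the first apply; later plans will not
    # try to overwrite CreatedDate with a fresh timestamp.
    ignore_changes = [tags["CreatedDate"]]
  }
}
```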

Click through to see how.

Comments closed

Creating a Database Publish Profile in Visual Studio

Elizabeth Noble shows us how to create a database publish profile using Visual Studio:

One of our fears was always how to prevent losing data and critical data code. Publish profiles came to our rescue. We also found that some of our database code had specific values depending on the environment or contained references to other databases. Once again, publish profiles could solve these problems!

While I’d love to say that you could use ADS to manage your database projects, that just isn’t true right now. However, we have a way to help you get a publish profile created. If you don’t want to use Visual Studio yourself, you might want to ask your Developer friends real nice and see if they’d be willing to help you out.
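
To give a rough idea of the shape of the file before you click through, here is a hypothetical sketch of a .publish.xml; the database name, connection string, and SQLCMD variable are placeholders, not Elizabeth's sample:

```xml
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="Current" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <TargetDatabaseName>MyDatabase</TargetDatabaseName>
    <TargetConnectionString>Data Source=MyServer;Integrated Security=True;Persist Security Info=False</TargetConnectionString>
    <ProfileVersionNumber>1</ProfileVersionNumber>
    <!-- Options like these help guard against losing data and critical code -->
    <BlockOnPossibleDataLoss>True</BlockOnPossibleDataLoss>
    <DropObjectsNotInSource>False</DropObjectsNotInSource>
  </PropertyGroup>
  <!-- Environment-specific values and cross-database references go through SQLCMD variables -->
  <ItemGroup>
    <SqlCmdVariable Include="OtherDatabaseName">
      <Value>OtherDatabase</Value>
    </SqlCmdVariable>
  </ItemGroup>
</Project>
```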

Click through for a video and a sample of what a publish profile looks like.

Comments closed