Press "Enter" to skip to content

Category: Deployment

Deploying Azure Data Services via Terraform

Chris Adkin has started a series on deploying Azure Arc enabled Data Services. Part 1 serves as an introduction:

One of the most significant things to change the landscape for Azure data professionals will be the general release of Azure Arc enabled Data Services. To provide an expedient means of experiencing all that Azure Arc has to offer, Microsoft has come up with Jumpstart – a collection of GitHub repos for deploying Arc in different scenarios. Last Christmas I had a few vacation days and took the opportunity to try out Jumpstart for Azure Arc enabled data services on AWS. AWS was my choice because it made a certain amount of sense to try out Azure Managed SQL Server instances and Postgres Hyperscale on a cloud they are not natively available on. After all, the whole point of Azure Arc enabled Data Services is to bring Azure to you on your terms if for any reason you cannot use the Azure cloud.

Part 2 gives us an introduction to Terraform:

Before diving into what the various Terraform modules that make up the Arc-PX-VMware-Faststart repo actually do, I’m going to provide an introduction to Terraform in this blog post. Terraform comes from HashiCorp; it is a tool that works on the principle of infrastructure-as-code. Resources are specified in what are called configuration files, written in HashiCorp Configuration Language (HCL) in a declarative manner, i.e. you state what you want and, to the best of its ability, Terraform attempts to create those resources for you. ‘Providers’ are used to create resources for particular types of entity – for example, you might use the local file, helm (the Kubernetes package manager), Azure, or VMware providers. Using providers requires plugins, most of which are provided by HashiCorp, but third parties can also write their own.
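To make that concrete, here is a minimal, hypothetical configuration using the hashicorp/local provider – you declare the file you want to exist, and Terraform makes it so:

```hcl
# main.tf – a minimal, hypothetical Terraform configuration.

terraform {
  required_providers {
    local = {
      source  = "hashicorp/local"
      version = "~> 2.0"
    }
  }
}

# The desired state: a file with this content. Running terraform apply
# creates it; remove the block and re-apply, and Terraform deletes it.
resource "local_file" "hello" {
  filename = "${path.module}/hello.txt"
  content  = "Managed by Terraform\n"
}
```

Running terraform init downloads the provider plugin, and terraform apply reconciles what actually exists with what is declared.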

Check out the first two posts in what promises to be an interesting series.


Using Terraform to Tag Created Date

John Martin has an interesting use case for tagging in Terraform:

One of the key properties missing from Azure resources, in my opinion anyway, is a CreatedDate. This can be largely overcome with Azure Policy, but what if you don’t have access to create one that applies a timestamp tag at resource creation?

It is possible to use Terraform to tag the resource and set the value for when the resource is created. There is a little more work that needs to go into it to ensure that, once it is set, Terraform does not overwrite it on subsequent deployments. But it is achievable and brings this into your control if needed.

Click through to see how.
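The heart of the pattern is Terraform’s lifecycle block: set the tag from timestamp() when the resource is first created, then tell Terraform to ignore that tag on later plans. Here is a rough sketch under those assumptions (not necessarily John’s exact code; the resource and names are hypothetical):

```hcl
resource "azurerm_resource_group" "example" {
  name     = "rg-created-date-demo"
  location = "uksouth"

  tags = {
    # timestamp() is evaluated at apply time, so on the first
    # apply this captures when the resource was created.
    CreatedDate = timestamp()
  }

  lifecycle {
    # Without this, every later apply would see a fresh timestamp()
    # value and try to rewrite the tag.
    ignore_changes = [tags["CreatedDate"]]
  }
}
```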


Creating a Database Publish Profile in Visual Studio

Elizabeth Noble shows us how to create a database publish profile using Visual Studio:

One of our fears was always how to prevent losing data and critical data code. Here, publish profiles came to our rescue. We also found that some of our database code had specific values depending on the environment or contained references to other databases. Once again, publish profiles could solve these problems!

While I’d love to say that you could use ADS to manage your database projects, that just isn’t true right now. However, we have a way to help you get a publish profile created. If you don’t want to use Visual Studio yourself, you might want to ask your Developer friends real nice and see if they’d be willing to help you out.

Click through for a video and a sample of what a publish profile looks like.


Combining SendTo and PowerShell

Mark Wilkinson shares a script with us:

If you are not familiar, SendTo options are those available when you right click on a file/folder in file explorer and select the Send To option in the menu. When you use this option, the currently selected files/folders are passed to the SendTo shortcut as a space delimited list of files and folders. This is important to know so you better understand what needs to be done to read that list.

I can confirm that this works well for deploying scripts out, especially when they need to go to multiple servers or multiple databases on servers. That functionality takes a bit more effort to write, but combine Mark’s code with Jess’s and you are well on your way.
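For a feel of the mechanics, here is a minimal, hypothetical SendTo target script – the deployment step is stubbed out, so see Mark’s post for the real logic:

```powershell
# Deploy-Scripts.ps1 – hypothetical SendTo handler.
# Create a shortcut in the shell:sendto folder pointing at:
#   powershell.exe -NoProfile -File "C:\Scripts\Deploy-Scripts.ps1"
# Explorer appends every selected file/folder to the command line,
# so $args contains one path per selected item.

foreach ($path in $args) {
    if (Test-Path -Path $path -PathType Container) {
        # A folder was selected: expand it to the .sql files inside.
        $files = Get-ChildItem -Path $path -Filter *.sql -Recurse |
            Select-Object -ExpandProperty FullName
    }
    else {
        $files = @($path)
    }

    foreach ($file in $files) {
        # Stub: replace with your deployment logic (e.g. Invoke-DbaQuery).
        Write-Host "Would deploy: $file"
    }
}
```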


Deploying Bacpacs to Azure SQL Database via Terraform

John Martin shows how to deploy a database schema (in bacpac format) via Terraform:

It’s all well and good deploying Azure SQL Database resources as we did in the previous post. However, databases tend to work a little better with a schema and some data in them. One of the options for getting data from an on-premises SQL Server database into Azure SQL Database is via a bacpac. This is, at its core, an export of the schema and data into a single file, which is then loaded into Azure SQL Database – much the same way a MySQL dump operates.

Read on for one way to do this.
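If you just want the flavor of it, the azurerm provider’s older azurerm_sql_database resource exposes an import block for bacpacs. A hypothetical sketch, assuming the resource group, server, and storage account already exist elsewhere in the configuration (this may well differ from John’s approach):

```hcl
# Hypothetical bacpac import; assumes azurerm_resource_group.rg and
# azurerm_sql_server.server are defined elsewhere in the configuration.
resource "azurerm_sql_database" "app_db" {
  name                = "app-db"
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  server_name         = azurerm_sql_server.server.name

  # Pull the schema and data in from a bacpac sitting in blob storage.
  import {
    storage_uri                  = "https://examplestorage.blob.core.windows.net/bacpacs/app-db.bacpac"
    storage_key                  = var.storage_access_key
    storage_key_type             = "StorageAccessKey"
    administrator_login          = var.sql_admin_login
    administrator_login_password = var.sql_admin_password
    authentication_type          = "SQL"
    operation_mode               = "Import"
  }
}
```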


Optimizing a SQL Server 2019 Project for a Dedicated SQL Pool

Kevin Chant shows us how we can modify a database schema intended for SQL Server 2019 to work best with an Azure Synapse Analytics dedicated SQL pool:

In this post I want to cover how you can transform your SQL Server database schema for a dedicated SQL Pool if you are using Azure DevOps, because I covered it at Data Toboggan over the weekend and it can be very useful.

By the end of this post, you will know one way you can transform the schema of a database project for SQL Server 2019 if you are using Azure DevOps, so that you can make it optimal for dedicated SQL Pools.

Click through for the process and an example. Note that this isn’t a quick “check this box and you’re done” type of solution, but if you already have a proper star schema, this will help you think through some of the things you’ll need to do.
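To give one concrete example of the kind of change involved (my illustration, not from Kevin’s post): a dedicated SQL pool wants a distribution option on each table, which a SQL Server 2019 project will not contain:

```sql
-- Hypothetical fact table after being adapted for a dedicated SQL pool.
-- The WITH clause is Synapse-specific and would be rejected by SQL Server 2019.
CREATE TABLE dbo.FactSale
(
    SaleKey     BIGINT         NOT NULL,
    CustomerKey INT            NOT NULL,
    SaleAmount  DECIMAL(18, 2) NOT NULL
)
WITH
(
    DISTRIBUTION = HASH (CustomerKey),  -- co-locate rows for joins on CustomerKey
    CLUSTERED COLUMNSTORE INDEX         -- typical storage for large fact tables
);
```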


Deploying ADS Database Projects Manually

Elizabeth Noble continues a series of videos on database projects in Azure Data Studio:

This week, we’ll talk about one of the easier ways to deploy your database changes. One of the benefits of database projects is that they can generate data-tier applications (DAC). The data-tier applications can be bundled into what is called a DACPAC. This is a collection of files that can be used to deploy your database.
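For reference, a DACPAC can also be published outside of Azure Data Studio with the SqlPackage utility – a hypothetical example, with all paths and names as placeholders:

```powershell
# Publish a DACPAC to a local SQL Server instance with SqlPackage.
# Assumes SqlPackage is on the PATH; every name here is a placeholder.
& SqlPackage /Action:Publish `
    /SourceFile:"C:\build\MyDatabase.dacpac" `
    /TargetServerName:"localhost" `
    /TargetDatabaseName:"MyDatabase"
```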

Click through for the video.


Issues Deploying Azure Synapse Analytics via ARM Template

Paul Andrew hits on some growing pains:

Just last week we heard the announcement from Microsoft that Azure Synapse Analytics is now generally available (GA)… A full year on, plus a few weeks, since first seeing Synapse at the big USA conferences in November 2019.

Today I’ve been attempting to use the resource with a view to implementing it for several customer projects. Although GA, it would seem that many parts of the technology are far from ready.

In this brief blog I’m exposing some of the pain I’ve faced so far in simply trying to deploy a second instance of Azure Synapse Analytics using ARM templates.

Click through for three issues that Paul found. I’d expect that most of these will be tidied up in the next few months.


Deploying to Multiple SQL Server SKUs with Azure DevOps

Kevin Chant wants to deploy to all of the SQL Servers:

To give the above pipeline a bit more context, the below types of SQL Server databases were updated after being unit tested (initial unit testing YAML courtesy of Sander):

– Three SQL Server 2019 instances in three Docker containers, representing Integration, Staging, and Production environments.
– At the same time, the Git repository in Azure Repos would sync with a GitHub repo, which would then start a GitHub Action to update another database.
– An Azure SQL Database.
– Finally, a Synapse Analytics SQL Pool was also updated.

Read on to learn how.
