Press "Enter" to skip to content

Category: DevOps

Using the master dacpac in Azure DevOps

Koen Verbeeck makes use of system databases in a database project:

I have a database project in Visual Studio. Inside the database, I use a couple of system views to fetch some metadata about tables. To make the project build successfully, you need to add a reference to the master database in the project.

That all works fine, but there’s a bit more you need to do before Azure DevOps can work with the file. Read on to learn what that extra step is.
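
To ground the scenario: those views read from the system catalog. Purely as an illustration (this is not Koen’s code), here is a minimal Python sketch with pyodbc of the kind of metadata query such views wrap; the driver, server, and database names are placeholders.

```python
# Hypothetical illustration of a metadata query against system catalog
# views, the sort of thing a database project might wrap in views.
# Connection details below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=localhost;DATABASE=MyDb;"
    "Trusted_Connection=yes;TrustServerCertificate=yes;"
)
rows = conn.execute("""
    SELECT s.name AS schema_name, t.name AS table_name, t.create_date
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
""").fetchall()
for row in rows:
    print(row.schema_name, row.table_name, row.create_date)
```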

Deploying SQL Scripts via Azure Release Pipelines

Meagan Longoria solves a problem:

We chose release pipelines over YAML pipelines because they were easier for the client to manage and pretty quick to set up. While I had done this before, I had a couple of new challenges:

– I was deploying to an Azure SQL managed instance that had no public endpoint.

– There were multiple databases for which there may or may not be a change script to execute in each release.

This took a bit longer than I expected, and I enlisted my coworker Bill to help me work through the final details.

Read on to see how Meagan and Bill solved the problem.
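
Meagan’s actual solution lives in release pipeline tasks, so read the post for the real thing; the “may or may not be a change script” logic, though, is roughly this sketch, assuming scripts are checked in as one file per database per release (the folder layout and the use of sqlcmd are my assumptions, not hers):

```python
# Hypothetical sketch: run each database's change script only if one
# exists for this release. The folder layout, database names, and the
# use of sqlcmd are assumptions for illustration.
import subprocess
from pathlib import Path

databases = ["Sales", "Inventory", "Reporting"]   # placeholder list
release_dir = Path("change-scripts/2024-01")      # placeholder release folder

for db in databases:
    script = release_dir / f"{db}.sql"
    if not script.exists():
        print(f"No change script for {db}; skipping.")
        continue
    subprocess.run(
        ["sqlcmd", "-S", "<managed-instance-host>", "-d", db,
         "-i", str(script), "-b"],   # -b: exit with an error code on failure
        check=True,
    )
```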

Azure ML and MLOps

I continue a series on Azure ML:

We ended the prior series with model deployment via the Azure ML Studio UI. This is entirely manual and UI-driven. Then, we looked at model deployment via manually-run notebooks. This is still manual but at least offers the possibility of automation as we control the code to run.

From there, we moved to model deployment via the Azure CLI and Python SDK. Now we have the capability to run, train, register, and deploy models via scripts. This leads to the next phase in the process, in which we can perform continuous integration and continuous deployment of models using a tool like Azure DevOps or GitHub Actions. This is where MLOps starts to shine.

Read on for a few thoughts about MLOps and software maturity.
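
As a small taste of the scripted stage in that progression, here is a minimal sketch using the v1 Python SDK (azureml-core) to register a trained model, the kind of step a CI/CD pipeline can then automate; the workspace config, path, and model name are placeholders:

```python
# Minimal sketch (Azure ML Python SDK v1, azureml-core): registering a
# trained model from a script. The config file, path, and model name
# below are placeholders.
from azureml.core import Workspace
from azureml.core.model import Model

ws = Workspace.from_config()  # reads config.json for subscription/workspace details
model = Model.register(
    workspace=ws,
    model_path="outputs/model.pkl",    # artifact produced by a training run
    model_name="expenses-regressor",   # hypothetical model name
    tags={"stage": "ci"},
)
print(model.name, model.version)
```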

VM Creation via ARM Template

Martin Schoombee keeps customer software separated:

As a consultant I work on at least a few projects at a time, and prefer to isolate my development environments by creating an Azure VM for each customer. Isolating the environments is great because I can focus on the software and setup I need for that customer, and will never be in a situation where VPN clients or different software versions clash with each other.

With my development environments in Azure I am truly mobile: I can work from anywhere and can lose my working machine at any point without much impact beyond getting another one.

Click through to see how Martin can do this with hardly a problem.
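
Martin drives the creation from an ARM template. Purely to illustrate launching such a deployment from a script, here is my own sketch with azure-mgmt-resource (not Martin’s setup); the subscription, resource group, template URI, and parameter names are all placeholders:

```python
# Hypothetical sketch: launching an ARM template deployment from Python
# with azure-mgmt-resource. All identifiers and the template URI are
# placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")
poller = client.deployments.begin_create_or_update(
    "rg-customer-x",        # one resource group per customer
    "dev-vm-deployment",    # deployment name
    {
        "properties": {
            "mode": "Incremental",
            "templateLink": {"uri": "https://example.com/vm-template.json"},
            "parameters": {"vmName": {"value": "vm-customer-x"}},
        }
    },
)
print(poller.result().properties.provisioning_state)
```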

Getting Started with Azure Bicep

Jonathan D’Aloia looks at Azure Bicep:

This is going to be the first of a few blogs in a series related to Azure Bicep. I will start the journey from the very beginning by showing you how to configure a local environment, all the way to automating Bicep deployments through multi-stage YAML pipelines, covering how you can scale your infrastructure quickly and effectively.

In this blog, I will give a brief introduction to Azure Bicep and will also cover the easiest way to configure an environment locally, ready to build and deploy your Bicep templates.

Read on for the setup portion of the series.
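
For a flavour of the local workflow the series begins with, here is a minimal sketch that drives the Azure CLI from Python; it assumes az is installed and you are logged in, and the file and resource group names are placeholders:

```python
# Minimal sketch of the local Bicep workflow, driven from Python via the
# Azure CLI. Assumes `az` is installed and logged in; file and resource
# group names are placeholders.
import subprocess

subprocess.run(["az", "bicep", "install"], check=True)  # one-time tooling setup
subprocess.run(["az", "bicep", "build", "--file", "main.bicep"], check=True)  # compile to ARM JSON
subprocess.run(
    ["az", "deployment", "group", "create",
     "--resource-group", "rg-bicep-demo",
     "--template-file", "main.bicep"],
    check=True,
)
```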

Temporal Tables and Azure DevOps Deployments

Rayis Imayev notes a problem with Azure DevOps deployments:

Here is one thing that still doesn’t work well: when you try to alter an existing temporal table and run the change through the [SqlAzureDacpacDeployment@1] DevOps task, whether the change adds a new column or modifies existing attributes within the table, your deployment will fail with the “This deployment may encounter errors during execution because changes to … are blocked by …’s dependency in the target database” error message.

Read on to see what causes this problem and what we can do to work around it.
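
Rayis covers the real cause and workaround in the post. As general background only, one common manual pattern for schema changes on temporal tables is to pause system-versioning around the change, sketched here from Python via pyodbc; this is not necessarily the fix the post lands on, and the table, column, and DSN names are placeholders:

```python
# General background, not necessarily the post's fix: pause
# system-versioning, change both the current and the history table,
# then re-enable versioning. Table/column names and the DSN are
# placeholders.
import pyodbc

conn = pyodbc.connect("DSN=target-db", autocommit=True)  # hypothetical DSN
for stmt in (
    "ALTER TABLE dbo.Orders SET (SYSTEM_VERSIONING = OFF)",
    "ALTER TABLE dbo.Orders ADD Region nvarchar(50) NULL",
    "ALTER TABLE dbo.OrdersHistory ADD Region nvarchar(50) NULL",
    "ALTER TABLE dbo.Orders SET (SYSTEM_VERSIONING = ON "
    "(HISTORY_TABLE = dbo.OrdersHistory))",
):
    conn.execute(stmt)
```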

Automating Pipeline Migration to Synapse via Azure DevOps

Kevin Chant deploys some Synapse pipelines:

In this post I want to cover how you can automate a pipeline migration to a Synapse workspace using Azure DevOps, as a follow-up to a previous post I did about one way to copy an Azure Data Factory pipeline to Synapse Studio.

Even though that post is good, it deserves a follow-up showing an automated way of doing it. I wanted to show that it can be done more gracefully.

And we all want to be graceful, right?
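
Kevin does this properly through Azure DevOps, so click through for the automated version. For a sense of the underlying copy step, here is a rough sketch against the two services’ REST APIs using requests and azure-identity; the subscription, resource group, factory, workspace, and pipeline names are placeholders:

```python
# Rough sketch of the copy step: read the pipeline definition from Data
# Factory's management API and PUT it to the Synapse workspace's dev
# endpoint. Every identifier below is a placeholder.
import requests
from azure.identity import DefaultAzureCredential

cred = DefaultAzureCredential()
# The two services use different token audiences.
mgmt_token = cred.get_token("https://management.azure.com/.default").token
syn_token = cred.get_token("https://dev.azuresynapse.net/.default").token

adf_url = ("https://management.azure.com/subscriptions/<subscription-id>"
           "/resourceGroups/rg-data/providers/Microsoft.DataFactory"
           "/factories/my-factory/pipelines/copy-sales?api-version=2018-06-01")
pipeline = requests.get(
    adf_url, headers={"Authorization": f"Bearer {mgmt_token}"}
).json()

syn_url = ("https://my-workspace.dev.azuresynapse.net"
           "/pipelines/copy-sales?api-version=2020-12-01")
resp = requests.put(
    syn_url,
    headers={"Authorization": f"Bearer {syn_token}"},
    json={"properties": pipeline["properties"]},  # reuse the definition as-is
)
resp.raise_for_status()
```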

MLOps on Databricks

Piotr Majer and Michael Shtelma complete a series on MLOps on Databricks:

This is the second part of a two-part series of blog posts that show an end-to-end MLOps framework on Databricks, which is based on notebooks. In the first post, we presented a complete CI/CD framework on Databricks with notebooks. The approach is based on the Azure DevOps ecosystem for the Continuous Integration (CI) part and the Repos API for the Continuous Delivery (CD) part. This post extends the presented CI/CD framework with machine learning, providing a complete MLOps solution.

Check it out.
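
The CD step the series describes hinges on the Repos API; here is a minimal sketch of that call in Python (the workspace URL, token, repo ID, and branch name are placeholders):

```python
# Minimal sketch of a Repos-API-driven CD step: point a Databricks
# Repo at a release branch so jobs pick up the new code. Host, token,
# repo ID, and branch name are placeholders.
import requests

host = "https://<databricks-instance>"
resp = requests.patch(
    f"{host}/api/2.0/repos/<repo-id>",
    headers={"Authorization": "Bearer <token>"},
    json={"branch": "release"},
)
resp.raise_for_status()
print(resp.json())
```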

Combining Azure DevOps and Databricks

Anna Wykes continues a series on DevOps for Databricks:

An Environment Variable is a variable stored outside of the Python script; in our instance it will be stored on the DevOps Agent running the DevOps Pipelines. Consequently, it is accessible to other scripts/programs running on the DevOps Agent. We will not cover DevOps Agents in this blog specifically; the simplest description is that they are the compute that runs your pipeline, normally a VM (Virtual Machine) or Docker container.

Read the whole thing.
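
On the script side, the pattern Anna describes is as simple as reading the variable back out of the environment; a minimal sketch (the variable names are placeholders):

```python
# The pattern described above: the pipeline sets a variable on the
# agent, and the script reads it as an ordinary environment variable.
# (Azure DevOps uppercases variable names and replaces '.' with '_'.)
import os

token = os.environ["DATABRICKS_TOKEN"]  # fails fast if the pipeline didn't set it
host = os.environ.get("DATABRICKS_HOST", "https://adb-example.azuredatabricks.net")
print(f"Targeting {host}")
```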

DevOps for Databricks

Anna Wykes starts off with bad news:

In this blog series I explore a variety of options available for DevOps for Databricks. This blog will focus on working with the Databricks REST API & Python. Why, you ask? Well, a large percentage of Databricks/Spark users are Python coders. In fact, in 2021 it was reported that 45% of Databricks users use Python as their language of choice. This is a stark contrast to 2013, in which 92% of users were Scala coders.

What is wrong with the world today?

Semi-seriously, though, do read Anna’s post, as it covers a variety of things you can do with the Databricks REST API, including cluster management and monitoring. I might be jumping the gun a bit, but I am a big fan of Gerhard Brueckl’s PowerShell module for Databricks for this kind of work.
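
For a concrete taste of the cluster-monitoring side, here is a minimal sketch of one such REST API call, listing the clusters in a workspace; the host and token are placeholders:

```python
# Minimal sketch of the kind of REST call the post builds on: listing
# clusters in a workspace. Host and token are placeholders.
import requests

host = "https://<databricks-instance>"
resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": "Bearer <token>"},
)
resp.raise_for_status()
for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["state"])
```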
