Category: DevOps

The Value of MLOps

Tori Tompkins explains what MLOps is and why it’s valuable:

An ML project will typically begin in an ‘Explore Phase’ where a data scientist or team of data scientists will explore the data they currently have and experiment with models, algorithms, parameters and features. MLOps at this stage is responsible for supplying data scientists with the environment they need to achieve this. One way this can be done is by leveraging a feature store.

A feature store is a tool for storing commonly used features. As data scientists create new features, they can log these into feature stores such as Feast and Databricks Feature Store and reuse them across teams and projects. This benefits teams in multiple ways: reducing compute time for both training and inference, providing consistency in common features, and reducing the effort needed to create complex logic.
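
To make the feature store idea concrete, here is a minimal sketch of what reusing logged features can look like with Feast's Python SDK. The feature view and entity names (driver_stats, driver_id) are hypothetical, and API details vary between Feast versions.

```python
# A minimal sketch of reusing logged features with Feast's Python SDK.
# The feature view and entity names (driver_stats, driver_id) are hypothetical.
import pandas as pd
from feast import FeatureStore

store = FeatureStore(repo_path=".")  # feature repo kept under source control

# Entity keys and timestamps we want point-in-time-correct features for.
entity_df = pd.DataFrame(
    {
        "driver_id": [1001, 1002],
        "event_timestamp": pd.to_datetime(["2024-01-01", "2024-01-02"]),
    }
)

# Pull previously registered features for training -- no need to recompute them.
training_df = store.get_historical_features(
    entity_df=entity_df,
    features=["driver_stats:avg_daily_trips", "driver_stats:conv_rate"],
).to_df()

# The same registered features can be served at inference time from the online store.
online_features = store.get_online_features(
    features=["driver_stats:avg_daily_trips"],
    entity_rows=[{"driver_id": 1001}],
).to_dict()
```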

Read on for information about all six phases.

Deploying an Azure Function via Azure DevOps

Koen Verbeeck wants to deploy a Powershell-based Azure Function:

In the blog post Azure Function with PowerShell and the Power BI REST API I explained how you could create an Azure Function using the PowerShell scripting language. This Function connected with the Power BI REST API and retrieved the last refresh status of a dataset. Developing the Function is one thing, deploying it is another. In this blog post I’ll guide you through the set-up of a build and release pipeline in Azure DevOps. As a prerequisite, the Azure Function and its dependencies (for example the requirements.psd1 file) are all checked into a Git repo. As a reminder, the folder structure looks like this:

Read on for the walkthrough.
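
The post itself is about the build and release pipeline, but as a reminder of what the Function does, here is a rough Python rendering of the underlying Power BI REST call (Koen's original is PowerShell; the group and dataset IDs and token acquisition are placeholders):

```python
# A rough Python rendering of the Function's behavior: ask the Power BI REST API
# for the most recent refresh of a dataset. group_id, dataset_id, and token
# acquisition are placeholders.
import requests

def get_last_refresh_status(token: str, group_id: str, dataset_id: str) -> str:
    url = (
        f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
        f"/datasets/{dataset_id}/refreshes?$top=1"
    )
    response = requests.get(url, headers={"Authorization": f"Bearer {token}"})
    response.raise_for_status()
    refreshes = response.json()["value"]
    # Status is typically "Completed", "Failed", or "Unknown" (in progress).
    return refreshes[0]["status"] if refreshes else "NoRefreshHistory"
```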

Microsoft.Build.Sql for Database Projects

Drew Skwiers-Koballa announces a new way of handling database projects:

Declarative development creates an environment where developers can focus on creating database objects while relying on the support of tooling locally and in deployment pipelines to manage applying the differential changes calculated on the current state of the target database. Developers create objects such as tables or stored procedures by writing their definition with CREATE statements in scripts that live in source control, just as if it were source code for any component of an application. Existing functionality for SQL projects in Visual Studio, Azure Data Studio, and VS Code provides developers with declarative development capabilities; however, the existing SQL project file format has a few limitations. With Microsoft.Build.Sql and SDK-style SQL projects, we look forward to unlocking new scenarios for your development practices.

It does sound interesting.

Keeping Secrets in Azure DevOps

Kevin Chant has a secret:

In this post I want to cover how you can keep your Azure Synapse secrets secret in Azure DevOps, because you need to do this if you are working with production deployments.

With this in mind, I want to raise more awareness about it and make sure others avoid putting secrets directly in their pipelines, like in the example below.

Read on to understand what options are available to you. My preference involves Key Vault references but there are alternatives available.
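
As one illustration of the Key Vault approach (a sketch, not necessarily Kevin's setup): rather than embedding a secret in the pipeline definition, resolve it at runtime from Key Vault. The vault and secret names below are hypothetical.

```python
# A sketch of resolving a secret from Azure Key Vault at runtime instead of
# embedding it in the pipeline definition. Vault and secret names are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential picks up the pipeline's service principal or managed identity.
credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://my-synapse-kv.vault.azure.net", credential=credential
)

sql_password = client.get_secret("synapse-sql-admin-password").value
# Pass sql_password to the deployment step; never write it to logs or source control.
```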

Using the master dacpac in Azure DevOps

Koen Verbeeck makes use of system databases in a database project:

I have a database project in Visual Studio. Inside the database, I use a couple of system views to fetch some metadata about tables. To make the project build successfully, you need to add a reference to the master database in the project.

That all works fine but there’s a bit more you need to do before Azure DevOps can work with the file. Read on to learn what that thing is.

Deploying SQL Scripts via Azure Release Pipelines

Meagan Longoria solves a problem:

We chose release pipelines over the YAML pipelines because it was easier to manage for the client and pretty quick to set up. While I had done this before, I had a couple of new challenges:

– I was deploying to an Azure SQL managed instance that had no public endpoint.

– There were multiple databases for which there may or may not be a change script to execute in each release.

This took a bit longer than I expected, and I enlisted my coworker Bill to help me work through the final details.

Read on to see how Meagan and Bill solved the problem.
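
As a rough sketch of the "run a change script only if one exists for this database" logic (not Meagan and Bill's actual implementation, which is in the post), something like the following could run on an agent that can reach the managed instance's private endpoint:

```python
# Illustrative only: run a change script per database when one exists for this
# release. Server, database, and credential values are placeholders; scripts
# containing GO batch separators would need to be split before execution.
import pathlib
import pyodbc

SERVER = "my-managed-instance.0000000000.database.windows.net"
DATABASES = ["SalesDb", "InventoryDb", "AuditDb"]
USER, PASSWORD = "deploy_user", "<from a pipeline secret or Key Vault>"

for db in DATABASES:
    script = pathlib.Path("change-scripts") / f"{db}.sql"
    if not script.exists():
        continue  # nothing to deploy for this database in this release

    connection = pyodbc.connect(
        f"DRIVER={{ODBC Driver 18 for SQL Server}};SERVER={SERVER};"
        f"DATABASE={db};UID={USER};PWD={PASSWORD};Encrypt=yes"
    )
    try:
        cursor = connection.cursor()
        cursor.execute(script.read_text())
        connection.commit()
    finally:
        connection.close()
```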

Azure ML and MLOps

I continue a series on Azure ML:

We ended the prior series with model deployment via the Azure ML Studio UI. This is entirely manual and UI-driven. Then, we looked at model deployment via manually-run notebooks. This is still manual but at least offers the possibility of automation as we control the code to run.

From there, we moved to model deployment via the Azure CLI and Python SDK. Now we have the capability to run, train, register, and deploy models via scripts. This leads to the next phase in the process, in which we can perform continuous integration and continuous deployment of models using a tool like Azure DevOps or GitHub Actions. This is where MLOps starts to shine.

Read on for a few thoughts about MLOps and software maturity.
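
For a sense of what a scriptable step looks like, here is a minimal sketch of registering a trained model with the v2 azure-ai-ml Python SDK, the kind of call an Azure DevOps or GitHub Actions job could invoke. The series may use a different SDK version, and all names below are placeholders.

```python
# A minimal sketch of a "register the trained model" step callable from CI/CD,
# using the v2 azure-ai-ml SDK. Subscription, workspace, and model names are placeholders.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import Model
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<aml-workspace>",
)

# Register the trained artifact so later deployment steps can reference it by name and version.
registered = ml_client.models.create_or_update(
    Model(name="churn-model", path="outputs/model.pkl", description="Registered from CI/CD")
)
print(registered.name, registered.version)
```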

VM Creation via ARM Template

Martin Schoombee keeps customer software separated:

As a consultant I work on at least a few projects at a time, and prefer to isolate my development environments by creating an Azure VM for each customer. Isolating the environments is great because I can focus on the software and setup I need for that customer, and will never be in a situation where VPN clients or different software versions clash with each other.

With my development environments in Azure I am truly mobile, can work from anywhere and can lose my working machine at any point without much impact beyond getting another one.

Click through to see how Martin can do this with hardly a problem.
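
One way to submit such a per-customer template programmatically (not necessarily how Martin does it) is the azure-mgmt-resource Python SDK; the template path, resource group, and parameter names below are placeholders.

```python
# A sketch of deploying a per-customer VM template via the azure-mgmt-resource SDK.
# Template path, resource group, and parameter names are placeholders.
import json
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import Deployment, DeploymentProperties

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

with open("templates/dev-vm.json") as f:
    template = json.load(f)

poller = client.deployments.begin_create_or_update(
    resource_group_name="rg-customer-contoso",
    deployment_name="dev-vm-deployment",
    parameters=Deployment(
        properties=DeploymentProperties(
            mode="Incremental",
            template=template,
            parameters={"vmName": {"value": "contoso-dev-vm"}},
        )
    ),
)
print(poller.result().properties.provisioning_state)
```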

Getting Started with Azure Bicep

Jonathan D’Aloia looks at Azure Bicep:

This is going to be the first of a few blogs in a series related to Azure BICEP. I will start the journey from the very beginning by showing you how to configure a local environment all the way to automating bicep deployments through multi-stage YAML Pipelines, covering how you can scale your infrastructure quickly and effectively.

In this blog, I will give a brief introduction to Azure BICEP and will also cover the easiest way to configure an environment locally ready to build and deploy your bicep templates.

Read on for the setup portion of the series.

Temporal Tables and Azure DevOps Deployments

Rayis Imayev notes a problem with Azure DevOps deployments:

Here is one thing that still doesn’t work well when you try to alter an existing temporal table and run this change through the [SqlAzureDacpacDeployment@1] DevOps task, whether this change is to add a new column or modify existing attributes within the table. Your deployment will fail with the “This deployment may encounter errors during execution because changes to … are blocked by …’s dependency in the target database” error message.

Read on to see what causes this problem and what we can do to work around it.
