Category: Deployment

Problems with Power BI’s Publish to Web

Adam Saxton explains when you might not want to use the Publish to Web option in Power BI:

Some don’t realize that Power BI Publish to Web is not secure. Adam shows you that this is the case. It’s a bit scary, and there are other options for secure embedding.

For demos and other resources which are supposed to be accessible to everybody, Publish to Web works great. But if you’re deploying company dashboards, not so much.


Building dacpac Files on Non-Windows Machines

Erik Ejlskov Jensen provides another advantage for Azure SQL Database’s database projects:

For a while now, it has been possible to publish a .dacpac file (meaning apply it to a new or existing database) using the cross-platform version of sqlpackage.

But authoring and building a database project (sqlproj) was only possible on Windows, as the .sqlproj project type is based on the classic .NET Framework .csproj project type.

Now, thanks to the new Database Project extension in the Azure Data Studio Insiders build, it is possible to author, build, and manually publish a SQL Server Database project.

And by using the new MsBuild.Sdk.SqlProj SDK and project type, it is also possible to build and publish a Database Project from a build agent (CI pipeline), without having to install the sqlpackage tool. Read on!

You heard Erik.
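
To make that concrete, here is a minimal sketch of what the build step can look like on a Linux or macOS agent, assuming a project file that uses the MsBuild.Sdk.SqlProj SDK; the project path and output folder are placeholders, not anything from Erik's post.

```powershell
# Placeholder paths; assumes the project file uses the MsBuild.Sdk.SqlProj SDK,
# so a plain dotnet build is enough to produce the .dacpac on any OS.
dotnet build ./src/MyDatabase/MyDatabase.csproj -c Release -o ./artifacts

# The built .dacpac lands in the output folder alongside the assembly.
Get-ChildItem ./artifacts -Filter *.dacpac
```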


Release Rollback with Helm

Andrew Pruski shares the secret of how Helm lets you roll back releases even after the old ReplicaSets are gone:

If we roll back with kubectl rollout undo, the pods in the newest replicaset are deleted, and pods in an older replicaset are spun back up, rolling back the upgrade.

But there’s a potential problem here. What happens if that old replicaset is deleted?

If that happens, we wouldn’t be able to roll back the upgrade. Well, we wouldn’t be able to roll it back with kubectl rollout undo, but what happens if we’re using Helm?

Read on to learn how the whole thing works.
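
The short version, as a hedged sketch: Helm keeps its own record of every revision of a release (stored in the cluster), so a rollback re-applies the old manifests rather than depending on an old ReplicaSet still existing. Release and deployment names below are made up.

```powershell
# Placeholder release/deployment names -- not from Andrew's demo.
helm history my-release                    # Helm's own record of each revision
helm rollback my-release 2                 # re-apply revision 2's manifests
kubectl rollout status deployment/my-app   # watch the rolled-back deployment come up
```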


The Importance of Power BI Deployment Pipelines

Marc Lelijveld explains the importance of Power BI Deployment Pipelines:

You might have seen the announcement: Power BI Deployment Pipelines were released in May 2020. The feature has been around for about two months now. On different social channels I have seen a lot of buzz around it already, both positive and negative, honestly. Still, I think this is a great step forward!

Back in 2018, I posted a blog about multi-tier architecture and continuous delivery with Power BI. If you are not familiar with a DTAP approach and why it helps you structure your development processes, I advise you to read that blog first. Personally, I am really excited about Deployment Pipelines! With this functionality, Microsoft starts offering out-of-the-box tooling that helps you move your Power BI content through your DTAP pipeline more easily.

I think it’s a pretty big step in the right direction, though the “Why this isn’t so great” section is a bit lengthy.


Methods for Deploying a dacpac

Erik Ejlskov Jensen shares some advice when deploying dacpac files:

I have previously blogged about using a SQL Server Database Project together with EF Core and also described a NuGet package that enables you to build a .dacpac with .NET Core, even on Linux and macOS.

So the two blog posts above cover development and build. The next step is deployment.

The main deployment mechanism for making changes to your database based on your recently built .dacpac file is the cross-platform sqlpackage command-line tool.

You can, depending on your requirements, take advantage of several of the available actions this tool provides.

Read on to see two methods for deployment.
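
For flavor, here is roughly what the sqlpackage side can look like; these are not necessarily Erik's exact two methods, and the server, database, and file names are placeholders.

```powershell
# Option 1: apply the .dacpac directly to the target database.
sqlpackage /Action:Publish `
    /SourceFile:./artifacts/MyDatabase.dacpac `
    /TargetServerName:localhost `
    /TargetDatabaseName:MyDatabase `
    /TargetUser:sa /TargetPassword:$env:SA_PASSWORD

# Option 2: generate the upgrade script instead, so someone can review it before it runs.
sqlpackage /Action:Script `
    /SourceFile:./artifacts/MyDatabase.dacpac `
    /TargetServerName:localhost `
    /TargetDatabaseName:MyDatabase `
    /TargetUser:sa /TargetPassword:$env:SA_PASSWORD `
    /OutputPath:./artifacts/deploy.sql
```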


Publishing a dacpac with .NET Core

Erik Ejlskov Jensen shows how to deploy a Visual Studio database project from .NET Core:

In this post, I will describe how you can build a SQL Server Database project in order to create a .dacpac file, using .NET Core only – dotnet build.

For a while now, it has been possible to publish a .dacpac (meaning apply it to a new or existing database) using the cross-platform version of sqlpackage.

But building a database project (.sqlproj) was only possible on Windows, as the .sqlproj project type is based on the classic .NET Framework .csproj project type.

However, thanks to a smart open source effort, you can now also build a .dacpac file, even on a Mac or Linux build agent.

Read on to learn more.
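
Putting build and deploy together on a non-Windows machine might look roughly like this; the paths and the connection string variable are placeholders rather than anything from Erik's post.

```powershell
# Build the .dacpac with nothing but the .NET CLI...
dotnet build ./src/MyDatabase/MyDatabase.csproj -c Release -o ./artifacts

# ...then hand it to the cross-platform sqlpackage to apply it.
# TARGET_CONNECTION_STRING is a placeholder environment variable.
sqlpackage /Action:Publish `
    /SourceFile:./artifacts/MyDatabase.dacpac `
    /TargetConnectionString:$env:TARGET_CONNECTION_STRING
```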


Deleting Old Build Definitions in Azure DevOps

Mark Broadbent solves a problem for us:

I have been experiencing a problem for quite a while now in my current environment, in that some of our old builds cannot be deleted. When you attempt to do so, it results in the following error:

One or more builds associated with the requested pipeline(s) are retained by a release. The pipeline(s) and builds will not be deleted.

Many of our pipelines have undergone a lot of change over time, to the degree that it is not even obvious anymore why (or indeed where) these builds are being prevented from being dropped. The only thing that is clear is that until they can be deleted, the old build definitions will remain.

Regardless of the reason why, Mark has the answer for how.
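
Mark's fix is worth reading in full; as a rough, hedged sketch of the investigation side only, the Build REST API can at least tell you which builds a definition still retains. The organization, project, definition ID, and PAT variable below are all placeholders.

```powershell
# Placeholders throughout -- adjust the org, project, definition ID, and PAT env var.
$org          = "https://dev.azure.com/yourorg"
$project      = "YourProject"
$definitionId = 42
$token        = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$($env:AZDO_PAT)"))
$headers      = @{ Authorization = "Basic $token" }

$builds = Invoke-RestMethod -Headers $headers `
    -Uri "$org/$project/_apis/build/builds?definitions=$definitionId&api-version=6.0"

# keepForever / retainedByRelease are the flags that typically block deletion.
$builds.value | Select-Object id, buildNumber, keepForever, retainedByRelease | Format-Table
```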


CI/CD with Databricks

Sumit Mehrotra takes us through the continuous integration story around Databricks:

Development environment – Now that you have delivered a fully configured data environment to the product (or services) team in your organization, the data scientists have started working on it. They are using the data science notebook interface that they are familiar with to do exploratory analysis. The data engineers have also started working in the environment and they like working in the context of their IDEs. They would prefer a connection between their favorite IDE and the data environment that allows them to use the familiar interface of their IDE to code and, at the same time, use the power of the data environment to run through unit tests, all in the context of their IDE.

Any disciplined engineering team would take their code from the developer’s desktop to production, running through various quality gates and feedback loops. As a start, the team needs to connect their data environment to their code repository on a service like git so that the code base is properly versioned and the team can work collaboratively on the codebase.

This is more of a conceptual post than a direct how-to guide, but it does a good job of getting you on the right path.
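
Since the post stays conceptual, here is one hedged illustration of a single concrete step in that story: a CI job pushing the versioned notebooks from the repo into a workspace with the databricks CLI. The workspace folder and local path are made up.

```powershell
# Assumes the databricks-cli package is installed and authenticated
# (in CI, typically via the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables).
pip install databricks-cli

# Push the repo's notebooks folder into a workspace directory, overwriting what's there.
databricks workspace import_dir ./notebooks /Shared/sales-project --overwrite
```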


Scripting and Deploying SQL Agent Jobs

Alex Yates shows how you can incorporate SQL Agent jobs in your CI/CD process:

Basically, we need to put all the SQL Agent Job .sql scripts into a git repo. Then we need a PowerShell script that executes each .sql script against the necessary target databases. If you use SSDT, you might prefer to use a post-deployment script to do this. That bit should be reasonably straightforward. I’ll leave that as a task for the user since I’m short on time.

You probably want to put some thought into whether your agent jobs are scoped to a particular database, to general server admin for a specific server, or whether you want them to be standardised across many servers, since this may affect where you choose to put your jobs in source control and on what schedule you want to deploy them.

It may also make sense to set up MSX if you have a central server. That would make Agent job deployment easier and you can still script out which sets of servers get which jobs.
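
As a starting point for the PowerShell piece Alex describes, something like this minimal sketch would do; the server list and script folder are placeholders, and it assumes the SqlServer module for Invoke-Sqlcmd.

```powershell
# Deploy every Agent job script in the repo to each target server.
Import-Module SqlServer

$servers = @("SQL01", "SQL02")
$scripts = Get-ChildItem -Path ./agent-jobs -Filter *.sql

foreach ($server in $servers) {
    foreach ($script in $scripts) {
        # Agent job definitions live in msdb, so run the scripts there.
        Invoke-Sqlcmd -ServerInstance $server -Database msdb -InputFile $script.FullName
        Write-Host "Deployed $($script.Name) to $server"
    }
}
```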


Release Flow Branching and Database DevOps

Kendra Little explains why the Azure DevOps Release Flow model can work well for database activity:

But how do you use branches? It’s helpful to pick a strategy. There are many fine Git branching strategies out there, things like GitFlow and GitHub Flow and more — enough that it’s overwhelming to learn about these when you are just starting out.

The strategy that I recommend for folks who are starting out with database DevOps and Git is the Azure DevOps team Release Flow model with dedicated development databases. (Why dedicated development databases? Read more here.)

Read on to learn why.
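
For reference, the general shape of Release Flow, as a hedged sketch with placeholder branch names rather than anything specific from Kendra's post: topic branches come off the main branch and merge back via pull request, release branches are cut from it when you ship, and fixes land in main first and are cherry-picked across.

```powershell
# Branch names are placeholders.
git checkout -b topic/add-customer-index main   # work happens on a short-lived topic branch
# ...commit, push, and merge via pull request into main...

git checkout -b release/2020.07 main            # cut a release branch when you ship
# Fixes go into main first, then get cherry-picked into the release branch:
# git cherry-pick <commit-sha>
```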
