Press "Enter" to skip to content

Category: Backups

Managing File Retention in Blob Storage

Jeet Kainth shows how to configure a retention strategy in Azure Blob Storage:

This blog is a follow-up to a previous blog I wrote about backing up Azure Analysis Services cubes in Azure; that blog can be found here. This blog shows how to implement a retention policy using PowerShell in Azure Runbooks to remove the backups after a set number of days. To create a new Runbook in the Azure portal, go to the relevant Automation account in the relevant resource group and then select Runbooks from the left-hand pane. Note that you will need to add the Az.Storage module to the Automation account to be able to use some of the commands listed in this blog.

Click through for the process, including PowerShell code to perform the task.
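
As a rough illustration of the general idea (not Jeet's actual script; the storage account, key, container, and 30-day window below are all placeholders), a retention pass with the Az.Storage module looks something like this:

# Illustrative sketch only; account name, key, container, and retention window are placeholders
Import-Module Az.Storage

$retentionDays = 30
$cutoff = (Get-Date).ToUniversalTime().AddDays(-$retentionDays)

# Build a storage context; a Runbook would more likely authenticate with its managed identity or Run As account
$context = New-AzStorageContext -StorageAccountName 'mybackupaccount' -StorageAccountKey $storageAccountKey

# Remove any backup blob older than the retention window
Get-AzStorageBlob -Container 'aas-backups' -Context $context |
    Where-Object { $_.LastModified.UtcDateTime -lt $cutoff } |
    Remove-AzStorageBlob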

Comments closed

Restoring a Database from Backup in Docker

Chad Baldwin has a container in search of a database:

Yesterday, I was watching a Pluralsight course which provided a database .bak file to follow along with the examples. I generally like to use Docker when working with SQL Server locally…but as a somewhat novice user, I have found it to be a bit of a pain if you need to deal with restoring or attaching a database.

When I run into these scenarios, I usually spin up an AWS EC2 instance, install SQL Server, and work with it that way. There’s probably a simpler way to do it using RDS or Azure, but I’m not familiar with those just yet. The other option, if I have a Linux machine at hand, is to use that with Docker; mapped volumes work great there.

I do happen to have a Linux machine ready to use…but I was determined to figure out how to get this working on Windows.

Bonus points for using RESTORE DATABASE syntax. Every SQL Server user should know how to back up and restore a database using only T-SQL. That’s a skill which will definitely pay dividends.
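
If you want to try the same thing against a local container, the rough shape of it (assuming a SQL Server 2019 Linux container named sql2019; the password, database, and file names are made up, and Chad's post has the real walkthrough) is:

# Sketch only: container name, sa password, and database/file names are placeholders
docker exec sql2019 mkdir -p /var/opt/mssql/backup
docker cp .\MyDatabase.bak sql2019:/var/opt/mssql/backup/MyDatabase.bak
# If SQL Server cannot read the copied file, fix ownership: docker exec -u root sql2019 chown mssql /var/opt/mssql/backup/MyDatabase.bak

# Check the logical file names inside the backup first
docker exec sql2019 /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'YourStrong!Passw0rd' `
    -Q "RESTORE FILELISTONLY FROM DISK = '/var/opt/mssql/backup/MyDatabase.bak';"

# Restore, moving the data and log files to paths that exist inside the container
docker exec sql2019 /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'YourStrong!Passw0rd' `
    -Q "RESTORE DATABASE MyDatabase FROM DISK = '/var/opt/mssql/backup/MyDatabase.bak' WITH MOVE 'MyDatabase' TO '/var/opt/mssql/data/MyDatabase.mdf', MOVE 'MyDatabase_log' TO '/var/opt/mssql/data/MyDatabase_log.ldf', RECOVERY;"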

Comments closed

Backing Up a Power BI Premium Database

Gilbert Quevauvilliers wants you to back that thing up:

Continuing with my series of using Power BI Premium Per User (PPU), today I am going to show you how to back up your PPU database.

As far as I am aware, all the options below will work for Power BI Premium as well.

To me this is critical when my dataset size grows, especially when it takes multiple days to process all the data into the required partitions.

Not only is having a backup a best practice; if something should go wrong with a deployment (let’s say I wipe out the partitions by mistake), it will be quick and easy to restore from a backup.

Read the whole thing.
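
For context, backups of a PPU dataset go over the workspace's XMLA endpoint and land as .abf files in the ADLS Gen2 storage account attached to the workspace. As one hedged illustration of that route (workspace, dataset, and file names below are placeholders, and MFA-enabled accounts or service principals need different authentication parameters), the SqlServer PowerShell module can drive it like so:

# Sketch only; see Gilbert's post for the full set of options and prerequisites
Import-Module SqlServer

# XMLA endpoint of the PPU workspace (placeholder workspace name)
$workspace = 'powerbi://api.powerbi.com/v1.0/myorg/MyPPUWorkspace'

# Simple interactive credential; MFA or service principal scenarios use other parameters
$cred = Get-Credential

Backup-ASDatabase -Server $workspace `
    -Name 'SalesDataset' `
    -BackupFile 'SalesDataset.abf' `
    -AllowOverwrite `
    -ApplyCompression `
    -Credential $cred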

Comments closed

Four DBA ToDos in a New Role

Lee Markum has a starting point for DBAs in a new role:

You’ve just been hired into a DBA role at a new company, or you’ve been given the DBA keys at your current company. Maybe you’re a SysAdmin and your boss has informed you that you are now supposed to manage the SQL Servers as well as everything else on your plate. In any of these situations, you may have some confidence in your skills, but especially in the case of being a new hire, you have absolutely no true idea of what you’re walking into.

In these scenarios, where do you start? Start with these four areas.

Click through for the four areas. I completely agree with Lee on these for DBAs, including the order.

Comments closed

Automatic Backups on a Data Lake or Lakehouse

Dave Ruijter backs that thing up:

Out of the box, Azure Data Lake Storage Gen2 provides redundant storage. Therefore, the data in your Data Lake(house) is resilient to transient hardware failures within a datacenter through automated replicas. This ensures durability and high availability. In this blog post, I provide a backup strategy on how to further protect your data from accidental deletions, data corruption, or any other data failures. This strategy works for Data Lake as well as Data Lakehouse implementations. It uses native Azure services, no additional tools, software, or licenses are required.

Read on for a detailed strategy.
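
Dave lays out the complete design in the post; purely to give a flavour of the kind of building block such a strategy can lean on, copying a lake filesystem to a separate, locked-down storage account with AzCopy (every name and SAS token below is a placeholder, not anything from his setup) looks like:

# Illustration only; account names, containers, and SAS tokens are placeholders
$source      = 'https://proddatalake.dfs.core.windows.net/curated?<source-sas-token>'
$destination = 'https://backupdatalake.dfs.core.windows.net/curated-backup?<destination-sas-token>'

# Recursive copy of the whole filesystem to the backup account
azcopy copy $source $destination --recursive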

Comments closed

Performing a Restore to SQL Managed Instance

Arun Sirpal shows us how to perform a backup and restoration from an on-premises SQL Server to Azure SQL Managed Instance:

So in the last blog we confirmed, via some analysis, that we could move to SQL MI; now it is time to actually do a backup and restore via URLs to move data.

Quite simply, you need to BACKUP to URL (an Azure Storage container), and the setup requirement is that you create a SQL credential that holds the SAS token – this is what allows authentication to the container to take place.

Click through for the process.
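
The T-SQL at the heart of it is compact. As a reminder of its shape (storage account, container, database name, and SAS token are placeholders; note that the SAS secret goes in without the leading question mark and the credential name must match the container URL exactly):

# Placeholders throughout; run the backup on the source instance, then RESTORE ... FROM URL on the managed instance
$tsql = @"
CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/sqlbackups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET   = '<sas-token-without-leading-question-mark>';
GO
BACKUP DATABASE [MyDatabase]
TO URL = 'https://mystorageaccount.blob.core.windows.net/sqlbackups/MyDatabase.bak'
WITH COMPRESSION, CHECKSUM, STATS = 10;
"@

Invoke-Sqlcmd -ServerInstance 'OnPremServer' -Database 'master' -Query $tsql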

Comments closed

The Reason for Tail Log Backups

Chad Callihan explains why we need tail log backups:

When you are migrating a database from one server to another, how can you be sure to back up all transactions? Sure, you can notify the client and let them know “there will be a short outage at 8AM so please stay out of the application at that time.” Can you really trust that? Of course not. Let’s demonstrate the steps needed to include all transactions with the tail-log backup.

Protip: if you build your application such that nobody wants to use it, you can migrate the database much more easily. Assuming you don’t want to follow that outstanding advice, Chad has you covered.
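
The heart of it is the NORECOVERY option on that final log backup: it captures the tail of the log and leaves the source database in a restoring state so no further writes can slip in. A minimal sketch of just that piece (server, database, and path names are invented; Chad's post walks through the full sequence):

# Names and paths are placeholders; this is only the tail-log piece of the migration
$tailLogBackup = @"
BACKUP LOG [MyDatabase]
TO DISK = N'\\backupshare\MyDatabase_tail.trn'
WITH NORECOVERY, INIT;
"@
Invoke-Sqlcmd -ServerInstance 'OldServer' -Query $tailLogBackup

# On the new server, after restoring the full (and any earlier log) backups WITH NORECOVERY:
$tailLogRestore = @"
RESTORE LOG [MyDatabase]
FROM DISK = N'\\backupshare\MyDatabase_tail.trn'
WITH RECOVERY;
"@
Invoke-Sqlcmd -ServerInstance 'NewServer' -Query $tailLogRestore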

Comments closed

Troubleshooting a Slow Restore

Sean Gallardy performs corporate dentistry:

This came with very little to no data available, and to be quite honest, saying “slow restore” doesn’t really mean much. The initial analysis needs to be an actual set of concrete data that describes the issue, what is normal, and what outliers, if any, exist. Since we have none, we can’t even start to analyze anything, so we need to clarify the problem statement and understand a little more about the issue.

This is an interesting dive into the problem and a good example of how to work with “We won’t let you see/do that” as a consultant. Incidentally, if you haven’t heard of WPR, that comes with the Windows Performance Toolkit.
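
If you want to try a capture yourself, WPR's stock profiles are enough for a first pass; a rough example (built-in profile names, invented output path, elevated prompt assumed):

# Stock WPR profiles; the output path is a placeholder
wpr -start GeneralProfile -start DiskIO -filemode
# ...reproduce the slow restore...
wpr -stop C:\Temp\slow_restore.etl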

Comments closed