Press "Enter" to skip to content

Category: Backups

Long-Term Backups on Azure SQL Database

Arun Sirpal takes us through a fairly new feature in Azure SQL Database:

There is a new(ish) interface for viewing and configuring backups for your Azure SQL Database. This can be found within the settings section of the SQL Server.

As you can see, by default we have 7 days of retention to allow for PITR (Point-In-Time Recovery); for anything longer, you will need to set up long-term retention.

Click through to see how to set this up.

Saving Money on Backups to Azure Blob Storage

John McCormack has a few tips for saving some cash:

You have 5 databases on a SQL Server instance. You take daily full backups of each database on your instance. You also take log backups every 15 minutes, as each database is in the full recovery model. This means in 1 week, you will have 35 full backups and 3,360 transaction log backups. This multiplies to 1,820 full and 174,720 t-log backups over 52 weeks. Multiply this out over 7 years or more and the costs get very high.

Click through to see how you can save a considerable amount with a bit of planning.
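
John's specific tips are behind the link, but one lever worth knowing in this space is backup compression combined with backup to URL, which shrinks both the storage footprint and the bandwidth consumed. A minimal sketch, with the storage account, container, SAS token, and database name all as placeholders:

-- Credential name must match the container URL; the SAS token is a placeholder.
CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/sqlbackups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token>';

-- Compressed, checksummed backup straight to a block blob.
BACKUP DATABASE [SalesDB]
TO URL = 'https://mystorageaccount.blob.core.windows.net/sqlbackups/SalesDB_full.bak'
WITH COMPRESSION, CHECKSUM, STATS = 10;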

Building a Backup Plan

Greg Larsen takes us through the steps of developing out a backup plan:

You might be wondering why you need to develop a backup plan. Can’t a DBA just implement a daily backup of each database and call it good? Well, that might work, but it doesn’t consider how an application uses a database. If you have a database that is only updated with a nightly batch process, then having a daily backup of the database right after the nightly update process might be all that you need. But what if you had a database that was updated all day long from some online internet application? If you have only one backup daily for a database that gets updated all day online, then you might lose up to a day’s worth of online transactions if it was to fail right before the next daily backup. Losing a day’s worth of transactions most likely would be unacceptable. Therefore, to ensure minimal data loss occurs when restoring a database, the backup and recovery requirements should be identified first, before building a backup solution for a database.

The biggest non-secret here is that backup plans are ultimately business decisions rather than technical decisions. Greg then outlines several cases and provides considerations for each.
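
To make that recovery-point arithmetic concrete, here is a minimal sketch of my own (not Greg's; the database name and path are placeholders) showing the pieces that turn "up to a day of data loss" into roughly fifteen minutes:

-- Log backups require the FULL recovery model.
ALTER DATABASE [OrdersDB] SET RECOVERY FULL;

-- Scheduled every 15 minutes (for example, from a SQL Agent job), this caps
-- the worst-case data loss at roughly the log backup interval.
-- In practice, each log backup gets a unique, timestamped file name.
BACKUP LOG [OrdersDB]
TO DISK = N'D:\Backups\OrdersDB_Log.trn'
WITH COMPRESSION, CHECKSUM;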

Finding the Physical Path of a SQL Server Backup on a Container

Jack Vamvas is looking for files in all the wrong places:

I’m migrating some SQL Server databases to OpenShift containers. The SQL Server is set up with persistent disk, with a dedicated persistent disk partition for the SQL Server default backup directory. I don’t have access to the underlying files and can only use the command line. How can I get the physical disk device, which will then allow me to create a RESTORE DATABASE statement pointing to the device?

Read on for the answer, including a T-SQL script to find where these files live.
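
Jack's exact script is behind the link; one common approach (an assumption on my part, not necessarily his) is to read the backup history in msdb, which records the physical device name of every backup the instance has taken:

-- Backup history, including the physical path of each backup file.
SELECT TOP (20)
       bs.database_name,
       bmf.physical_device_name,
       bs.backup_finish_date,
       bs.type AS backup_type   -- D = full, I = differential, L = log
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
     ON bs.media_set_id = bmf.media_set_id
ORDER BY bs.backup_finish_date DESC;

-- Newer versions also expose the instance's default backup directory:
SELECT SERVERPROPERTY('InstanceDefaultBackupPath') AS default_backup_path;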

An Intro to Backup Strategy with SQL Server

Pamela Mooney takes us through backup and restoration as part of a DBA in training series:

A DBA’s job is getting the right data to the right people as quickly as possible.

Consider that first part – getting the right data. How do you do that? By having the data in the first place. The best way to ensure that you have the data is to look at your company’s needs, recommend availability options, and most importantly, do backups. Few things will get a DBA fired more quickly than not having backups available when they are needed. It may be years before you ever have to restore a database from a backup, but the backups had better be there when you need to do one.

Conference sessions and books have been written on the subjects of backup/restores, SLAs, and availability options, and they are so good that I’ll just give you a brief overview here. You’ll gravitate to your need-to-learns soon enough.

This is a nice overview for a beginner. One semi-related piece of advice that I picked up from Sean McCown: know that backup and restore syntax cold. Take backups manually (even if they’re just on a test server intended for this purpose) every day until you feel comfortable typing out backup and restore syntax from memory. There will come a time when three levels of management are standing over your shoulder waiting for a database to restore and SSMS’s GUI is crawling. At that point, knowing the syntax cold will be completely worthwhile.
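
For reference, the bare-bones version of that syntax looks like this; database names, logical file names, and paths are all placeholders:

BACKUP DATABASE [SalesDB]
TO DISK = N'D:\Backups\SalesDB.bak'
WITH COMPRESSION, CHECKSUM, STATS = 10;

-- Find the logical file names needed for WITH MOVE.
RESTORE FILELISTONLY FROM DISK = N'D:\Backups\SalesDB.bak';

RESTORE DATABASE [SalesDB_Restored]
FROM DISK = N'D:\Backups\SalesDB.bak'
WITH MOVE N'SalesDB' TO N'D:\Data\SalesDB_Restored.mdf',
     MOVE N'SalesDB_log' TO N'D:\Logs\SalesDB_Restored.ldf',
     STATS = 10;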

Backing Up the Service Master Key

William Assaf takes us through backing up important keys in SQL Server:

You should consider complementary backup solutions that back up or snapshot the entire server (or VM) for SQL Server, but sometimes these technologies are limited or have too much of an impact on the server. A whole-VM snapshot, for example, that is reliant on VSS could incur an unacceptably long I/O stun duration when it occurs.

Regardless, in all cases, SQL Server backups of each database should be taken regularly. This is a conversation for another blog post, but a typical pattern is weekly full backups, nightly differential backups, and, for databases not in the SIMPLE recovery model, 15-minute transaction log backups.

Read the whole thing.
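
For reference, the syntax itself is short; the file path and password below are placeholders, and both the backup file and the password belong somewhere safer than the server they protect:

BACKUP SERVICE MASTER KEY
TO FILE = N'D:\Keys\ServiceMasterKey.smk'
ENCRYPTION BY PASSWORD = N'<strong password>';

-- And to bring it back on a rebuilt or migrated instance:
RESTORE SERVICE MASTER KEY
FROM FILE = N'D:\Keys\ServiceMasterKey.smk'
DECRYPTION BY PASSWORD = N'<strong password>';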

Copying an Azure SQL Database

Garry Bargsley gives us two methods for copying an Azure SQL Database:

Copying an Azure SQL Database is a vital skill when managing cloud databases. Recently, a request was received from the “business”. They wanted to create a copy of an Azure SQL Database that was in a development environment. The database has been certified and early testing was accepted. They now want an exact copy in QA to start integration testing. The process of making an Azure SQL Database copy is straightforward. There are several different ways to perform this action.

The two methods chosen use the Azure Portal and PowerShell to demonstrate the completion of this request.

Click through for the demos.
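
Garry's demos cover the Azure Portal and PowerShell. As a third option (my addition, not from the post), the same copy can be kicked off in T-SQL from the master database of the destination logical server; server and database names here are placeholders:

-- Run in master on the destination logical server.
CREATE DATABASE [MyDatabase_QA]
AS COPY OF [dev-sqlserver].[MyDatabase];

-- Track the copy's progress from master.
SELECT * FROM sys.dm_database_copies;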

Five Methods to Take Backups

Lee Markum gives us five different methods for taking backups in SQL Server:

For those readers who like the SSMS GUI, there is good news. SQL Server Management Studio offers a fairly straightforward method for backing up a database. Once you connect to the SQL Server instance containing the database you want to back up, left-click the “+” sign next to the Databases folder. Then right-click the name of the database you want to back up, choose Tasks, then choose the Backup option in the fly-out menu.

If you are a DBA, it is really important to know how to take and restore backups using T-SQL or PowerShell alone. That goes double for restores: in the highest-pressure situations, waiting for the UI to load can be painful.

The Problem with VM Backups of SQL Server

Sean Gallardy turns a problem on its head:

Now let’s get to the main point, which is how long the VM stays paused or stunned – remember, this is a “small” or “short” amount of time, one might even say “trivial”. When it is kept this short to where it’s “trivial”, as in less than a second, then all is good and you most likely won’t notice it except in very high workloads… but we should be running with VSS integration and not VM level, so it’s still incorrect, but hey. When this time is not short or trivial, then GOOD things start to happen, most notably that high availability kicks in.

I appreciate the framing of this post, as the failover wasn’t the problem; it merely exposed the actual problem.
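
Not from Sean's post, but if you are unsure whether VM-level, VSS-based snapshots are already hitting an instance, searching the error log for the I/O freeze messages they leave behind is a quick check:

-- Look for the freeze/resume pairs that VSS snapshot backups write to the error log.
EXEC sp_readerrorlog 0, 1, N'I/O is frozen';
EXEC sp_readerrorlog 0, 1, N'I/O was resumed';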

The Cost of Verifying Backups to Azure

Matt Robertshaw reminds us that TANSTAAFL:

Within two weeks of backups being written to Blob Storage, we observed a significant upward trend in cost associated with a Storage Account. When compared to the previous month, there was an increase of c. £270. After some further analysis, we were able to associate this increase with “bandwidth” charges. This didn’t feel right – you don’t pay anything to upload data to Azure (ingress); you only pay when downloading data from Azure (egress).

Using Azure Monitor, we profiled the ingress and egress rates for the affected Storage Account and noticed the following pattern:

Each day, c. 150GB of backups were being written to blob storage (in blue), but shortly after, the same amount was being downloaded (in red). Over this period, we calculated 4TB of data had been uploaded and then downloaded again. Based on Microsoft’s latest bandwidth pricing, whilst the first 5GB of egress per month is free, the next 5GB – 10TB is charged at £0.065 per GB. Some simple maths confirms it to be the additional £270 we observed.

Read on for three possible solutions. My preference for an on-prem solution would be to verify locally and then push to Blob Storage / S3. Backups tend to be faster that way as well, as your disk is likely faster than your network.
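
A rough sketch of that "verify locally, then push" idea, with a placeholder path and the assumption that the backup was taken locally WITH CHECKSUM; the upload step is whatever transfer mechanism you already use:

-- Verify against the local copy before it leaves the building, so no egress is incurred.
RESTORE VERIFYONLY
FROM DISK = N'D:\Backups\SalesDB.bak'
WITH CHECKSUM;
-- Only after this succeeds does the file get pushed to Blob Storage / S3.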
