So how do you restore from Azure storage? You do so from a URL. Let’s take a look!
When you back up a database to Azure, there are two types of blobs that can be used: page blobs and block blobs. Due to price and flexibility, block blobs are the recommended choice. However, which type you used to perform the backup dictates how the restore is performed. Both methods require the use of a credential, so that information will need to be known before you can restore from Azure.
Click through for examples using both page blobs and block blobs.
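As a sketch of the block blob path (the storage account, container, and database names here are hypothetical), a restore from URL with a shared access signature needs a credential named after the container URL:

```sql
-- Block blobs: the credential name must match the container URL,
-- and the identity must be 'SHARED ACCESS SIGNATURE'.
CREATE CREDENTIAL [https://myaccount.blob.core.windows.net/backups]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = '<SAS token, without the leading ?>';

-- With the credential in place, restore straight from the URL.
RESTORE DATABASE MyDatabase
    FROM URL = 'https://myaccount.blob.core.windows.net/backups/MyDatabase.bak'
    WITH RECOVERY;
```

Page blob backups instead use a credential holding the storage account name and access key, and the restore names it explicitly with `WITH CREDENTIAL = '...'`.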
The thing about DBA myths is that they are generally widespread and widely believed. At least I believed this one until I posted What’s a Differential Backup? and Randolph West (b/t) pointed out that my belief that differential backups can’t get larger than full backups was incorrect. In fact, differential backups (like full backups) contain enough transaction log information to cover transactions that occur while the backup is taking place. So if the amount of data that needs to be backed up, combined with those transactions, requires more space than just the data ….
Read on for a demonstration.
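A minimal way to see this for yourself (database name and paths are placeholders) is to take a differential while a heavy write workload is running, then compare the recorded sizes in msdb:

```sql
-- Take a full backup as the differential base.
BACKUP DATABASE MyDatabase TO DISK = N'X:\Backups\MyDatabase_Full.bak';

-- ... modify most of the data, with a heavy write workload still running ...

-- The differential must capture the changed extents *plus* enough log
-- to cover the transactions running while it was being taken.
BACKUP DATABASE MyDatabase TO DISK = N'X:\Backups\MyDatabase_Diff.bak'
    WITH DIFFERENTIAL;

-- Compare the recorded sizes: type 'D' = full, 'I' = differential.
SELECT type, backup_size, backup_finish_date
FROM msdb.dbo.backupset
WHERE database_name = N'MyDatabase'
ORDER BY backup_finish_date DESC;
```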
This week I had the opportunity to learn how to use dbatools to automate backups, change recovery models, and discover additional dbatools commands from dbatools contributor Jess Pomfret.
The final tip I had for Bert was how to use Find-DbaCommand to help him find the commands he needed to complete his tasks.
A lot of the commands have tags, which is a good way to find anything relating to compression.
That was a nice collaboration.
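For reference, the two ways of searching look something like this (they run anywhere the dbatools module is installed):

```powershell
# Free-text search across command names and descriptions
Find-DbaCommand -Pattern backup

# Tag search, e.g. everything relating to compression
Find-DbaCommand -Tag Compression
```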
An error message has started appearing in the SQL Server Error Logs during a nightly full backup.
Could not clear ‘DIFFERENTIAL’ bitmap in database ‘Database1’ because of error 9002. As a result, the differential or bulk-logged bitmap overstates the amount of change that will occur with the next differential or log backup. This discrepancy might slow down later differential or log backup operations and cause the backup sets to be larger than necessary. Typically, the cause of this error is insufficient resources. Investigate the failure and resolve the cause. If the error occurred on a data backup, consider taking a data backup to create a new base for future differential backups.
Click through for the root cause and solution.
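The linked post has the full diagnosis, but error 9002 is the transaction log filling up, so a reasonable first check (Database1 stands in for whichever database threw the error) is what is preventing log reuse:

```sql
-- How full is each database's transaction log?
DBCC SQLPERF(LOGSPACE);

-- What is preventing the log from being truncated/reused?
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'Database1';
```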
Since a full backup reads every data page, it makes sense that we can ask SQL Server to calculate and compare each checksum during this operation. Even a differential backup reads all of the pages in any extent (a group of 8 data pages) in which any one page has changed, so it will also validate some pages that didn’t change.
Read on for a demonstration.
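Requesting the validation is just an option on the backup (database name and path are placeholders here):

```sql
-- Verify each page's checksum as it is read, and write a checksum
-- over the backup stream itself; the backup fails on any bad page.
BACKUP DATABASE MyDatabase
    TO DISK = N'X:\Backups\MyDatabase.bak'
    WITH CHECKSUM;

-- Later, re-verify the backup file without restoring it.
RESTORE VERIFYONLY
    FROM DISK = N'X:\Backups\MyDatabase.bak'
    WITH CHECKSUM;
```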
We are on the home stretch, and I have saved the last third of the series to cover some pretty cool/amazing commands. Not that the other 500+ commands aren’t cool/amazing, but these last few are used continuously in my environment and bring automation closer to your fingertips with PowerShell and SQL Server than ever before.
Have you ever been asked to take a database backup before a developer does a deployment? Have you ever been asked to back up a database in order to restore it to a development or other environment? How about: hey, can you take a quick transaction log backup? These can all be accomplished using Backup-DbaDatabase, changing only a couple of parameters each time.
This post is strictly about taking backups, but dbatools is also great at testing backups.
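As a sketch of those three requests (instance, database, and path names are made up), each one is the same command with different parameters:

```powershell
# Full backup before a deployment
Backup-DbaDatabase -SqlInstance sql01 -Database AppDb -Type Full -Path '\\backupshare\predeploy'

# Copy-only full backup, safe to restore elsewhere without
# disturbing the differential base
Backup-DbaDatabase -SqlInstance sql01 -Database AppDb -Type Full -CopyOnly

# A quick transaction log backup
Backup-DbaDatabase -SqlInstance sql01 -Database AppDb -Type Log
```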
It’s surprisingly difficult to get this information in SQL Server. In fact, I was quite stuck trying to figure out how to get it when I realized that the good people over at Brent Ozar Unlimited already do some checking on this for their sp_Blitz tool.
A quick look at the above code showed me that dbo.sysssispackages was what I was looking for. Combine this with:
1. Some hack-y SQL for the frequency in human-readable format, and
2. Some even more hack-y SQL to join on the SQL Agent Job name
And we had pretty much the XML for each Maintenance Plan and the SQL Agent Job schedule it ran under.
Shane has made the code available as well, so check it out if you have any maintenance plans you’re trying to understand and maybe even get away from.
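The core of the trick, as a sketch (the double cast is needed because packagedata is stored as an image column):

```sql
-- Maintenance plans are stored as SSIS packages in msdb;
-- cast packagedata to xml to read the plan definition.
SELECT p.name AS plan_name,
       CAST(CAST(p.packagedata AS varbinary(max)) AS xml) AS plan_xml
FROM msdb.dbo.sysssispackages AS p;
```

The hack-y part is joining this back to msdb.dbo.sysjobs, since the Agent jobs a plan creates are related only by naming convention.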
I was recently performing an in-place upgrade of SQL 2008 R2 to SQL 2014 on one of my client’s servers. I have done a ton of successful and uneventful in-place upgrades and was surprised when the upgrade failed with the error message: “Failed to create a new folder ‘X:\SQLBackups’. The specified path is invalid (for example, it is on an unmapped drive).” Over the years, this client had changed from sending all backups to a local drive to sending them to a network share, so the X drive no longer existed.
Read on for the solution.
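I won’t spoil the fix, but if you want to check what default backup path an instance is carrying around before an upgrade, something like this works (the registry path shown is the standard instance-relative one):

```sql
-- Newer versions expose the default backup path directly.
SELECT SERVERPROPERTY('InstanceDefaultBackupPath') AS default_backup_path;

-- Or read the registry value the instance uses.
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'BackupDirectory';
```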
Currently, the output from the following commands is supported:
You will run the above commands as you normally would, but pipe the output to ConvertTo-DbaTimeline, the same way you would with any other ConvertTo-* PowerShell function. The output is a string that, most of the time, you will save to a file using the Out-File command in order to open it in a browser.
Then, with the ConvertTo-DbaTimeline cmdlet, you can convert that into an HTML page which looks pretty good.
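A typical pipeline looks like this (instance name and output path are placeholders):

```powershell
# Turn a week of Agent job history into an interactive HTML timeline
Get-DbaAgentJobHistory -SqlInstance sql01 -StartDate (Get-Date).AddDays(-7) |
    ConvertTo-DbaTimeline |
    Out-File -FilePath C:\temp\JobTimeline.html

# Open the result in the default browser
Invoke-Item C:\temp\JobTimeline.html
```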
I have a client that uses a lot of disconnected log shipping on a few servers. They do this because they want reporting copies of databases that are actually hosted by software vendors. So, disconnected log shipping it is! We pull down files that have been FTP’d to the corporate FTP site, apply those logs locally, and leave the databases in standby so that they can be queried.
(If you want to review how to set up disconnected log shipping complete with FTP download scripts, all SQL jobs and log shipping monitoring, then hop on over to https://www.sqlrx.com/setting-up-disconnected-log-shipping/ to check it out!)
Of course, every so often there is a glitch, and one or all databases can get out of sync pretty easily. When we receive a notification that tlogs have not been restored in a while, the hunt is on to see what happened and get it corrected. Since I have disconnected log shipping set up, I can’t get a log shipping report from SSMS that will tell me the errors and the last log applied. So, I made a query that gives me the most recent file that was restored, along with the file name, date, database, etc. I can pull back all databases or, by uncommenting a line, filter by one or more databases.
Lori also includes a helpful script.
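If you want a feel for the shape of such a query before clicking through (Lori’s version has more columns), the restore metadata lives in msdb:

```sql
-- Most recent restores per database, with the file that was applied.
SELECT rh.destination_database_name AS database_name,
       rh.restore_date,
       bmf.physical_device_name AS restored_file
FROM msdb.dbo.restorehistory AS rh
JOIN msdb.dbo.backupset AS bs
    ON rh.backup_set_id = bs.backup_set_id
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bs.media_set_id = bmf.media_set_id
-- WHERE rh.destination_database_name IN (N'Database1')  -- uncomment to filter
ORDER BY rh.restore_date DESC;
```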