Author: Kevin Feasel

Operating System Error 3

Stacy Brown provides common reasons why you might get Operating System Error 3:

Sometimes the users of SQL Backup Master may face the following error while executing the database backup job:

Msg 3201, Level 16, State 1, Line 1
Job Execution Error: Cannot open backup device ‘’ Operating System error 3 (The system cannot find the path specified.)

Now, there can be various possible reasons behind the occurrence of this error. Therefore, in the following sections, all possible reasons with their respective solutions are discussed. A user can refer to them to solve this SQL Server operating system error 3 (the system cannot find the path specified).

Click through for solutions to several potential causes of this error.
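
As a point of reference, here is a minimal T-SQL sketch of the scenario (the database name and path are placeholders, not anything from Stacy's post). The backup fails with operating system error 3 whenever the target folder does not exist or the SQL Server service account cannot reach it:

-- Placeholder names: this fails with Msg 3201 / operating system error 3
-- if D:\Backups does not exist or the SQL Server service account cannot write to it.
BACKUP DATABASE [AdventureWorks]
TO DISK = N'D:\Backups\AdventureWorks.bak'
WITH INIT, STATS = 10;

-- Creating the folder (or pointing TO DISK at a folder the service account can write to)
-- lets the same statement succeed.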

Explaining DTUs

Andy Mallon explains what a Database Transaction Unit is:

I’d like to point out that the definition of a DTU is that it’s “a blended measure of CPU, memory, and data I/O and transaction log I/O…” None of the perfmon counters used by the DTU Calculator take memory into account, but it is clearly listed in the definition as being part of the calculation. This isn’t necessarily a problem, but it is evidence that the DTU Calculator isn’t going to be perfect.

I’ll upload some synthetic load into the DTU Calculator, and see if I can figure out how that black box works. In fact, I’ll fabricate the CSVs completely so that I can totally control the perfmon numbers that we load into the DTU Calculator. Let’s step through one metric at a time. For each metric, we’ll upload 25 minutes (1500 seconds–I like round numbers) worth of fabricated data, and see how that perfmon data is converted to DTUs.

Andy then goes on to show how the DTU Calculator estimates DTU usage given different resource patterns. It's a very interesting process, and Andy clarifies it considerably.

Benchmarking

Lukas Eder notes that the best way to compare performance is to…compare performance:

To bust a myth, if you have good reasons to think that a differently written, but semantically equivalent query might be faster (on your database), you should measure. Don’t even trust any execution plan, because ultimately, what really counts is the wall clock time in your production system.

If you can measure your queries in production, that’s perfect. But often, you cannot – but you don’t always have to. One way to compare two queries with each other is to benchmark them by executing each query hundreds or even thousands of times in a row.

Lukas goes on to compare a left join to a correlated subquery in three separate database products, and the results are very interesting, although absolutely not comparable across products because the DeWitt Clause is still a thing.  Great read.
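
To make the idea concrete, here is a minimal T-SQL sketch of that style of benchmark (not Lukas's harness; dbo.Customers and dbo.Orders are made-up tables): run each variant many times in a loop and compare wall-clock time.

DECLARE @i int, @runs int = 1000, @dummy int, @start datetime2;

-- Variant 1: LEFT JOIN, keeping customers with no orders
SET @start = SYSDATETIME();
SET @i = 0;
WHILE @i < @runs
BEGIN
    SELECT @dummy = COUNT(*)   -- assign to a variable so client I/O does not skew the timing
    FROM dbo.Customers AS c
    LEFT JOIN dbo.Orders AS o ON o.CustomerID = c.CustomerID
    WHERE o.OrderID IS NULL;
    SET @i += 1;
END;
SELECT DATEDIFF(MILLISECOND, @start, SYSDATETIME()) AS LeftJoinMs;

-- Variant 2: correlated subquery via NOT EXISTS
SET @start = SYSDATETIME();
SET @i = 0;
WHILE @i < @runs
BEGIN
    SELECT @dummy = COUNT(*)
    FROM dbo.Customers AS c
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Orders AS o WHERE o.CustomerID = c.CustomerID);
    SET @i += 1;
END;
SELECT DATEDIFF(MILLISECOND, @start, SYSDATETIME()) AS NotExistsMs;

Repeating the same statement a thousand times mostly measures warm-cache, cached-plan performance, which is fine for a relative comparison between the two variants on the same system.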

Unavailable Azure VM Sizes

Melissa Coates gives a few of the major reasons why a particular Azure VM size may not be available when you go to resize your VM:

Just a quick tip about why you might notice some sizes are not available when you are attempting to change the size/scale level of an Azure virtual machine in the portal.

I wanted to change one of my Development VMs to a DS12_v2, but that choice wasn’t available:

It didn’t immediately dawn on me why it wasn’t available, so I thought I’d try PowerShell:

Read on for the solution, as well as a few other common causes.

Your Memory Configuration Could Slow Down Your System

David Klee warns against just throwing memory into servers willy-nilly:

An example of this is if you wish to configure 384GB of RAM on a new server. The server has 24 memory slots. You could populate each of the memory slots with 16GB sticks of memory to get to the 384GB total. Or, you could spend a bit more money to buy 32GB sticks of memory and only fill up half of the memory slots. Your outcome is the same amount of RAM. Your price tag on the memory is slightly higher than the relatively cheaper smaller sticks.

In this configuration, your 16GB DIMM configuration runs the memory 22% slower than if you buy the higher density sticks. Check out page 63 of the server build guide for an HPE Proliant DL380 Gen9 server. The fully populated 16GB stick configuration runs the memory at 1866 MHz. If you only fill in the 32GB sticks on half the slots, the memory runs at 2400 MHz.

Very interesting information. The 22% figure falls straight out of the clock speeds: (2400 - 1866) / 2400 ≈ 22%.

Cleaning Up LOB

Raul Gonzalez reminds us that rebuilding a clustered index onto a new filegroup doesn’t move LOB data:

In previous posts I have explained how having dedicated filegroups for user data can improve our RTO by, in case of disaster, recovering critical data first and then the rest.

The thing is, when you deal with databases which were not created this way, you need to move the data from one filegroup to another before you can apply this kind of technique.

Here is where this post can show you one of the gotchas you can find during this process.

Read on for a demo of this.
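
To see the shape of the gotcha, here is a rough sketch (hypothetical object and filegroup names; Raul's post has the real demo). It assumes the database already has a second filegroup called SECONDARY with a data file in it:

-- LOB data follows TEXTIMAGE_ON, which defaults to the table's original filegroup
CREATE TABLE dbo.LobDemo
(
    Id   int          NOT NULL,
    Blob varchar(max) NULL
) ON [PRIMARY];

CREATE CLUSTERED INDEX CIX_LobDemo ON dbo.LobDemo (Id) ON [PRIMARY];

INSERT dbo.LobDemo (Id, Blob) VALUES (1, REPLICATE(CONVERT(varchar(max), 'x'), 10000));

-- Rebuild the clustered index onto the new filegroup; the in-row data moves with it
CREATE CLUSTERED INDEX CIX_LobDemo ON dbo.LobDemo (Id)
    WITH (DROP_EXISTING = ON)
    ON [SECONDARY];

-- ...but the LOB_DATA allocation unit is still sitting on the original filegroup
SELECT fg.name AS filegroup_name, au.type_desc
FROM sys.partitions AS p
JOIN sys.allocation_units AS au
    ON au.container_id = CASE WHEN au.type = 2 THEN p.partition_id ELSE p.hobt_id END
JOIN sys.filegroups AS fg
    ON fg.data_space_id = au.data_space_id
WHERE p.object_id = OBJECT_ID('dbo.LobDemo');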

Steps Recorder

Monica Rathbun shows how to use the Steps Recorder in Windows:

Windows PSR, the "Problem Steps Recorder," is a nifty tool that helps you troubleshoot a computer problem by recording, step by step, what the user is doing.

How to:

  • Go to Start

  • Windows Accessories

  • Steps Recorder or Problem Steps Recorder depending on Windows version

  • Select Start Record (the app will pop up)

That’s a new one on me; click through to check it out.

Converting To Local Time In M

Chris Webb shows how to convert a datetime from UTC to your local time zone using M:

Here’s a brief explanation of what the query does:

  • First it reads the times from the Excel table and sets the Time column to be datetime data type

  • It then creates a new column called UTC and then takes the values in the Time column and converts them to datetimezone values, using the DateTime.AddZone() function to add a time zone offset of 0 hours, making them UTC times

  • Finally it creates a column called Local and converts the UTC times to my PC’s local time zone using the DateTimeZone.ToLocal() function

There are some limitations to what it does, so you can't convert to just any time zone while still retaining Daylight Saving Time awareness.
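
If you need the same conversion on the database side rather than in M, a rough T-SQL analog (not Chris's M code; the sample value and zone name are just examples) is AT TIME ZONE, available in SQL Server 2016 and later, which does apply the DST rules for whichever named zone you pick:

DECLARE @t datetime2 = '2017-06-27 14:30:00';

SELECT
    @t AS OriginalTime,
    @t AT TIME ZONE 'UTC' AS UtcTime,   -- tag the naive value as UTC (returns datetimeoffset)
    @t AT TIME ZONE 'UTC' AT TIME ZONE 'Eastern Standard Time' AS LocalTime;  -- shift to a target zone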

Attribute Slicer

Devin Knight continues his Power BI custom visuals series:

In this module you will learn how to use the Attribute Slicer Power BI Custom Visual.  Using the Attribute Slicer you have the ability to filter your entire report while also being able to visibly see a measure value associated with each attribute.

Click through for the video as well as more details.  This looks like a very interesting way of integrating a slicer with some important metric, like maybe including dollar amounts per sales region and then filtering by specific regions to show more detailed analyses.
