Category: Administration

Proposed Max Server Memory Defaults

Randolph West has a proposal for default max server memory on a SQL Server instance:

As noted in the previous post in this series, memory in SQL Server is generally divided between query plans in the plan cache, and data in the buffer pool (other uses for memory in SQL Server are listed later in this post).

The official documentation tells us:
[T]he default setting for max server memory is 2,147,483,647 megabytes (MB).

Look carefully at that number. It’s 2 billion megabytes. In other words, we might think of it as 2 million gigabytes, 2,048 terabytes, or 2 petabytes.

Randolph is writing this like we don’t all have multiple petabytes of RAM on each machine.
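
If you want to check or adjust the setting yourself, sp_configure handles it. A minimal sketch; the 28672MB figure is an illustration for a 32GB box, not Randolph’s proposed default:

EXEC sys.sp_configure N'show advanced options', 1;
RECONFIGURE;

-- Report the current value.
EXEC sys.sp_configure N'max server memory (MB)';

-- Set it; 28672 MB is an example value, not a recommendation.
EXEC sys.sp_configure N'max server memory (MB)', 28672;
RECONFIGURE;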

Power BI Connection String Capitalization

Slava Murygin ran into a problem with capitalization in connection strings on Power BI Report Server:

The message actually says:
“Several errors occurred during data refresh. Please try again later or contact your administrator.”
SessionID: 1b80301e-3898-417a-af9c-2e77ec490728
[0] -1055784932: Credentials are required to connect to the SQL source. (Source at SQLServerName;DBA_Pro.). The exception was raised by the IDbCommand interface.
[1] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[2] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.

In my case, the cause of the problem was a very silly thing. Power BI Report Server assigned only one data source connection string to my report, while the report itself had two data sources that differed only in database name capitalization.

This was a weird scenario.

Automating Log File Expansion

Max Vernon shows how you can automatically expand log files to optimize VLF counts:

SQL Server Database Log file expansion can be fairly tedious if you need to make the log bigger in many reasonably-sized growth increments. It can be tedious because you may need to write and execute a large number of ALTER DATABASE ... MODIFY FILE ... commands.

The following code automatically grows a SQL Server database log file, using the size and growth increments you configure in the script. If you set the @DebugOnly flag to 1, the script will only print the commands required, instead of executing them. This allows you to see exactly what will be executed ahead of time. Alternatively, you could copy and paste the commands into a query window and execute them one by one.

Click through for that code.
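
In the meantime, here is a minimal sketch of the idea (mine, not Max’s script); the database and file names are placeholders:

DECLARE @TargetSizeMB int = 8192;  -- desired final log size
DECLARE @GrowthMB int = 1024;      -- growth increment
DECLARE @DebugOnly bit = 1;        -- 1 = print the commands instead of running them
DECLARE @CurrentMB int;
DECLARE @sql nvarchar(max);

-- sys.master_files reports size in 8KB pages, so divide by 128 for MB.
SELECT @CurrentMB = size / 128
FROM sys.master_files
WHERE database_id = DB_ID(N'YourDatabase')
  AND type_desc = 'LOG';

WHILE @CurrentMB < @TargetSizeMB
BEGIN
    SET @CurrentMB = CASE WHEN @CurrentMB + @GrowthMB > @TargetSizeMB
                          THEN @TargetSizeMB
                          ELSE @CurrentMB + @GrowthMB END;

    SET @sql = N'ALTER DATABASE [YourDatabase] MODIFY FILE '
             + N'(NAME = N''YourDatabase_log'', SIZE = '
             + CAST(@CurrentMB AS nvarchar(10)) + N'MB);';

    IF @DebugOnly = 1
        PRINT @sql;
    ELSE
        EXEC sys.sp_executesql @sql;
END;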

Dropping Tables in Bulk

Jeff Mlakar talks about a topic I like—dropping lots and lots of stuff:

Let’s assume that you have lots of tables that need to be dropped according to some criteria. Trying to do them all at once isn’t a good idea. Even with a powerful server it will either take forever or simply never finish.

For example – you may have millions of tables in sys.tables or millions of indexes you need to drop. SQL Server won’t process them well if you try to run it as one big statement.

I’ve never had millions of tables or millions of indexes to drop, and now I am jealous. Regardless, Jeff has two techniques for us when there’s a lot of work to do. And if you do need to figure out key dependencies, I have a script for that.
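
As a flavor of the batching approach (a sketch of mine, not Jeff’s exact code; the naming filter is a placeholder), something like this drops matching tables a small batch at a time:

DECLARE @BatchSize int = 100;
DECLARE @sql nvarchar(max);

WHILE 1 = 1
BEGIN
    SET @sql = N'';

    -- Build DROP TABLE statements for the next batch of matching tables.
    SELECT TOP (@BatchSize)
           @sql = @sql + N'DROP TABLE ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N';'
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE t.name LIKE N'staging[_]%';  -- example criterion

    IF @sql = N'' BREAK;  -- nothing left to drop

    EXEC sys.sp_executesql @sql;
END;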

Enabling Large Memory Pages in SQL Server

David Klee talks us through large memory pages:

SQL Server Enterprise Edition can leverage large memory pages to reduce the amount of memory pointers required for larger SQL Server deployments. Reducing the number of pointers makes the database engine more efficient, especially for SQL Servers with greater than 32GB of RAM. A normal memory block is 4KB, and many thousands of pointers are required to manage the memory underneath a larger SQL Server. Large memory pages can change the block size to 2MB, greatly reducing the number of pointers required for memory management.

Read on to see what effect this has, as well as when to use them and—more importantly—when not to use them.
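
For context: large pages require Enterprise Edition, the Lock Pages in Memory privilege for the service account, and trace flag 834 as a startup parameter. A quick way (my addition, not from David’s post) to confirm whether an instance is actually using them:

-- Nonzero indicates the instance has allocated large pages.
SELECT large_page_allocations_kb
FROM sys.dm_os_process_memory;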

A Primer on RoboCopy

John Morehouse takes us through a venerable file copying tool for Windows:

Robocopy has been around for years within the Microsoft ecosystem and it is highly versatile.  However, until recently, it wasn’t a tool that I was versed in and, frankly, hardly used.  Over the past year or so, however, I have found myself using it more and more as a solution for file movement when needed.

Essentially, robocopy will move files or even directories from one location to another.  It will copy permissions without any additional coding, and it will only copy the files that do not exist within the destination.  This is useful in that you do not have to explicitly account for files that might already exist at the destination.  It handles it for you.

Read on to see more, as well as a demo of RoboCopy in action.
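
For a taste of the syntax, an invocation along the lines the post describes might look like this (paths are placeholders; /E recurses into subdirectories, /SEC carries the security descriptors, and files already identical at the destination are skipped by default):

robocopy "C:\Source" "\\FileServer\Dest" /E /SEC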

Considerations Before Failing Over an AG

David Fowler points out a few things to look at before failing over an Availability Group:

What about your SQL Agent jobs? Have you got jobs that perform actions on your data? If you have, do those jobs exist on the new primary? If they don’t then I’m happy to bet that whatever function that they were playing probably isn’t happening anymore.

One thing that I always want to make sure before I failover is, do I have all the relevant jobs ready to roll on the secondary server?

But what about the jobs that you’ve got on the old primary? There’s a fair chance that, if they’re doing any sort of data manipulation, they’re going to be failing.

Click through for additional considerations.
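
One low-tech way to eyeball job parity (my sketch, not from David’s post) is to run the same query against each replica and compare the output:

-- Run on both the primary and each secondary, then diff the lists.
SELECT name, enabled
FROM msdb.dbo.sysjobs
ORDER BY name;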

Why Optimize for Ad Hoc Workloads

Randolph West explains why “optimize for ad hoc workloads” should be enabled by default:

Enabling the optimize for ad hoc workloads configuration setting will reduce the amount of memory used by all query plans the first time they are executed. Instead of storing the full plan, a stub is stored in the plan cache. Once that plan executes again, only then is the full plan stored in memory. What this means is that there is a small overhead for all plans that are run more than once, on the second execution.

Read the whole argument. I don’t know that I’ve seen an instance yet where this setting was a really bad choice.
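
Turning it on is a one-time, server-level change:

EXEC sys.sp_configure N'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure N'optimize for ad hoc workloads', 1;
RECONFIGURE;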

Orphaned Users in SQL Server

Dave Bland walks us through one way to fix an orphaned user:

In my many years of working as a DBA, I have encountered many disabled logins.  However, I have never really encountered what looks to be a disabled database user account.  I didn’t even think it was possible to disable a user account in a SQL Server database.  I checked the user account properties just to make sure I was correct.  Sure enough, no option to disable a user account. This finally turned out to be a simple case of looks can be deceiving.

You can also use the sp_change_users_login procedure to fix orphaned users.
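
Note that sp_change_users_login is deprecated; on modern versions, ALTER USER ... WITH LOGIN does the same remapping. The user and login names below are placeholders:

-- List orphaned users in the current database (deprecated procedure).
EXEC sp_change_users_login 'Report';

-- Remap a user to the login with the matching name (deprecated).
EXEC sp_change_users_login 'Auto_Fix', 'SomeUser';

-- Preferred on modern versions:
ALTER USER [SomeUser] WITH LOGIN = [SomeLogin];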
