Press "Enter" to skip to content

Category: Administration

Rebuilding a Transaction Log

David Fowler fixes a large-scale oopsie:

“Could you help me, we deleted the database’s transaction log file and now that database is stuck in ‘Recovery Pending’?”

This was a panicked call that I received a few weeks ago.

“Sure, no problem” said I, “we’ll have to restore back to your last backup”

And then things went silent for a while before the inevitable, “it’s only a development database, we don’t take backups”.

I can feel the face-palm from here. Read on to learn what you can do in that situation, as well as David’s important note about taking backups so that you don’t end up there in the first place.
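
If you do find yourself in that spot, the usual (documented, but last-resort) route is an emergency-mode repair, which rebuilds the missing log at the cost of any transactions that were in flight. This is a minimal sketch with a placeholder database name, not necessarily the exact steps in David’s post, so read his write-up before running anything like it:

    -- [Sandbox] is a placeholder database name.
    -- REPAIR_ALLOW_DATA_LOSS can discard in-flight transactions, so this is a last resort.
    ALTER DATABASE [Sandbox] SET EMERGENCY;
    ALTER DATABASE [Sandbox] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

    -- With the database in emergency mode and the log file missing,
    -- this repair rebuilds the transaction log.
    DBCC CHECKDB (N'Sandbox', REPAIR_ALLOW_DATA_LOSS) WITH NO_INFOMSGS;

    ALTER DATABASE [Sandbox] SET MULTI_USER;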

Cloud Governance via Azure Policy

Alexey Nazarov draws a line:

Azure Policy is a service that allows you to create, assign, and manage policies that govern your Azure resources. Policies are rules that define the desired state and configuration of your resources, such as the location, size, tags, and properties. Policies can also audit the compliance status of your resources and report any violations.

With Azure Policy, you can ensure that your resources follow the best practices and standards that you define for your organization. You can also use Azure Policy to implement cost management, security, and regulatory compliance for your cloud environment.

Click through to learn more about Azure Policy. My limited experience with it is that the idea is sound, though there are some limitations in what you can do that can make things annoying.

Automated Patching in SQL Server on Azure

Abdullah Mamun closes a door and opens a window:

The automated patching feature for SQL Server on Azure VM will be retired on September 17th, 2027.

While Automated Patching is being retired, we’re excited to replace it with Azure Update Manager. Azure Update Manager is a powerful, enterprise-class tool that provides:

  1. Centralized Update Management: Azure Update Manager provides a unified dashboard where you can view and manage updates across your entire environment, including virtual machines, on-premises servers, and even hybrid scenarios.
  2. Customizable Schedules: With Azure Update Manager, you can create custom update schedules based on your organization’s needs. Whether you prefer weekly, monthly, or specific date-based updates, Azure Update Manager has you covered.
  3. Patch Compliance Reports: Azure Update Manager generates detailed reports on patch compliance, helping you stay informed about the status of updates across your infrastructure.

Click through to learn more about the transition and next steps.

Checking Cumulative Update Status on SQL Server Instances

Steve Jones reminds you to check those cumulative updates:

How can I quickly get a CU patch for a system that’s out of date? I’ll discuss that situation.

You might think you get to patch every instance every few months, and you may be able to. But most of us have laggards in any decent-sized estate. Someone always wants to avoid patching, or skip patching on the day you’ve scheduled every other system.

Steve’s solution involves using Redgate Monitor, though you could also do this on your own or using something like dbatools to get information about your estate.
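
For the do-it-yourself route, a handful of SERVERPROPERTY() calls will tell you what build a given instance is on; a quick sketch:

    -- Current build and patch level for the connected instance.
    SELECT
        SERVERPROPERTY('ProductVersion')     AS ProductVersion,      -- full build number
        SERVERPROPERTY('ProductLevel')       AS ProductLevel,        -- RTM or service pack level
        SERVERPROPERTY('ProductUpdateLevel') AS ProductUpdateLevel,  -- cumulative update, e.g. CU12
        SERVERPROPERTY('Edition')            AS Edition;

Run that across the estate (via a Central Management Server, dbatools, or your monitoring tool of choice) and compare against the current CU list to find the laggards.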

External References in Data-Tier Applications

Andy Brownsword needs to make a call out:

One method for transferring a database to a different environment is using a Data-Tier Application – in the form of a DACPAC (for schema) or BACPAC (for schema and data).

Trying to use this approach with multi-database solutions is a challenge though as Data-Tier Applications don’t play nicely with cross-database objects.

Let’s look at how we can ease that pain.

Read on for the solution.
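
As a general aside, and not necessarily the approach Andy lands on in the post, synonyms are one common way to keep three-part names out of your views and procedures, so the cross-database dependency lives in a single, easily repointed object. A hypothetical sketch:

    -- Hypothetical: objects in this database reference dbo.Customers in SourceDB.
    -- The synonym is the only place the three-part name appears,
    -- so only the synonym needs to change between environments.
    CREATE SYNONYM dbo.Customers_Remote FOR SourceDB.dbo.Customers;
    GO

    -- Views and procedures reference the synonym rather than SourceDB.dbo.Customers.
    SELECT TOP (10) *
    FROM dbo.Customers_Remote;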

A Reminder for Server Consistency

Chad Callihan resolves an issue:

I connected to the latest SQL Server, opened SSMS, and tried to restore from there. Sure enough, I was presented with the error:

Cannot access the specified path or file on the server. Verify that you have the necessary security privileges and that the path or file exists.

If you know that the service account can access a specific file, type in the full path for the file in the File Name control in the Locate dialog box.

Read on for the solution, which was easy enough, but serves as a reminder that having (and occasionally running!) idempotent configuration scripts can be quite useful.
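
If you want to check this sort of thing up front, a couple of quick queries show which account the SQL Server service runs under and where the instance’s default backup directory points. A sketch, using the registry-read approach for the backup path:

    -- Service accounts for the SQL Server services on this instance.
    SELECT servicename, service_account, status_desc
    FROM sys.dm_server_services;

    -- Default backup directory, read from the instance's registry keys.
    DECLARE @BackupDirectory nvarchar(4000);
    EXEC master.dbo.xp_instance_regread
        N'HKEY_LOCAL_MACHINE',
        N'Software\Microsoft\MSSQLServer\MSSQLServer',
        N'BackupDirectory',
        @BackupDirectory OUTPUT;
    SELECT @BackupDirectory AS DefaultBackupDirectory;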

Finding Basic Table Information via T-SQL

Andy Brownsword has a script for us:

In Management Studio we can view object details by hitting F7 in Object Explorer. It gives us basic metrics but I find it very slow to load for the details I typically need.

For that reason I thought I’d share a script to turn to for metrics I commonly need. This query returns:

  • The table details (schema, name, created date)
  • The primary storage (Heap, Clustered, or Columnstore)
  • The number of Nonclustered / Columnstore Indexes
  • The number of records and rough size for data / indexes

Click through for the script and an example of what it looks like.
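
Andy’s script is worth the click. As a rough illustration of the kind of catalog-view query involved (my own sketch, not Andy’s script), something like the following pulls similar details:

    -- Schema, name, created date, primary storage type, nonclustered index count,
    -- row count, and approximate reserved size per table.
    SELECT
        s.name        AS SchemaName,
        t.name        AS TableName,
        t.create_date AS CreatedDate,
        CASE
            WHEN EXISTS (SELECT 1 FROM sys.indexes ix
                         WHERE ix.object_id = t.object_id AND ix.type = 5)
                THEN 'Clustered Columnstore'
            WHEN EXISTS (SELECT 1 FROM sys.indexes ix
                         WHERE ix.object_id = t.object_id AND ix.type = 1)
                THEN 'Clustered'
            ELSE 'Heap'
        END           AS PrimaryStorage,
        (SELECT COUNT(*) FROM sys.indexes ix
         WHERE ix.object_id = t.object_id AND ix.type IN (2, 6)) AS NonclusteredIndexes,
        rc.TotalRows,
        sz.TotalSizeMB
    FROM sys.tables AS t
    JOIN sys.schemas AS s
        ON s.schema_id = t.schema_id
    CROSS APPLY (
        -- Rows come from the heap or clustered index partitions only.
        SELECT SUM(p.rows) AS TotalRows
        FROM sys.partitions AS p
        WHERE p.object_id = t.object_id
          AND p.index_id IN (0, 1)
    ) AS rc
    CROSS APPLY (
        -- Reserved pages across all indexes and allocation units, converted to MB.
        SELECT CAST(SUM(a.total_pages) * 8 / 1024.0 AS decimal(18, 2)) AS TotalSizeMB
        FROM sys.partitions AS p
        JOIN sys.allocation_units AS a
            ON a.container_id = p.partition_id
        WHERE p.object_id = t.object_id
    ) AS sz
    ORDER BY sz.TotalSizeMB DESC;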

Data Compression and CPU Utilization

Kendra Little shares some advice:

Every time I share a recommendation to use data compression in SQL Server to reduce physical IO and keep frequently accessed data pages in memory, I hear the same concern from multiple people: won’t this increase CPU usage for inserts, updates, and deletes?

DBAs have been trained to ask this question by many trainings and a lot of online content – I used to mention this as a tradeoff to think about, myself – but I’ve found this is simply the wrong question to ask.

In this post I’ll share the two questions that are valuable to ask for your workload.

Kendra’s advice is very good, and to add my own two cents to the mix: the last place I was at did, in fact, see a pretty reasonable reduction in CPU utilization by applying page-level compression to any index where it made sense, and this was a very busy OLTP environment. The exceptions would be indexes making prominent use of things like GUIDs or chunks of binary, which don’t compress very well at all. In all my FTE and consulting years, I’ve never run into a circumstance in which compression caused a significant increase in CPU utilization.
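
If you want to test the waters on a specific table, sp_estimate_data_compression_savings gives a rough before-and-after estimate, and applying compression is a single rebuild. A quick sketch with hypothetical object names:

    -- Estimate the savings before committing to a rebuild
    -- (dbo.OrderLines and IX_OrderLines_OrderDate are hypothetical names).
    EXEC sys.sp_estimate_data_compression_savings
        @schema_name      = N'dbo',
        @object_name      = N'OrderLines',
        @index_id         = NULL,   -- all indexes on the table
        @partition_number = NULL,   -- all partitions
        @data_compression = N'PAGE';

    -- Apply page compression to one index.
    -- ONLINE = ON requires Enterprise Edition or Azure SQL; drop it otherwise.
    ALTER INDEX IX_OrderLines_OrderDate
        ON dbo.OrderLines
        REBUILD WITH (DATA_COMPRESSION = PAGE, ONLINE = ON);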
