
Day: May 30, 2017

Smart Transaction Log Backup Stats

Parikshit Savjani explains how you can use a new DMV to create smart transaction log backups:

In sys.dm_db_log_stats, you will find a new column log_since_last_log_backup_mb which can be used in your backup script to trigger a transaction log backup when the log generated since the last backup exceeds a threshold value. With smart transaction log backup, the transaction log backup size would be consistent and predictable, avoiding autogrows from transactional burst activity on the database. The resulting pattern from the transaction log backup would be similar to below.

The new sys.dm_db_log_stats DMV looks to be quite useful.
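
Based on that column, a minimal sketch of the check might look like the following (assuming SQL Server 2017 or later; the database name, 512 MB threshold, and backup path are placeholders):

-- Back up the log only when enough log has been generated since the last
-- log backup. Threshold, database name, and path are illustrative.
DECLARE @ThresholdMB float = 512;
DECLARE @LogSinceLastBackupMB float;

SELECT @LogSinceLastBackupMB = log_since_last_log_backup_mb
FROM sys.dm_db_log_stats(DB_ID(N'WideWorldImporters'));

IF @LogSinceLastBackupMB > @ThresholdMB
BEGIN
    BACKUP LOG [WideWorldImporters]
        TO DISK = N'X:\Backups\WideWorldImporters_log.trn';
END;

Scheduled every few minutes, a check like this keeps individual log backups close to the threshold size no matter how bursty the workload is.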

Where-Object Versus Where Method

Adam Bertram explains the difference between the Where-Object cmdlet and the Where method in PowerShell:

The Where-Object command is a sort of generic filtering command. Use this command to filter any kind of object in PowerShell. The Where-Object command has a FilterScript parameter, which is a scriptblock that allows the user to place code in it. If this scriptblock contains code that returns anything but $false, $null, or an empty string, it will allow whichever object the user passes to it.

For example, let’s say I’ve got a folder full of files. I’d like to see only text files and only those text files modified today. To make this happen, I can use the provider-specific filter with the Get-ChildItem command and also the Where-Object command.

Read on to see how that compares to the Where method. Given the latter’s limitations, I’ll probably stick to Where-Object any time performance is not critical.
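
For a sense of the syntax difference, here is an illustrative comparison (the folder path is a placeholder; the Where method requires PowerShell 4.0 or later):

# Today's text files, filtered two ways; the path is illustrative.
$files = Get-ChildItem -Path 'C:\Temp' -Filter '*.txt'

# Pipeline filtering with Where-Object
$files | Where-Object { $_.LastWriteTime -ge (Get-Date).Date }

# Collection filtering with the Where() method
$files.Where({ $_.LastWriteTime -ge (Get-Date).Date })

The method avoids pipeline overhead, which is where its speed advantage comes from, but it only works against a collection that is already in memory.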

Versioning R Code In SQL Server

Steph Locke shows how to combine R models and SQL Server temporal tables for versioning:

If we’re storing our R model objects in SQL Server then we can utilise another SQL Server capability, temporal tables, to take the pain out of versioning and make it super simple.

Temporal tables will track changes automatically so you would overwrite the previous model with the new one and it would keep a copy of the old one automagically in a history table. You get to always use the latest version via the main table but you can then write temporal queries to extract any version of the model that’s ever been implemented. Super neat!

I do exactly this.  In my case, it’s to give me the ability to review those models after the fact once I know whether they generated good outcomes or not.
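
If you want to try the temporal approach, a minimal sketch looks something like this (table, column, and model names are illustrative rather than Steph’s actual schema):

-- System-versioned table for serialized R model objects.
CREATE TABLE dbo.Models
(
    ModelName   nvarchar(200)  NOT NULL PRIMARY KEY,
    ModelObject varbinary(max) NOT NULL,  -- serialized R model
    ValidFrom   datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo     datetime2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.ModelsHistory));

-- Overwriting a model is just an UPDATE; the old version lands in
-- dbo.ModelsHistory automatically. The latest model is a plain SELECT,
-- and any historical version is a temporal query:
SELECT ModelObject
FROM dbo.Models FOR SYSTEM_TIME AS OF '2017-01-01T00:00:00'
WHERE ModelName = N'churn-model';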

NULL Values In The Histogram

Taiob Ali explains how NULL values show up in the SQL Server histogram when you create statistics:

In the density_vector section, the ‘All density’ value for column ‘PickingCompletedWhen’ is 0.0004705882, which was calculated as 1/(number of distinct values of column ‘PickingCompletedWhen’), in this case 1/2125. All NULL values were considered as one. If you do a count of distinct values, you will get a result of 2124; the reason is explained here. If you do a select of all distinct values, NULL will show along with the other 2124 values.

Taiob explains that there’s really nothing special about NULL when it comes to statistics.
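
The arithmetic is easy to check yourself (this assumes the WideWorldImporters sample database, which is where PickingCompletedWhen comes from; the statistics name below is a placeholder for whichever statistics object covers the column):

-- COUNT(DISTINCT ...) ignores NULL, so this returns 2124.
SELECT COUNT(DISTINCT PickingCompletedWhen) AS DistinctNonNullValues
FROM Sales.Orders;

-- The density vector counts NULL as one more distinct value, so
-- 'All density' = 1/2125 = 0.0004705882.
DBCC SHOW_STATISTICS ('Sales.Orders', 'IX_Sales_Orders_PickingCompletedWhen')
    WITH DENSITY_VECTOR;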

Extended Event Handler Application

Dave Mason whipped up an application to track when a particular extended event fires:

Event Responses: when an event is handled, specify how you want to respond to it. In the “Event Responses” section, there are two options: play a system sound or send email (you must choose at least one option). Pick the system sound desired and/or enter an email address.

Dave has made the source code available as well so you can extend the functionality.

Re-Running PowerShell History

Claudio Silva shows how to re-run statements in your PowerShell history:

While we were running some commands, we talked about the Get-History cmdlet.

For those who don’t know, this cmdlet lists all the commands that you have already run in the current session.

I wondered if it was possible to pipe the Get-History output and run it again. I could bet yes, but I had never tried it before.

The answer is that yes, you can, and Claudio shows how.
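
One way that can look in practice (the cmdlets are standard PowerShell; the filter pattern is just an example):

# List what has run in this session
Get-History

# Re-run a single entry by its Id
Invoke-History -Id 5

# Re-run a filtered set by piping history back through Invoke-Expression
Get-History |
    Where-Object { $_.CommandLine -like 'Get-Service*' } |
    ForEach-Object { Invoke-Expression $_.CommandLine }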

Availability Group Backup Preferences

Shaun Stuart points out that the Backup Preferences tab in the Availability Group Properties dialog is a little tricky:

The default, and the way my AG was configured, was Prefer Secondary. As the image shows, this means backups will be made on the secondary, unless the secondary is unavailable, in which case, they will be made on the primary.

There are a couple of things to note when you use this setting:

  1. Full backups made on the secondary are Copy Only backups. This means they won’t reset the differential bitmap and your differentials will continue to increase in size until a full backup is made on the primary.

  2. Differential backups cannot be made on the secondary.

  3. Transaction log backups can be made on the secondary and they do clear the log, so your log file will not continue to grow.

Read on for more details.
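
Also worth remembering: the preference is only advisory. Backup jobs have to check it themselves, typically with sys.fn_hadr_backup_is_preferred_replica, along the lines of this sketch (database name and backup path are placeholders):

-- Take the backup only if this replica is the preferred one for it.
IF sys.fn_hadr_backup_is_preferred_replica(N'WideWorldImporters') = 1
BEGIN
    BACKUP LOG [WideWorldImporters]
        TO DISK = N'X:\Backups\WideWorldImporters_log.trn';
END;
-- Otherwise do nothing; a different replica will take this backup.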

Columnstore Performance Counters

Niko Neugebauer talks about perfmon counters available for understanding what’s going on with columnstore indexes:

As mentioned right at the beginning of this article, Sunil Agarwal led the development team in adding greatly valuable performance counters with their own object, ‘MSSQL:Columnstore’, which provides some incredible insight into internal operations that are not exposed in other ways.
This was a very much needed step, because SQL Server 2014 brought a lot of different performance counters and objects for In-Memory OLTP (XTP), while Columnstore Indexes were deserving a good treatment of their own.

This gives you one more avenue for research if you’re experiencing columnstore-related issues.
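
If Perfmon is not handy, the same counters can be read from T-SQL (the object name varies by instance, e.g. MSSQL$INSTANCENAME:Columnstore on a named instance, hence the wildcard):

-- Columnstore performance counters exposed through the counters DMV.
SELECT [object_name], counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE N'%:Columnstore%'
ORDER BY counter_name;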
