Author: Kevin Feasel

Logging Database Scoped Configuration Changes

Jonathan Kehayias wants to know what changes you’ve made:

The introduction of DATABASE SCOPED CONFIGURATIONS in SQL Server 2016 enabled different configuration settings at the individual database level. However, there is no logging of changes to the database scoped settings by default in SQL Server, making it nearly impossible to track down when a change was made and by whom. After recently working on a client problem where performance issues were attributed multiple times to a DATABASE SCOPED CONFIGURATION of MAXDOP = 1, I decided to create a DDL trigger for the ALTER_DATABASE_SCOPED_CONFIGURATION events in SQL Server to log the change to the ERRORLOG file, similar to the one I wrote years ago for logging Extended Event session changes.

Click through for the definition of that trigger.
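For a sense of what such a trigger can look like, here is a minimal sketch built around the ALTER_DATABASE_SCOPED_CONFIGURATION event named in the quote; the trigger name, message format, and severity are illustrative choices of mine, and Jonathan's actual definition is in the linked post.

-- A sketch of a database-level DDL trigger that writes the change to the ERRORLOG.
-- Trigger name and message text are hypothetical; see the linked post for the real definition.
CREATE TRIGGER LogDatabaseScopedConfigurationChange
ON DATABASE
FOR ALTER_DATABASE_SCOPED_CONFIGURATION
AS
BEGIN
    DECLARE @EventData xml = EVENTDATA();
    DECLARE @msg nvarchar(2048) = CONCAT(
        N'Database scoped configuration changed by ',
        @EventData.value('(/EVENT_INSTANCE/LoginName)[1]', 'nvarchar(128)'),
        N': ',
        @EventData.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(1024)'));

    -- Severity 10 is informational; WITH LOG sends the message to the SQL Server ERRORLOG.
    RAISERROR(@msg, 10, 1) WITH LOG;
END;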

Dynamic Data Masking and Granular Unmasking

Dennes Torres points out a change to dynamic data masking in Azure SQL DB:

Dynamic data masking is a very interesting security feature allowing us to mask critical fields such as e-mail, phone number, credit card, and so on. We can decide which users will be able to see the values of these fields and which will not.

This feature had many flaws when it was released, but I believe it's stable now. Although it's not the main security feature you should care about, it can still be very useful.

However, until very recently, this feature was not very useful. If you mask many fields in many different tables, the fields may require different permission levels in order to be unmasked.

I agree that this is definitely not a security feature. But hey, at least it’s a bit more useful than it was before.
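The granular piece Dennes is describing is the ability to grant UNMASK at the schema, table, or column level instead of database-wide. As a minimal sketch, with a hypothetical table, column, and user:

-- Mask a column (hypothetical dbo.Customers table) with the built-in email() function.
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- Previously, UNMASK could only be granted for the whole database:
-- GRANT UNMASK TO ReportingUser;

-- With granular unmasking in Azure SQL DB, the permission can be scoped down,
-- here to a single column:
GRANT UNMASK ON dbo.Customers(Email) TO ReportingUser;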

Storytelling with Data Book Review

Camila Henrique has a book review for us:

Hello! As you may have noticed from my Reading List page here, I like to read. Recently, with the new job, I was looking for a book that talked about Data Visualization. While searching, I came across “Storytelling with Data”, and it was not the first time I saw it. After checking a few reviews, I decided to invest my time reading it. Turns out it was a great decision! I liked it so much that I wanted to talk about it here, so here it comes, grab your reading glasses.

This has been on my backlog of books to review, and I agree with Camila that it’s absolutely worth grabbing a copy.

BETWEEN and Overlaps

Chad Callihan reminds us that BETWEEN is inclusive of both sides:

Thanks to Robert for his comment on the last post that then spawned this post. In the example about sargable dates, I thought I would go with the simpler look and only use dates instead of adding the times. The point is to look at sargability, right? Well, here's an example of why you don't mix and match dates and datetimes.

Click through for the demonstration.
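To make the pitfall concrete, here is a small sketch with a hypothetical table: because BETWEEN is inclusive and a date literal converts to midnight, any datetime later in the day on the end date falls outside the range.

-- Hypothetical table variable with a datetime column.
DECLARE @Orders TABLE (OrderDate datetime);
INSERT INTO @Orders VALUES ('2021-11-30 00:00:00'), ('2021-11-30 15:30:00');

-- '2021-11-30' implicitly converts to 2021-11-30 00:00:00,
-- so only the midnight row is counted here.
SELECT COUNT(*) AS RowsFound
FROM @Orders
WHERE OrderDate BETWEEN '2021-11-01' AND '2021-11-30';

-- A half-open range stays sargable and covers the whole final day.
SELECT COUNT(*) AS RowsFound
FROM @Orders
WHERE OrderDate >= '2021-11-01' AND OrderDate < '2021-12-01';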

MMLSpark Is Now SynapseML

Mark Hamilton has an announcement:

Today, we’re excited to announce the release of SynapseML (previously MMLSpark), an open-source library that simplifies the creation of massively scalable machine learning (ML) pipelines. Building production-ready distributed ML pipelines can be difficult, even for the most seasoned developer. Composing tools from different ecosystems often requires considerable “glue” code, and many frameworks aren’t designed with thousand-machine elastic clusters in mind. SynapseML resolves this challenge by unifying several existing ML frameworks and new Microsoft algorithms in a single, scalable API that’s usable across Python, R, Scala, and Java.

Read on to learn more about the library.

Clear out Those Old Container Images

Joy George Kunjikkur has a public service announcement for us:

When we use self-hosted Azure pipeline agents, we may encounter the below issue during the build process. This is not a hard issue to troubleshoot. The reason is there in the error message.

Error processing tar file(exit status 1): open /root/.local/share/NuGet/v3-cache/670c1461c29885f9aa22c281d8b7da90845b38e4$ps:_api.nuget.org_v3_index.json/nupkg_system.reflection.metadata.1.4.2.dat: no space left on device

This is known in the industry as a whoopsie-doops. Click through to see what you can do to resolve the problem.

Documenting Power BI Dataset Measures

Gilbert Quevauvilliers thinks about documentation:

One thing that often happens when users are using a dataset is that they want to know which measures are available. And not only that, sometimes they want to know the measure definitions.

This got me thinking about how best I could give the users in my organization a way to find this information quickly and easily.

In the past this was a manual effort, not only to export the measures but also to maintain a document: as measures were added, updated, or deleted, I would then need to manually update that document.

Yep, you guessed it: I created a Power BI report which has all the measures and their definitions, and which will update with the dataset! I show you how I did this below.

Click through to see how.
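For reference, one way to pull measure names and definitions out of a Tabular dataset is a DMV query over the XMLA endpoint (for example from SSMS or DAX Studio); this is a sketch of that general idea and may well differ from the approach Gilbert takes in the post.

-- A sketch: list measures and their DAX definitions from the TMSCHEMA_MEASURES DMV.
-- Run against the dataset's XMLA endpoint; Gilbert's approach may differ.
SELECT
    [Name],
    [Expression],
    [Description]
FROM $SYSTEM.TMSCHEMA_MEASURES;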

Saying No to NOLOCK

Brent Ozar just says no:

When you put NOLOCK in your query, SQL Server will:

– Read rows twice

– Skip rows altogether

– Show you changes that never actually got committed

– Let your query fail with an error

This is not a bug. This is by design. 

There are reasons why you might want to use NOLOCK, but start with no and you’ll be in better shape.

Also, remember that NOLOCK really means “No, lock!”
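For reference, NOLOCK is the table-hint form of READ UNCOMMITTED. A minimal sketch, with a hypothetical table, of the hint itself and of one commonly suggested alternative, read committed snapshot isolation:

-- The hint in question (dbo.Payments is a hypothetical table); WITH (NOLOCK)
-- reads at READ UNCOMMITTED, which permits the behaviors listed above.
SELECT SUM(Amount) AS TotalPaid
FROM dbo.Payments WITH (NOLOCK);

-- One commonly suggested alternative: read committed snapshot isolation,
-- where readers use row versions instead of blocking on (or dirty-reading) writers.
-- Switching it on needs a quiet moment with no other open connections in the database.
ALTER DATABASE CURRENT SET READ_COMMITTED_SNAPSHOT ON;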

Managing File Retention in Blob Storage

Jeet Kainth shows how to configure a retention strategy in Azure Blob Storage:

This blog is a follow-up to a previous blog I wrote about backing up Azure Analysis Services cubes in Azure; that blog can be found here. This blog shows how to implement a retention policy using PowerShell in Azure Runbooks to remove the backups after a set number of days. To create a new Runbook in the Azure portal, go to the relevant Automation account in the relevant resource group and then select Runbooks from the left-hand pane. Note that you will need to add the Az.Storage module to the automation account to be able to use some of the commands listed in this blog.

Click through for the process, including Powershell code to perform the task.
