
Category: Administration

SQL Server In Containers

Andrew Pruski shows how to install Docker on Windows Server 2016 and pull down a SQL Server Express container:

But what about connecting remotely? This isn’t going to be much use if we can’t remotely connect!

Actually connecting remotely is the same as connecting to a named instance. You just use the server’s IP address (not the container’s private IP) and the non-default port that we specified when creating the container (remember to allow access to the port in the firewall).
Easy, eh?

Containers are great, though I do have trouble wrapping my head around containerized databases and have struggled to get containerized Hadoop to work the way I want.
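If you want to try it, here is a minimal sketch of the moving parts, assuming Windows containers and the Windows-based SQL Server Express image; the image name, port number, password, and IP address are placeholders for whatever you actually use:

# Run a SQL Server Express container, mapping host port 15789 to the
# container's port 1433 (image name, port, and sa password are illustrative).
docker run -d -p 15789:1433 `
    -e 'sa_password=Str0ngP@ssw0rd1' `
    -e 'ACCEPT_EULA=Y' `
    --name sqlexpress01 `
    microsoft/mssql-server-windows-express

# Open the mapped port in the host firewall so remote clients can get in.
New-NetFirewallRule -DisplayName 'SQL container 15789' -Direction Inbound `
    -Protocol TCP -LocalPort 15789 -Action Allow

# From a remote machine, connect to the HOST's IP plus the mapped port,
# not the container's private IP.
sqlcmd -S '192.168.1.50,15789' -U sa -P 'Str0ngP@ssw0rd1'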


Get-DbaTcpPort

Steve Jones looks at one PowerShell function inside dbatools:

I like using PoSh for some tasks, especially when I don’t have an easy way to do something in SSMS or want to run a task across a variety of instances. In this case, as I glanced through the September updates, I found a good one.

Get-DbaTcpPort

I don’t love the mixed naming, but I’ll get used to it, and I do love the autocomplete in PoSh.

Steve has lots of screenshots walking you through this function.
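If you want to try it yourself, a quick sketch (instance names are placeholders, and note that older dbatools builds used -SqlServer where current ones use -SqlInstance):

# Requires the dbatools module: Install-Module dbatools
Import-Module dbatools

# TCP port for a single named instance
Get-DbaTcpPort -SqlInstance 'SQL01\INST1'

# Sweep several instances in one go
$instances = 'SQL01', 'SQL02\INST1', 'SQL03'
Get-DbaTcpPort -SqlInstance $instances | Format-Table -AutoSize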


Run And RunOnce Registry Key Limits

Denny Cherry runs into a limit on the length of the Run and RunOnce registry values:

Microsoft has had the registry keys for Run and RunOnce in the registry since the registry was introduced in Windows 95 and Windows NT 4.  But in the 20+ years that those keys have been there (and I’ve used them for a variety of things) I’ve never known that there was a limit on the length of the commands that you could put into those keys.

I found this while working on a client project when I needed to kick off some PowerShell automatically when the server restarted to get it added to the domain, as well as do a few other things. But for some reason the key just wasn’t running.

The limit does seem a bit short, though at least it’s one longer than the max length of a file path.
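The usual workaround, once you hit the cap, is to keep the RunOnce value itself short and push the long logic into a script file. A hedged sketch; the path, value name, and script contents are made up for illustration:

# Put the long post-restart logic in a script on disk...
$scriptPath = 'C:\Setup\PostRestartConfig.ps1'
Set-Content -Path $scriptPath -Value @'
# long post-restart logic goes here (join the domain, configure services, ...)
'@

# ...and keep the RunOnce value itself well under the length limit.
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\RunOnce' `
    -Name 'PostRestartConfig' `
    -Value "powershell.exe -ExecutionPolicy Bypass -File `"$scriptPath`""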


Capturing SSAS Query Activity

Bill Anton explains why and how he captures query activity by user in SSAS:

In most environments, it is trivial to obtain the name of the user who ran each query… all you have to do is capture the [QueryEnd] event in a profiler/xevent trace and pull the information from the [NTUserName] field. However, in environments involving Power BI and the Enterprise On-Premise Data Gateway, there’s a bit more to it.

The main issue is how authentication is handled in this type of architecture. When working with Power BI reports connected to an on-premise data source via the On-Premise Data Gateway, the account of the user running the report is passed as the “EffectiveUsername”. The implication here is that the value shown in the [NTUserName] field of the xevent/profiler trace is going to be the Data Gateway account – NOT the account of the user who actually generated the activity.

Read on for the full answer.
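For context, EffectiveUserName is an ordinary Analysis Services connection-string property: the gateway connects as its own account (which must be an SSAS server administrator) and asks SSAS to evaluate everything as the named user. Here is a hedged sketch of seeing that behavior yourself, assuming the ADOMD.NET client library is installed; the server, database, and user names are placeholders:

# Load the ADOMD.NET client (Microsoft.AnalysisServices.AdomdClient).
Add-Type -AssemblyName 'Microsoft.AnalysisServices.AdomdClient'

# The connection authenticates as the account running this script, but
# EffectiveUserName tells SSAS to apply security as the named user.
$connString = 'Data Source=SSAS01;Initial Catalog=SalesCube;EffectiveUserName=CONTOSO\jane.doe'

$conn = New-Object Microsoft.AnalysisServices.AdomdClient.AdomdConnection($connString)
$conn.Open()

$cmd = $conn.CreateCommand()
$cmd.CommandText = 'SELECT { [Measures].DefaultMember } ON 0 FROM [SalesCube]'
$reader = $cmd.ExecuteReader()
while ($reader.Read()) { $reader.GetValue(0) }

$reader.Close()
$conn.Close()

Run a query this way and the trace shows the connecting account in [NTUserName], which is exactly the gap Bill’s post works around.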


Query Store Filegroups

Kendra Little links to a Connect item:

Can you change the filegroup where Query Store keeps its data?

I thought there might be a trick to use a different filegroup for Query Store by using the default filegroup setting in SQL Server before enabling it, but NOPE!

Please vote for this to be improved in this Connect Item.

I concur; Query Store can grow to be pretty large on busy systems, so diligent DBAs who want to keep PRIMARY as small as possible will suddenly find a multi-gigabyte Query Store slowing down those PRIMARY filegroup restores.
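Until that changes, about the only lever you have is capping how big Query Store can get. A quick sketch using Invoke-Sqlcmd from the SqlServer module; the instance, database, and size values are placeholders:

# See how much space Query Store is using and what it is capped at.
Invoke-Sqlcmd -ServerInstance 'SQL01' -Database 'MyDb' -Query @'
SELECT actual_state_desc,
       current_storage_size_mb,
       max_storage_size_mb
FROM sys.database_query_store_options;
'@

# Rein in the cap so a busy system cannot bloat PRIMARY indefinitely.
Invoke-Sqlcmd -ServerInstance 'SQL01' -Database 'MyDb' -Query @'
ALTER DATABASE CURRENT
SET QUERY_STORE (MAX_STORAGE_SIZE_MB = 512);
'@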


Using SQL Server DMA

Arun Sirpal has a series on using the new Data Migration Assistant.  Part 1:

It “enables you to upgrade to a modern data platform by detecting compatibility issues that can impact database functionality on your new version of SQL Server. It recommends performance and reliability improvements for your target environment. It allows you to not only move your schema and data, but also uncontained objects from your source server to your target server”. It can be found at this link: https://www.microsoft.com/en-us/download/details.aspx?id=53595.

Part 2 is all about performing a migration:

By the way, the backup file created via the tool is temporary; after a migration it is deleted. Also, the compatibility level DOES NOT change; you need to do this yourself.

I haven’t used this tool yet, but it does look like an upgrade to the old Upgrade Advisor.
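Since the tool won’t change the compatibility level for you, that post-migration step is easy to script yourself. A sketch using Invoke-Sqlcmd, with the instance, database, and target level as placeholders:

# After migrating, check and bump the compatibility level yourself, since DMA leaves it alone.
Invoke-Sqlcmd -ServerInstance 'NEWSQL01' -Query @'
SELECT name, compatibility_level FROM sys.databases WHERE name = N'MigratedDb';
ALTER DATABASE [MigratedDb] SET COMPATIBILITY_LEVEL = 130;  -- e.g. SQL Server 2016
'@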


Perfmon And SQL Server Memory

Lonny Niederstadt looks at using Perfmon to understand what’s happening with memory allocations on your SQL Server instance:

Let’s look at stolen memory a bit. The relationship between memory grants and stolen memory is probably the least intuitive. Remember – if a query gets a memory grant, the grant happens at the beginning of query execution. It’s just a promise of sort/hash memory to be made available when the query needs it. The grant memory isn’t stolen immediately – rather it’s stolen in small allocations over time as needed by the query.

In the graph immediately below, the outstanding grants are shown over time.  There are no pending grants during the observation period.  Granted memory and reserved memory are both shown as areas, with reserved memory in front of granted memory.  Granted memory is consistently greater than reserved memory (in this case, no resource pools have been added beyond the pre-existing default and internal pools).  This is how we can determine that the reserved memory is granted memory which hasn’t been stolen yet.

This is a great explanation of what stolen memory is and why it’s important.
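If you would rather collect those counters from PowerShell than from the Perfmon UI, Get-Counter can sample them. These paths are for a default instance (a named instance uses MSSQL$InstanceName in place of SQLServer), and the interval, sample count, and output path are arbitrary:

# SQL Server Memory Manager counters, sampled every 15 seconds for an hour.
$counters = @(
    '\SQLServer:Memory Manager\Total Server Memory (KB)',
    '\SQLServer:Memory Manager\Granted Workspace Memory (KB)',
    '\SQLServer:Memory Manager\Reserved Server Memory (KB)',
    '\SQLServer:Memory Manager\Stolen Server Memory (KB)',
    '\SQLServer:Memory Manager\Memory Grants Outstanding',
    '\SQLServer:Memory Manager\Memory Grants Pending'
)

Get-Counter -Counter $counters -SampleInterval 15 -MaxSamples 240 |
    ForEach-Object {
        $_.CounterSamples | Select-Object Timestamp, Path, CookedValue
    } |
    Export-Csv -Path .\sql_memory_counters.csv -NoTypeInformation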


Alert On SQL Jobs Missing Schedules

Brian Hansen wraps up a three-part series on scheduled job alerts:

The first two parts of this series addressed the general approach that I use in an SSIS script task to discover and alert on missed SQL Agent jobs. With apologies for the delay in producing this final post in the series, here I bring these approaches together and present the complete package.

To create the package, start with an empty SSIS package and add a data flow task. In the task, add the following transformations.

Regardless of how you do it, knowing when jobs fail to run as scheduled is important enough to build some infrastructure around answering this question.
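Brian’s SSIS package is the full solution; if you just want a rough check from a script while you build it out, a much simpler (and admittedly cruder) approach is to query msdb for enabled, scheduled jobs whose next scheduled run time has already slipped into the past. A sketch, with the instance name and the 15-minute grace period as assumptions:

# Rough check for jobs that appear to have missed their schedule.
Invoke-Sqlcmd -ServerInstance 'SQL01' -Database 'msdb' -Query @'
SELECT j.name,
       ja.next_scheduled_run_date
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobactivity AS ja
    ON ja.job_id = j.job_id
WHERE j.enabled = 1
  AND EXISTS (SELECT 1 FROM msdb.dbo.sysjobschedules AS js WHERE js.job_id = j.job_id)
  AND ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions)
  AND ja.next_scheduled_run_date IS NOT NULL
  AND ja.next_scheduled_run_date < DATEADD(MINUTE, -15, SYSDATETIME());
'@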


UDL Files To Test Connectivity

Marek Masko shows how to test a database connection without having any database tools:

UDL extension stands for Universal Data Link. These files are used by the Data Link API, which exposes a user interface to create and manage OLE DB connections. This functionality has been in Windows at least since Windows 95, maybe even earlier. That means you can use it on every Windows machine you work on. You no longer need to worry about additional tools.

Sometimes you need a creative solution to a policy-induced problem.
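If you would rather script the trick than right-click your way to a new file, creating an empty .udl file and opening it is enough; the file name and location here are arbitrary:

# Create an empty .udl file and open it; Windows launches the
# Data Link Properties dialog for the extension.
$udl = Join-Path $env:USERPROFILE 'Desktop\test-connection.udl'
New-Item -Path $udl -ItemType File -Force | Out-Null
Invoke-Item $udl

# After you pick a provider, test, and save, the connection string
# is stored as plain text inside the file.
Get-Content $udl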


Filegroups And RTO

Raul Gonzalez explains the importance of filegroups in minimizing RTO:

So if we don’t create additional filegroups in our databases, not only will all the data go to the same logical container, but also, in case we have to restore that database from a backup, we will have to wait until all of it is restored.

Imagine, for instance, that you have a lot of historical data and there is a disaster. If you had different filegroups, one for current data and another for the historical data, you would be able to get your live data back first and quickly (to get you up and running), and then restore all the historical data, which is not critical.

To show you how, I’m going to create a database with different filegroups so you’ll see how we can do it.

Click through for the scripts, as well as more information.
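The shape of the idea, sketched with placeholder names, file paths, and backups (and note that, depending on edition and your log chain, the restore sequence may need more log restores than shown):

# Current data stays on PRIMARY; historical data gets its own filegroup.
Invoke-Sqlcmd -ServerInstance 'SQL01' -Query @'
CREATE DATABASE Sales
ON PRIMARY
    (NAME = Sales_data,    FILENAME = 'D:\Data\Sales_data.mdf'),
FILEGROUP HISTORY
    (NAME = Sales_history, FILENAME = 'D:\Data\Sales_history.ndf')
LOG ON
    (NAME = Sales_log,     FILENAME = 'L:\Log\Sales_log.ldf');
'@

# In a disaster, a piecemeal restore brings PRIMARY online first and
# HISTORY back later, once the urgent part is already up and running.
Invoke-Sqlcmd -ServerInstance 'SQL01' -Query @'
RESTORE DATABASE Sales FILEGROUP = 'PRIMARY'
    FROM DISK = 'B:\Backup\Sales_full.bak'
    WITH PARTIAL, NORECOVERY;
RESTORE LOG Sales FROM DISK = 'B:\Backup\Sales_log.trn' WITH RECOVERY;
-- Live data is usable at this point; restore HISTORY when convenient,
-- rolling it forward with whatever log backups it needs.
RESTORE DATABASE Sales FILEGROUP = 'HISTORY'
    FROM DISK = 'B:\Backup\Sales_full.bak' WITH NORECOVERY;
RESTORE LOG Sales FROM DISK = 'B:\Backup\Sales_log.trn' WITH RECOVERY;
'@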
