We know some of you really hate linked servers, so we have been working on a PowerShell collection which will allow you to install the Inspector without using linked servers to centrally log the information; instead, the PowerShell function Invoke-SQLUndercoverInspector will do the rest for you (we will be writing a blog post about how you can use this soon). This is currently a pre-release version, so it’s a work in progress. I must say a massive thank you to Shane O’Neill (b | t) – without his PowerShell skills this wouldn’t have turned out as well as it has. Thanks Shane!
If you’ve already downloaded this version, be aware that there is a hotfix.
Andy Levy has a problem. Well, about 8000 of them. In part 1, he describes the plan:
How do you move eight thousand databases in a reasonable amount of time? I spent about an hour and a half one morning hashing ideas out w/ folks in the dbatools Slack channel, plus several conversations in the office and with our hosting provider.
We missed the estimated time for our go/no-go decision by five minutes. With the number of moving parts, databases in play, unexpected delays, and amount of testing we had to do, that’s pretty good! My colleague and I had some additional work we needed to take care of after the team declared the migration a success. Agent jobs needed to be enabled, overnight job startups monitored, things like that. We called it a day after about 14 hours in the office.
It was a nice success story, so check it out.
PowerShell DSC is a platform to support the concept of Infrastructure as Code (IaC). It uses declarative syntax instead of the usual imperative syntax of PowerShell, meaning that you describe your desired state rather than the specific steps needed to get there. There are two modes for DSC, push and pull. Although pull mode offers more features and scalability, we’ll look at writing our configuration and using push mode for this blog post to keep it simple.
This post covers initial installation and some of the initial configuration, so check it out if you’re new to DSC.
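To give a flavor of that declarative syntax, here’s a minimal push-mode sketch (the node name, feature, and output path are placeholders of mine, not from the post): you describe the desired state, compile it to a MOF, and push it.

```powershell
# Minimal push-mode DSC sketch; node name and feature are hypothetical.
Configuration ServerBaseline {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'SQL01' {
        # Declare *what* should be true, not the steps to get there
        WindowsFeature NetFramework {
            Name   = 'NET-Framework-45-Core'
            Ensure = 'Present'
        }
    }
}

ServerBaseline -OutputPath 'C:\DSC\ServerBaseline'                  # compiles SQL01.mof
Start-DscConfiguration -Path 'C:\DSC\ServerBaseline' -Wait -Verbose # pushes it
```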
My answer to that is simple: I’m a major contributor to the awesome PowerShell library dbatools. What I’ve contributed to that library are commands that will help automate the running and processing of queries from the DMV library of Glenn Berry.
At some point in the life of a DBA, we’ve all come across his scripts. For the longest time I would advise people to google “Glenn Berry DMV”, and it would be the top result.
The scripts, however, come in a single file per SQL Server version, and you can’t run them all in one go. You would have to select a script, run it, and paste the result from Management Studio into an Excel sheet. Glenn provides an empty sheet with tabs ready to paste the various result sets into. I’ve automated this part – hope you like it!
Click through for a demonstration of this cmdlet and the useful output it generates.
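The basic workflow looks something like this (the instance name and output folder are placeholders of mine): one command runs the diagnostic queries, and a second exports the results.

```powershell
# Run Glenn Berry's diagnostic queries against an instance and export the
# results; 'SQL01' and the output path are hypothetical.
Invoke-DbaDiagnosticQuery -SqlInstance 'SQL01' |
    Export-DbaDiagnosticQuery -Path 'C:\Temp\DiagnosticOutput'
```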
First things first, you’ll want to choose your version of SQL Server. Python is available in SQL Server 2017 and later. For this demo I’ll be using SQL Server 2019 Developer Edition (CTP 2.2).
With 2019 CTP 2.2, the OS requirements have increased too. In my example I had a spare VM with Windows Server 2012 lying around, but I needed to upgrade it to Server 2016. Check the relevant documentation for the version you’re using.
Click through for a step by step guide with plenty of screenshots.
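Once the feature is installed, a smoke test looks something like this sketch (the instance name is a placeholder, and depending on version you may need a service restart after enabling external scripts):

```powershell
# Enable external scripts and run a trivial Python script inside SQL Server;
# 'SQL01' is hypothetical.
Invoke-DbaQuery -SqlInstance 'SQL01' -Query @"
EXEC sp_configure 'external scripts enabled', 1;
RECONFIGURE WITH OVERRIDE;

EXEC sp_execute_external_script
     @language = N'Python',
     @script   = N'print("Hello from Python inside SQL Server")';
"@
```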
Recently, Kevin Hill (b | t) posted on getting package errors from the SSIS catalog in a single query, as opposed to clicking through the SSIS reports and digging through pages. I took that and ran with it a little bit. The first pass needed an additional index on the catalog to improve performance; Kevin has included that at the bottom of his query on the post above. (You probably don’t need the included “message” column, though.)
I wanted to report on all errors for a given folder within the last day, then send that in an HTML-formatted e-mail. To that end, I wrote up a quick stored procedure that takes the Folder, Package, or Project name and a “to” e-mail address, and sends the e-mail through DBMail.
Click through for the script.
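The shape of the underlying catalog query is roughly this (the instance and folder names are mine; in SSISDB, message type 120 is an error):

```powershell
# Recent error messages for packages in one SSISDB folder; names are placeholders.
Invoke-DbaQuery -SqlInstance 'SQL01' -Database 'SSISDB' -Query @"
SELECT e.folder_name, e.project_name, e.package_name,
       em.message_time, em.message
FROM catalog.executions AS e
JOIN catalog.event_messages AS em
    ON em.operation_id = e.execution_id
WHERE em.message_type = 120        -- 120 = error
  AND e.folder_name = N'Finance'
  AND em.message_time > DATEADD(DAY, -1, SYSDATETIME());
"@
```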
INSERT failed because the following SET options have incorrect settings: ‘ARITHABORT’. Verify that SET options are correct for use with indexed views and/or indexes on computed columns and/or filtered indexes and/or query notifications and/or XML data type methods and/or spatial index operations. [SQLSTATE 42000] (Error 1934). The step failed.
Immediately I started looking at my creation scripts for the tables. Nope, no XML indexes, no spatial indexes, no filtered indexes, no indexes on computed columns (not even any computed columns), and no query notifications.
In Jason’s case, it was an oddity around SQL Agent jobs, but there are a few reasons this could pop up, and Jason explains some of the most common.
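You can reproduce the error outside of Agent with a sketch like this (table and instance names are mine, and the final INSERT is expected to fail); the usual fix is to set the required options explicitly at the top of the job step:

```powershell
# DML against a table with a filtered index fails with error 1934 when
# ARITHABORT is OFF - the situation an Agent T-SQL job step can land in.
# The fix: add SET ARITHABORT ON to the step itself. Names are placeholders.
Invoke-DbaQuery -SqlInstance 'SQL01' -Database 'tempdb' -Query @"
CREATE TABLE dbo.ArithabortDemo (id int NOT NULL, val int NULL);
CREATE INDEX IX_Demo ON dbo.ArithabortDemo (val) WHERE val IS NOT NULL;
SET ARITHABORT OFF;
INSERT dbo.ArithabortDemo (id, val) VALUES (1, 1);  -- fails: Msg 1934
"@
```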
Imagine a situation where you are tasked with installing a dozen SQL Server instances. Using the approach described in the previous blog post, installation can take a significant amount of time and resources, and there is also the possibility of errors due to typos (e.g. in directory paths).
There are two options that can be used for the installation:
1. Using ConfigurationFile.ini
2. Using command-line parameters passed to setup.exe
Below are step-by-step instructions on how to install SQL Server using both options.
If you only have a couple of SQL Server instances in your environment, unattended installation may seem a bit overwhelming. But as you have to deal with larger and larger numbers of instances, it’s the only way to keep your sanity. It’s also the only way to install on Windows Server Core.
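For flavor, the two options look roughly like this sketch (paths, instance name, and account names are all hypothetical):

```powershell
# Option 1: replay a ConfigurationFile.ini captured from a previous install.
& 'D:\setup.exe' '/ConfigurationFile=C:\Install\ConfigurationFile.ini' `
                 '/IACCEPTSQLSERVERLICENSETERMS'

# Option 2: supply the parameters directly on the command line.
& 'D:\setup.exe' '/Q' '/ACTION=Install' '/FEATURES=SQLENGINE' `
                 '/INSTANCENAME=MSSQLSERVER' `
                 '/SQLSYSADMINACCOUNTS=DOMAIN\DBA-Team' `
                 '/IACCEPTSQLSERVERLICENSETERMS'
```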
The LOGINFO command returns a row for each Virtual Log File (VLF) in the Transaction Log. The FileSize column shows the size (KB) of each individual VLF. The amount of the transaction log that is currently in use can be determined by the number of rows with a Status of 2 (0 = Inactive, 2 = Active, and 4 = VLF exists on the primary replica but not on the secondary replica).
One of the biggest problems with the LOGINFO command has been the difficulty of extrapolating the information to include all databases on the server, for use in reports, automated alerts and maintenance processes. Beginning with SQL Server 2016 SP2, there are 2 new Dynamic Management Views (DMVs) that can be used for monitoring the VLF information: sys.dm_db_log_info & sys.dm_db_log_stats.
A huge number of VLFs can be a hidden performance sink.
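With the new DMVs, that server-wide view becomes a single query, something like this (the instance name is a placeholder):

```powershell
# VLF counts for every database on the instance via sys.dm_db_log_stats
# (SQL Server 2016 SP2 and later); 'SQL01' is hypothetical.
Invoke-DbaQuery -SqlInstance 'SQL01' -Query @"
SELECT d.name,
       ls.total_vlf_count,
       ls.active_vlf_count
FROM sys.databases AS d
CROSS APPLY sys.dm_db_log_stats(d.database_id) AS ls
ORDER BY ls.total_vlf_count DESC;
"@
```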
I have already blogged about the first version of the Database Scoped Configurations for SQL Server 2016, with 4 visible options plus the procedure cache cleaning option. SQL Server 2017 followed with 5 listed options (9 in practice – DISABLE_INTERLEAVED_EXECUTION_TVF, DISABLE_BATCH_MODE_ADAPTIVE_JOINS, BATCH_MODE_MEMORY_GRANT_FEEDBACK, and BATCH_MODE_ADAPTIVE_JOINS are also visible and functioning), and in just another year we have received a huge upgrade to the 21 currently available in SQL Server 2019.
It seems like this is a common route the SQL Server teams are going down, and it makes sense: your settings for Mega-DB probably shouldn’t be the same as for the tiny database in the corner. Oh, and that whole Azure SQL Database thing.
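If you haven’t played with them yet, the syntax is pleasantly simple; a quick sketch (instance and database names are mine):

```powershell
# Set one database scoped configuration and then list them all;
# 'SQL01' and 'BigDB' are placeholders.
Invoke-DbaQuery -SqlInstance 'SQL01' -Database 'BigDB' -Query @"
ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 4;
SELECT name, value, value_for_secondary
FROM sys.database_scoped_configurations;
"@
```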