…recently, I ran into an issue in PowerShell that, had it been in SQL, I would have solved quite handily with some Dynamic SQL.
“Alas, this is PowerShell,” I thought to myself. “And there is no way that one knows of by which one can create dynamic commands that build themselves up!”
Now, there are two things that you have to realise about when I’m thinking to myself:
- I think more fancy than I am in real life, and
- I’m nearly always wrong!
So please see below for my example problem and the “dynamic PowerShell” created to overcome the issue!
Check it out, and then imagine how to perform PowerShell injection.
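To give a flavour of the technique (a hedged sketch, not the post’s actual script), you can build the command text as a string at runtime and hand it to Invoke-Expression, which is also exactly why injection becomes a concern:

```powershell
# Hypothetical example: the path, property name, and filter value are all placeholders.
$property = 'Extension'
$value    = '.log'

# Build the command text at runtime, much like dynamic SQL...
$command = "Get-ChildItem -Path C:\Temp | Where-Object { `$_.$property -eq '$value' }"

# ...then execute the string. If $value ever came from user input,
# this is where PowerShell injection sneaks in.
Invoke-Expression $command
```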
Awesome! Our custom image is in our ACR!
But has it worked? Has it really? Oh ye of little faith…
I guess the only way to find out is to run a container! So let’s run an Azure Container Instance from our new image.
Spoilers: it worked.
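For reference, spinning up an instance from an ACR image can be done from the Azure CLI in a PowerShell session; this is a hedged sketch with placeholder resource group, registry, and image names, not the post’s exact commands:

```powershell
# Placeholders throughout: resource group, registry, image, and DNS label are made up.
# Assumes the registry's admin user is enabled so we can pull its credentials.
$acrUser = az acr credential show --name myregistry --query "username" -o tsv
$acrPass = az acr credential show --name myregistry --query "passwords[0].value" -o tsv

az container create `
    --resource-group myResourceGroup `
    --name my-demo-aci `
    --image myregistry.azurecr.io/myimage:v1 `
    --registry-login-server myregistry.azurecr.io `
    --registry-username $acrUser `
    --registry-password $acrPass `
    --dns-name-label my-demo-aci `
    --ports 80

# Check that the container actually started.
az container logs --resource-group myResourceGroup --name my-demo-aci
```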
Two of the products I use extensively for this purpose are Amazon Glacier and, more recently, Microsoft Azure Blob Storage Archive Tier. As happy as I’ve been with Amazon Glacier since its introduction in 2012, I always hoped Microsoft would offer a similar service. My wish came true in Fall of 2017 when an archive tier of Azure Blob Storage was announced. Rather than branding this capability as a new product, Microsoft decided to present it as a new tier of Azure Blob Storage, alongside the existing hot and cool storage tiers.
A noticeable difference from the hot and cool storage tiers is that the archive storage tier is only available on a per-blob basis. While a storage account can be configured to have all blobs placed in either the hot or cool tier by default once they are uploaded, the archive tier is not an option. Once a blob is uploaded, it must explicitly be moved into the archive tier. If one is using the Azure Portal to do this, there are several clicks involved per blob. The free Azure Storage Explorer client is no better. While I found several third party tools that can upload files to the archive tier, none were free. At this point, I decided to write my own method using PowerShell, which I am happy to share below.
Read on for the script. A good use for Azure Blob Storage Archive Tier would be storing old database backups which you have to keep around for compliance purposes but rarely use.
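As a rough idea of the approach (a hedged sketch rather than the script itself), the Az.Storage cmdlets let you upload a blob and then push it to the Archive tier explicitly; the account, container, and file names below are placeholders:

```powershell
# Placeholders: storage account name, key, container, and backup file are made up.
$storageKey = '<storage account key>'
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey $storageKey

# The upload lands in the account's default (hot or cool) tier first...
Set-AzStorageBlobContent -File 'C:\Backups\old_backup.bak' -Container 'archive' -Context $ctx

# ...then the blob has to be moved to Archive explicitly.
$blob = Get-AzStorageBlob -Container 'archive' -Blob 'old_backup.bak' -Context $ctx
$blob.ICloudBlob.SetStandardBlobTier('Archive')
```

Depending on your Az.Storage version you may need a different call to set the tier, but the shape of the process is the same: upload, then archive.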
I’ve been using this at work recently and it also relates to the presentation I gave at the ONSSUG June meeting on data compression. The beginnings of this script originated online as I dug into learning about the DMVs related to objects and compression; I then customized it for what I needed.
If you run the below as-is, it will provide basic information about all objects in your database, except those in the ‘sys’ schema, along with their current size and compression level.
Click through for the script or, if your version of dbatools is up to date, call
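The linked script is T-SQL, but as a hedged sketch of the kind of DMV query involved (run here through dbatools, with placeholder instance and database names):

```powershell
# Hedged sketch, not the linked script: size and compression setting per object,
# excluding the 'sys' schema. Instance and database names are placeholders.
$query = @"
SELECT s.name AS schema_name,
       o.name AS object_name,
       p.index_id,
       p.data_compression_desc,
       SUM(a.total_pages) * 8 / 1024.0 AS size_mb
FROM sys.partitions AS p
JOIN sys.objects AS o ON o.object_id = p.object_id
JOIN sys.schemas AS s ON s.schema_id = o.schema_id
JOIN sys.allocation_units AS a ON a.container_id = p.partition_id
WHERE s.name <> 'sys'
GROUP BY s.name, o.name, p.index_id, p.data_compression_desc;
"@

Invoke-DbaQuery -SqlInstance 'MyInstance' -Database 'MyDatabase' -Query $query |
    Format-Table -AutoSize
```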
It’s a super useful feature that not all DBAs know about. Since CMS data is stored in msdb and accessible via SMO, you can access it from SQL Server Management Studio or PowerShell modules like dbatools.
Central Management Server’s essential functionality includes:
Actions that are taken by using a central management server group act on all servers in the server group. This includes connecting to servers by using Object Explorer and executing Transact-SQL statements and Policy-Based Management policies on multiple servers at the same time.
I mostly use it as a visual repository of my SQL Servers. Prior to using dbatools and Invoke-DbaSqlQuery, however, I did use CMS to easily execute code against a number of different servers.
CMS is a great feature, and is a critical tool for scaling out a SQL Server infrastructure.
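If you want to try the dbatools route, here is a hedged sketch (the CMS instance name and the query are placeholders, and the exact command names vary a little between dbatools versions):

```powershell
# Placeholders: the CMS instance name and the query are made up for illustration.
$servers = Get-DbaRegisteredServer -SqlInstance 'MyCmsServer'

foreach ($server in $servers) {
    # Run the same statement against every server registered in the CMS.
    Invoke-DbaQuery -SqlInstance $server.ServerName -Query 'SELECT @@SERVERNAME AS ServerName, @@VERSION AS Version'
}
```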
The premise is simple: it will generate a series of DROP and then CREATE INDEX commands for every index. The process is a little more complex in practice, but at a high level it:
- Creates a special schema to house a temporary object,
- Creates a special stored procedure to run the code,
- Calls said stored procedure,
- Generates a bunch of PRINT statements that serve as the output (along with new line support for readability),
- Cleans up the stored procedure it generated,
- And finally deletes the schema it created.
Click through for the script, as well as a bonus PowerShell script. Because hey, it’s only six lines of code.
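As a very rough PowerShell sketch of the same idea (not the linked script), SMO can script a DROP and then a CREATE for each index:

```powershell
# Hedged SMO sketch; instance and database names are placeholders, and indexes
# backing constraints will script slightly differently.
$server = Connect-DbaInstance -SqlInstance 'MyInstance'   # dbatools used here just for the connection

$dropOptions = New-Object Microsoft.SqlServer.Management.Smo.ScriptingOptions
$dropOptions.ScriptDrops = $true

foreach ($table in $server.Databases['MyDatabase'].Tables) {
    foreach ($index in $table.Indexes) {
        $index.Script($dropOptions)   # DROP INDEX ...
        $index.Script()               # CREATE INDEX ...
    }
}
```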
At least they used to be, before I built the command that started it all: Start-DbaMigration. Start-DbaMigration is an instance to instance migration command that migrates just about everything. It’s really a wrapper that simplifies nearly 30 other copy commands, including Copy-DbaDatabase, Copy-DbaLogin, and Copy-DbaSqlServerAgent.
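A typical call looks something like the hedged example below; the instance names and the share are placeholders, and the exact parameter names shift a little between dbatools versions:

```powershell
# Placeholders: source, destination, and the network share are made up.
Start-DbaMigration -Source 'OldSqlServer' -Destination 'NewSqlServer' -BackupRestore -SharedPath '\\fileserver\migration'
```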
Also a bonus shout out to dbachecks.
SQL Vulnerability Assessment (VA) is a service that provides visibility into your security state and includes actionable steps to resolve security issues and enhance your database security. It can help you:
- Meet compliance requirements that require database scan reports.
- Meet data privacy standards.
- Monitor a dynamic database environment where changes are difficult to track.
VA runs vulnerability scans on your database, flagging security vulnerabilities and highlighting deviations from best practices, such as misconfigurations, excessive permissions, and unprotected sensitive data. The rules are based on Microsoft’s best practices and focus on the security issues that present the biggest risks to your database and its valuable data. These rules also represent many of the requirements from various regulatory bodies to meet their compliance standards.
Results of the scan include actionable steps to resolve each issue and provide customized remediation scripts where applicable. An assessment report can be customized for your environment by setting an acceptable baseline for permission configurations, feature configurations, and database settings. This baseline is then used as a basis for comparison in subsequent scans, to detect deviations or drifts from your secure database state.
Read on for more, and if you’re interested, the cmdlets are available in the SqlServer PowerShell module.
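As a hedged sketch only (the cmdlet and parameter names here are based on the SqlServer module’s vulnerability assessment cmdlets and may differ by module version; the instance, database, and output path are placeholders):

```powershell
# Assumed cmdlet/parameter names; check Get-Command *VulnerabilityAssessment* in your module version.
Import-Module SqlServer

$scanResult = Invoke-SqlVulnerabilityAssessmentScan -ServerInstance 'MyInstance' -DatabaseName 'MyDatabase'
Export-SqlVulnerabilityAssessmentScan -ScanResult $scanResult -FolderPath 'C:\Reports\MyDatabase-VA.xlsx'
```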
Now we all know that having SQL Server Agent jobs owned by ‘Real’ users isn’t a good idea. But I don’t keep that close an eye on some of our test instances, so I wasn’t surprised when I spotted this showing up in the monitoring:
The job failed. Unable to determine if the owner (OldDeveloper) of job important_server_job has server access (reason: Could not obtain information about Windows NT group/user 'OldDeveloper', error code 0x534. [SQLSTATE 42000] (Error 15404)).
Wanting to fix this as quickly and simply as possible, I decided to bulk move them to our job-owning account (let’s use the imaginative name of ‘JobOwner’).
Click through for both scripts.
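As a hedged sketch of the bulk change (not the linked scripts), SMO makes this straightforward, with dbatools used here only to open the connection; the instance and login names are placeholders:

```powershell
# Placeholders: instance and login names are made up. dbatools also ships
# Set-DbaAgentJobOwner, which wraps similar logic.
$server = Connect-DbaInstance -SqlInstance 'TestInstance'

$server.JobServer.Jobs |
    Where-Object { $_.OwnerLoginName -eq 'OldDeveloper' } |
    ForEach-Object {
        $_.OwnerLoginName = 'JobOwner'
        $_.Alter()    # push the ownership change back to msdb
    }
```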
To create a new snapshot, you no longer need to know the path of the snapshot location (though we do support custom paths). You don’t even need to specify a name! But you can, of course.
I haven’t used database snapshots in a while, but I do appreciate them, especially for testing scenarios.
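Assuming the post is describing dbatools’ snapshot commands, here is a hedged usage sketch with placeholder names (check the parameters in your dbatools version):

```powershell
# Placeholders: instance and database names are made up.
# Create a snapshot without specifying a name or path...
New-DbaDbSnapshot -SqlInstance 'MyInstance' -Database 'MyDatabase'

# ...and, once testing is done, revert the database to it.
Restore-DbaDbSnapshot -SqlInstance 'MyInstance' -Database 'MyDatabase'
```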