Once I understood the data well, I realized that the raw data had to be flattened out: collections at the row level (like KB) needed to expand into their own rows so that every row holds a single value. Then the grouping is easy.
It made more sense to allow grouping not just by KB but by other columns such as Product or CVE. Group-Object works fine for most cases, but since there will be duplicates after the data is grouped, it is easier to just do it with hashtables.
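The original solution is in PowerShell; as a hedged sketch of the flatten-then-group idea (the row data and column values below are made up for illustration), the same approach looks like this in Python:

```python
# Sketch of the flatten-then-group approach described above.
# Each raw row may carry a collection (e.g. several KB numbers); we expand
# it so every row holds exactly one value, then grouping is a simple
# dictionary (hashtable) build. All values here are illustrative only.

raw_rows = [
    {"Product": "Windows 10", "CVE": "CVE-2017-8563", "KB": ["KB4025339", "KB4025342"]},
    {"Product": "Windows 8.1", "CVE": "CVE-2017-8563", "KB": ["KB4025336"]},
]

# Step 1: flatten - one output row per KB value.
flat = [
    {"Product": r["Product"], "CVE": r["CVE"], "KB": kb}
    for r in raw_rows
    for kb in r["KB"]
]

# Step 2: group by any single-valued column using a hashtable.
def group_by(rows, key):
    groups = {}
    for row in rows:
        groups.setdefault(row[key], []).append(row)
    return groups

by_kb = group_by(flat, "KB")        # works equally well with "Product" or "CVE"
print(sorted(by_kb))                # ['KB4025336', 'KB4025339', 'KB4025342']
```

Because the flattened rows all have single values, the same `group_by` call handles any column; duplicates across groups are expected and harmless.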
Jana provides the entire solution on his site. When reading it, I felt the urge to switch to a language which offers easier pivoting and aggregation, but the code was clear and understandable.
Quite a mouthful for a title, but nevertheless very exciting. With the new version of SQL Server Management Studio (SSMS), 17.2, you now have the option to use Azure AD authentication for Universal Authentication with Multi-Factor Authentication (MFA) enabled. By that I mean logging in via SSMS with an account that is enabled for MFA; below I will show you the two-step verification using a push notification to my iPhone. (Yes, iPhone. I love it.)
Download SSMS 17.2 from this link: https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms
Once installed, you will see new authentication options; the one I want is highlighted below: “Active Directory – Universal with MFA support”
Click through for a demo of this. I wonder if (when?) something like this comes to on-prem, maybe in conjunction with a third-party multi-factor authentication service.
Hopefully, Microsoft removes all traces of “Trusted Assemblies” (as I have suggested here). In either case, please just use Certificates (and possibly Asymmetric Keys, depending on your preference and situation) as I have demonstrated in these past three posts (i.e. Parts 2, 3, and 4). Even better, especially for those using SSDT, would be if Microsoft implemented my suggestion to allow Asymmetric Keys to be created from a binary hex bytes string. But, even without that convenience, there is still no reason to ever, ever, use the “Trusted Assemblies” feature.
He’s given three alternatives so far, so if you’re interested in CLR security, there’s plenty of food for thought.
Now that our instance is set up to use AAD, let’s connect to it from SSMS. If you’re running Management Studio 2016 and SQL Server Data Tools for 2015 (14.0.60311.1) or later, you may have noticed there are some extra authentication methods available in SSMS. We’re going to cover these out of order, since some of these options take more work than others.
Active Directory Password Authentication looks similar to SQL authentication, but it accepts AAD user names and passwords. When you choose this method, your credentials are sent over to Azure and end up at your AAD instance. Once your username and password are validated, AAD checks whether you have rights to connect to the instance. If so, you connect. If not, you get an error message that you’ll need to google (bing) to find out what it really means.
With the steps we took in the last section, you should be able to log in to your Azure SQL Server as an administrator by using Active Directory Password Authentication.
Click through for the process, as well as links to additional resources.
Armed with our new knowledge, we can create a single SQL query that decodes all of the SSNs. The strategy is to define a single CTE with all ten digits and to use one CROSS APPLY for each digit in the SSN. Each CROSS APPLY only references the SSN column in the WHERE clause and returns the matching prefix of the SSN that we’ve found so far. Here’s a snippet of the code:
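The actual T-SQL snippet is in the linked post. To illustrate why this class of attack works at all, here is a hedged Python simulation of the prefix-extension idea; the `starts_with` oracle and the sample SSN are hypothetical stand-ins for the WHERE-clause predicate against the masked column:

```python
# Hypothetical simulation of the prefix-extension attack described above.
# In the real post this is T-SQL using CROSS APPLY; here the WHERE-clause
# test is modeled as an oracle that only answers "does the masked column
# start with this prefix?" - the attacker never sees the value directly.

SECRET_SSN = "457555462"  # the value Dynamic Data Masking is hiding (made up)

def starts_with(prefix: str) -> bool:
    """Stands in for a predicate like: WHERE SSN LIKE prefix + '%'.
    The masked output never reveals the SSN, but row matching still does."""
    return SECRET_SSN.startswith(prefix)

def recover_ssn(num_digits: int = 9) -> str:
    prefix = ""
    for _ in range(num_digits):        # one CROSS APPLY per digit position
        for digit in "0123456789":     # the ten-digit CTE
            if starts_with(prefix + digit):
                prefix += digit        # extend the matching prefix so far
                break
    return prefix

print(recover_ssn())  # prints 457555462 - the full "masked" value
```

Nine positions times at most ten probes each means the whole value falls out in under a hundred predicate evaluations, which is why a single query with one CROSS APPLY per digit can do it all at once.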
Click through for progressively faster solutions. This is the main reason I do not care for DDM as a feature: its main benefit seems to be preventing shoulder-surfing on reports, and any concerted attacker with a little bit of query-writing access can subvert it.
But that comes with a few big drawbacks. They’re really well-documented, but here are the highlights:
Do you need to query that data from other apps? Do you have a data warehouse, reporting tools, PowerBI, Analysis Services cubes, etc? If so, those apps will also need to be equipped with the latest database drivers and your decryption certificates. For example, here’s how you access Always Encrypted data with PowerBI. Any app that expects to read the encrypted data is going to need work, and that’s especially problematic if you’re replicating the data to other SQL Servers.
Click through to read the rest. Always Encrypted was designed to encrypt a few columns, not everything in a database.
One of the questions I get when teaching others how to use Biml is how to deal with sensitive information like usernames and passwords in your Biml solution. No one wants to leave this information in plain text in a solution. You need access to it while interrogating your source and destination connections for metadata. You also need it while Biml creates your SSIS packages, since SSIS uses SELECT statements at design time to gather metadata. If you lock away that sensitive information too tightly, you won’t be effective while building your solutions.
In the end, you’ll have to compromise between security and efficacy.
Read on for more.
We need to automatically read this information without our intervention. This is where you sometimes see people get lazy and add passwords in plain text into the script. That’s a big no-no, and there’s a better way. PowerShell has native support for something called the Data Protection API (DPAPI). DPAPI is a built-in way Windows users can use certificates to encrypt and decrypt information on the fly, which is perfect for PowerShell scripting. No username and password required; we simply need a certificate installed, which can be self-signed.
This is about much more than storing and decoding passwords, so check it out.
Next, we take the exported pfx file, copy it locally to the temp folder of each machine, and import it into the local certificate store. Then we edit the registry with the thumbprint of the certificate. After that, you will have to restart SQL Server for the changes to take effect. We also clean up after ourselves and delete the pfx from the temp folder.
Note: To make this safe for production, I commented out the restart of SQL Server. Also, the Get-CmsHosts cmdlet can be found here.
If you’re dealing with sensitive information, enabling (and forcing!) SSL encryption is one of the easiest and most effective ways of securing an instance; in this case, it secures data in transit between SQL Server and the client.
Robert Davis wrote a great script back when he published his Mirroring book. I started to write my own and was almost done when I contacted Robert and asked if he had dealt with SQL logins since the script only handled Windows logins. His reply was something along the lines of, “What are you talking about? Of course it handles SQL logins”. It turns out that the publisher didn’t get the right script version published with the book. That’s when this post from Robert with the full script was born…
I also wrote about it HERE.
This script creates a stored procedure to handle the move and also uses linked servers. If you can’t have linked servers in your environment, this is not a good choice for you. However, you can create the linked server in a SQL Agent job step prior to the transfer step and then remove it in a job step afterward. It breaks the rule, but it does the job fast enough that maybe no one will notice.
Read the whole thing.