Press "Enter" to skip to content

Category: Security

Handling Expiring Encryption Keys

Ed Leighton-Dick explains how to safely replace a SQL Server certificate that is about to expire:

So, now that we know what we need to rotate, how do we do it?

First, obtain a new certificate. SQL Server has the capability to generate its own certificates. For many purposes, that’s enough. However, if your company has to comply with auditing or regulatory requirements, you may need to obtain the new certificate from an outside source. Often, this is a third-party certificate authority. Some companies use a system called Extensible Key Management (EKM, also known as a Hardware Security Module, or HSM, after the device used to store the master key). (Obtaining an external certificate is a subject for an upcoming post.)

However you obtained the certificate, install it. Make sure to back it up securely, including the private key.

Next, add the new certificate to the symmetric key. The ALTER SYMMETRIC KEY command has a clause that does just that – ADD ENCRYPTION BY.

Finally, remove the old certificate from the symmetric key. You’ll again use ALTER SYMMETRIC KEY, but this time with the DROP ENCRYPTION BY clause.

Click through for instructions, including scripts.  Ed also explains how to update the certificate used with Transparent Data Encryption.
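In the meantime, here is a minimal sketch of the rotation steps described above. The key and certificate names are hypothetical, and the symmetric key has to be open before its encryption can be changed:

-- Hypothetical names: MySymKey, OldCert, NewCert
OPEN SYMMETRIC KEY MySymKey DECRYPTION BY CERTIFICATE OldCert;

-- Add protection by the new certificate, then remove the old one
ALTER SYMMETRIC KEY MySymKey ADD ENCRYPTION BY CERTIFICATE NewCert;
ALTER SYMMETRIC KEY MySymKey DROP ENCRYPTION BY CERTIFICATE OldCert;

CLOSE SYMMETRIC KEY MySymKey;

-- Back up the new certificate, including its private key
BACKUP CERTIFICATE NewCert TO FILE = 'NewCert.cer'
    WITH PRIVATE KEY (FILE = 'NewCert.pvk',
                      ENCRYPTION BY PASSWORD = '<strong password here>');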


The Basics Of Always Encrypted

Josip Saban has an article on Always Encrypted in SQL Server 2016:

Always Encrypted is a client-side encryption technology that Microsoft introduced with SQL Server 2016. Always Encrypted keeps data automatically encrypted, not only when it is written, but also when it is read by an approved application. Unlike Transparent Data Encryption, which encrypts the data and log files on disk in real time but allows the data to be read by any application that queries the data, Always Encrypted requires your client application to use an Always Encrypted-enabled driver to communicate with the database. By using this driver, the application securely transfers encrypted data to the database that can then be decrypted later only by an application that has access to the encryption key. Any other application querying the data can also retrieve the encrypted values, but that application cannot use the data without the encryption key, thereby rendering the data useless. Because of this encryption architecture, the SQL Server instance never sees the unencrypted version of the data.

At this time, the only Always Encrypted-enabled drivers are the .NET Framework Data Provider for SQL Server, which requires installation of .NET Framework version 4.6 on the client computer, and the JDBC 6.0 driver. That will probably change in time, but these are the official Always Encrypted requirements as of April 2017.

This is a good intro to the topic if you aren’t familiar and are thinking of migrating to SQL Server 2016 or later.
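To make the idea concrete, here is a hypothetical sketch of what an Always Encrypted column definition looks like (table, column, and column encryption key names are made up); the approved client application would also set Column Encryption Setting=Enabled in its connection string so the driver encrypts and decrypts transparently:

-- Hypothetical table; MyCEK is an already-created column encryption key
CREATE TABLE dbo.Patients
(
    PatientID int IDENTITY(1,1) PRIMARY KEY,
    SSN char(11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = MyCEK,
                        ENCRYPTION_TYPE = DETERMINISTIC,
                        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'),
    LastName nvarchar(50) NOT NULL
);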


Schema Modification Locks With CC Compliance

Lori Brown explains why you might see schema modification locks after enabling Common Criteria compliance:

We have a client who has no idea how or when Common Criteria was enabled on their production system. All they know is that performance has been slowly degrading. After collecting performance data, we found that there were high LCK_M_SCH_M waits, which indicate a schema modification lock that prevents access to a table while a DDL operation occurs. We also found blocked process records showing a lot of waiting on a LOGIN_STATS table in the master database. This table is used to hold login statistics. When there are a lot of logins and logouts, there can be contention on this table.

When you enable Common Criteria compliance, something called Residual Information Protection (RIP) is enabled. RIP is an additional security measure for memory: it requires that a specific bit pattern be written before memory can be reallocated (overwritten) to a new resource or login. So with lots of logins and logouts, there is a performance hit because memory allocations have to be overwritten before they can be reused.

It’s worth reading the whole thing.
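If you suspect the same situation on one of your servers, a minimal check like the following will confirm whether the option is on and whether schema modification waits are accumulating:

-- Check whether Common Criteria compliance is enabled (run_value of 1 means on)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'common criteria compliance enabled';

-- Look for accumulated schema modification lock waits
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type = 'LCK_M_SCH_M';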


Stop Using Domain Admin!

Sean McCown has had it with people using Domain Admin accounts as service accounts:

If you paid close attention, you’ll notice the ‘DomainAdmin’ portion of that name.  Yep, you got it right… they were running SSRS under the domain admin account.  The Windows guy thought that it would be too much trouble to manage the permissions and get everything right on all the shares and DBs that it needed to access.

So this is when I pretty much lost it.  These guys were running SSRS under a domain admin account because they were too lazy to do the right thing.  It’s unthinkable.  There may be some reasonable excuses why you’re not able to change your current security model to something better.  You may even be able to convince me that you’re not just being lazy.  But to actively be lazy about your security isn’t something I’m going to take lying down.  Hey, I know it’s your shop, and I know you can ultimately do whatever you like, but I’m going to make sure you know what you’re doing.

Your SQL Server (and related) service accounts should not be Domain Admin.  Period.  This isn’t one of those “Well, it depends…” types of scenarios; there is no reason ever to use an account with Domain Admin rights as a SQL Server service account, and it is security malpractice to do so.
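If you want a quick inventory of which accounts your instance’s services actually run under, a simple check against sys.dm_server_services will show them:

-- List SQL Server-related services and the accounts they run under
SELECT servicename, service_account, status_desc
FROM sys.dm_server_services;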


Database Role Members

Nate Johnson has a quick demo showing how to add users to database roles:

The “old” method involves calling system stored-procedures, sp_addrolemember and sp_droprolemember, in which you pass the role-name and username.  The “new” method, supported starting with SQL 2012, is to use the command-phrases ALTER ROLE [role] ADD MEMBER [user], and ALTER ROLE [role] DROP MEMBER [user].

Nate shows both methods, so check it out.
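For illustration, here is a minimal sketch of both forms, using a hypothetical user and the built-in db_datareader role:

-- Older approach: system stored procedures
EXEC sp_addrolemember  @rolename = 'db_datareader', @membername = 'SomeUser';
EXEC sp_droprolemember @rolename = 'db_datareader', @membername = 'SomeUser';

-- SQL Server 2012 and later
ALTER ROLE db_datareader ADD MEMBER SomeUser;
ALTER ROLE db_datareader DROP MEMBER SomeUser;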


One Surefire Guarantee For Slow Performance

Patrick Keisler troubleshoots an issue where the buffer pool gets flushed each morning:

What is the McShield service? A quick Bing search revealed that it’s one of the services for McAfee VirusScan Enterprise. Could this be the cause? To get a quick look at all the history, I filtered the application log for event IDs: 17890 and 5000. Each time McAfee got an updated virus DAT file, SQL Server soon followed that by paging out the entire buffer pool. I checked the application log on several other SQL Servers for the same event IDs, and sure enough the same events occurred in tandem each morning. I also got confirmation from the security administration team that McAfee is scheduled to check for new DAT files each morning around 8AM. Eureka!

This seems like it could be the cause of our paging, but a little more research is needed. Searching the McAfee knowledge base led me to this article about the “Processes on enable” option.

Enabling this option causes memory pages of running processes to get paged to disk. And the example given is “Oracle, SQL, or other critical applications that need to be memory-resident continually, will have their process address space paged to disk when scan Processes On Enable kicks in”. OUCH! So when the McAfee service starts up or it gets a new DAT file, it will page out all processes.

Fortunately, this is a setting you can turn off, and Patrick shows how.
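If you want to check whether your own instance is being paged out the same way, a rough check against sys.dm_os_process_memory, paired with event ID 17890 in the application log, is a reasonable starting point:

-- A sudden drop in physical memory in use, with event 17890 logged around the same time,
-- suggests the working set has been paged out
SELECT physical_memory_in_use_kb,
       memory_utilization_percentage,
       page_fault_count
FROM sys.dm_os_process_memory;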


Dynamic Data Masking

Andrea Allred has been checking out Dynamic Data Masking in SQL Server 2016:

This is a great time to talk about the different masking functions and what they do.  The four types in 2016 are Default, Email, Random and Custom String.

Default – For numeric and binary it will show a “0”. For a date it will show 01/01/1900, and for strings it will show xxxx’s (more or less depending on the size of the field).

Email – It will expose the first letter of the email address and the suffix at the end of the email (.com, .net, .edu etc.) For example Batgirl@DC.com  would now be bxxx@xxxx.com.

Random – A number randomly generated within a set range. Kind of like the game, “Pick a number between 1 and 10,” but for SQL.

Custom String – Lets you get creative with how much you show or cover and what you use to cover (not stuck with just xxxx’s).

It’s not really a security feature, but it could be useful for protecting sensitive data from snoopers glancing over the shoulder.
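As a quick, hypothetical sketch of how these masks get applied (table, column, and role names are made up):

-- Default mask on a numeric column
ALTER TABLE dbo.Employees
    ALTER COLUMN Salary ADD MASKED WITH (FUNCTION = 'default()');

-- Email mask: Batgirl@DC.com becomes bXXX@XXXX.com
ALTER TABLE dbo.Employees
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- Random number between 1 and 10
ALTER TABLE dbo.Employees
    ALTER COLUMN BonusLevel ADD MASKED WITH (FUNCTION = 'random(1, 10)');

-- Custom string: keep the first character, cover the middle, show the last two
ALTER TABLE dbo.Employees
    ALTER COLUMN LastName ADD MASKED WITH (FUNCTION = 'partial(1, "XXXXX", 2)');

-- Principals granted UNMASK see the real values
GRANT UNMASK TO ReportingRole;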


Windows Versus SQL Authentication

Kenneth Fisher looks into why we generally consider Windows authentication more sound than SQL authentication:

A quick search on the internet took me here: Choosing an Authentication Mode. If you go down to the section Connecting Through Windows Authentication, it points out a few important things, and even farther down, the section Disadvantages of SQL Server Authentication has a bit more. Then I found a couple of good forum questions here and here. In summary (and only discussing actual security features):

Click through for the answers.  Also read Cristian Satnic’s comments below, as Cristian is correct about wanting to keep passwords hashed instead of encrypted.  Incidentally, Windows passwords aren’t encrypted, either—they’re hashed.
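One small, related check: if you want to know whether an instance allows SQL authentication at all, SERVERPROPERTY will tell you:

-- 1 = Windows authentication only, 0 = mixed mode (Windows and SQL authentication)
SELECT SERVERPROPERTY('IsIntegratedSecurityOnly') AS WindowsAuthOnly;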


Securing Azure Storage

Christos Matskas has an article on securing Azure blobs and containers:

All communication with Azure Storage via connection strings and blob URLs enforces the use of HTTPS, which provides Encryption in Transit. You can enforce the use of “Always HTTPS” by setting the connection string like this: “DefaultEndpointsProtocol=https;AccountName=myblob1…” or in SAS signatures, as in the example below:

https://myblob1.blob.core.windows.net/?sv=2015-04-05&ss=bfqt&srt=sco&sp=rwdlacup&se=2016-09-22T02:21:41Z&st=2016-09-21T18:21:41Z&spr=https&sig=hxInpKBYAxvwdI9kbBglbrgcl1EJjHqDRTF2lVGeSUU%3D

To protect data at rest, the service provides an option to encrypt the data as it is stored in the account. There’s no additional cost associated with encrypting the data at rest, and it’s a good idea to switch it on as soon as the account is created. There is a one-click setting at the storage account level to enable it, and the encryption is applied to both new and existing storage accounts. The data is encrypted with the AES-256 cipher, and it’s now generally available in all Azure regions and Azure clouds (public, government, etc.).

There’s some good information here, making it worth the read.


Blob Auditing For Azure SQL Database

Patrick Keisler shows how to use Blob Auditing with Azure SQL Database to log database activity:

If you have multiple objects or actions to audit, then just separate them with a comma, just as with the AuditActionGroups parameter. The one key piece to remember is that you must specify all audit actions and action groups together with each execution of Set-AzureRmSqlDatabaseAuditingPolicy. There is no add or remove audit item. This means if you have 24 actions to audit and you need to add one more, then you have to specify all 25 in the same command.

Now let’s run a few queries to test the audit. First, we’ll run a simple select from the Salaries table.

Patrick shows off the UI (which is nice for one-off checking) and also the function sys.fn_get_audit_file(), which you’d probably want to use for automated audit checks.
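For the automated route, here is a minimal sketch of reading blob audit records with sys.fn_get_audit_file; the storage account and container path are hypothetical:

-- Read audit records written to the hypothetical sqldbauditlogs container
SELECT event_time,
       server_principal_name,
       database_name,
       action_id,
       succeeded,
       statement
FROM sys.fn_get_audit_file(
         'https://mystorageaccount.blob.core.windows.net/sqldbauditlogs/myserver/mydb/',
         DEFAULT, DEFAULT)
ORDER BY event_time DESC;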
