
Category: Security

Failure To Connect With A SQL Login

Bert Wagner hits on the most common reason why you might fail to connect with a SQL authentication login:

I thought it would be best to start with a clean slate so I created a new SQL login and database user so that I could definitively figure out which permissions are needed.

Normally I use Windows Authentication for my logins, but this time I thought “since I’m getting crazy learning new things, let me try creating a SQL Login instead.”

After I created my login, I decided to test connecting to my server before digging into the permissions. Result?

After the fifth or sixth time it happens to you, you start making that the first thing you check.
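
If, as is often the case, the cause turns out to be an instance left in Windows Authentication-only mode, a one-line check like the following will confirm it before you spend time hunting through permissions. This is a generic sketch rather than anything from Bert's post:

    -- Returns 1 if the instance accepts only Windows Authentication,
    -- in which case no SQL login can connect until mixed mode is enabled
    -- (Server Properties > Security in SSMS, followed by a service restart).
    SELECT SERVERPROPERTY('IsIntegratedSecurityOnly') AS WindowsAuthOnly;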

Permissions And Dynamic SQL

Eric Blinn shows that dynamic SQL in stored procedures changes the security paradigm a bit:

Security was controlled by granting EXECUTE permissions only to appropriate stored procedures and by explicitly not granting permission to any tables or views within the database.

One of the procedures was getting a bad query plan and timing out. This is when I was called in. The procedure was performing a search based on an unknown number of up to 10 search parameters. The code was filled with many AND/OR combinations to account for the users’ ability to include any combination of search parameters. I found this procedure to be a prime candidate for dynamic SQL, where I would build the SELECT statement so that the WHERE clause included only the search parameters the user actually entered.

Immediately upon testing, the users started to receive SELECT permission denied errors. It turns out that when you change to dynamic SQL and call your statement through sp_executesql, the permissions are lost. Our options were to grant explicit SELECT permission on the objects or to refactor the code so that it no longer used dynamic SQL.

The best solution here would probably be to use a certificate to sign the procedure and give that certificate user rights to select from the tables used in dynamic SQL.
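
For reference, a minimal sketch of that certificate-signing approach might look like the code below. The procedure, table, and certificate names are hypothetical, and the search logic is trimmed down to a single optional parameter:

    -- Hypothetical search procedure that builds its statement dynamically and
    -- runs it through sp_executesql; ownership chaining does not carry over to
    -- the dynamic statement, so callers would otherwise need SELECT rights.
    CREATE PROCEDURE dbo.SearchOrders
        @CustomerName nvarchar(100) = NULL
    AS
    BEGIN
        DECLARE @sql nvarchar(max) = N'SELECT OrderID, OrderDate FROM dbo.Orders WHERE 1 = 1';

        IF @CustomerName IS NOT NULL
            SET @sql += N' AND CustomerName = @CustomerName';

        EXEC sys.sp_executesql @sql, N'@CustomerName nvarchar(100)', @CustomerName;
    END;
    GO

    -- Sign the procedure with a certificate and give the certificate's user the
    -- SELECT rights, so callers only ever need EXECUTE on the procedure itself.
    CREATE CERTIFICATE SearchSigningCert
        ENCRYPTION BY PASSWORD = 'UseAStrongPasswordHere!1'
        WITH SUBJECT = 'Signs dbo.SearchOrders';

    CREATE USER SearchSigningCertUser FROM CERTIFICATE SearchSigningCert;
    GRANT SELECT ON dbo.Orders TO SearchSigningCertUser;

    ADD SIGNATURE TO dbo.SearchOrders
        BY CERTIFICATE SearchSigningCert
        WITH PASSWORD = 'UseAStrongPasswordHere!1';

One caveat worth remembering: altering the procedure later silently drops the signature, so re-signing needs to be part of the deployment process.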

Fun With Meltdown And Spectre

Brent Ozar looks at some of the consequences of Meltdown and Spectre for SQL Server DBAs:

That’s because some test results have found big slowdowns when the operating system is patched for Meltdown and/or Spectre. These are big vulnerabilities in the processors themselves, and OS vendors are having to make big changes that aren’t tuned for performance yet. Early benchmarks yesterday were showing 30% drops in PostgreSQL performance, but thankfully newer benchmarks have been showing smaller drops. Red Hat’s benchmarks show 3-7% slower analytics workloads, and 8-12% slower OLTP.

Joey D’Antoni has more:

Will This Impact My Performance?

Probably, especially if you are running on virtual hardware. For workloads on bare metal, the security risk is much lower, so Microsoft is offering a registry option to not include the microcode fixes. Longer term, especially if you are audited or allow application code to run on your database servers, you will need to enable the microcode options.

This will likely get better over time as software patches that are better optimized to make fewer calls are released. Ultimately, this will need to be fixed on the hardware side, and we will need a new generation of hardware to completely solve the security issue with minimal impact.

Allan Hirt has even more:

There are two bugs which are known as Meltdown and Spectre. The Register has a great summarized writeup here – no need for me to regurgitate. This is a hardware issue – nothing short of new chips will eradicate it. That said, pretty much everyone who has written an OS, hypervisor, or software has (or will have) patches to hopefully eliminate this flaw. This blog post covers physical, virtualized, and cloud-based deployments of Windows, Linux, and SQL Server.

The fact every vendor is dealing with this swiftly is a good thing. The problem? Performance will most likely be impacted. No one knows the extent, especially with SQL Server workloads. You’re going to have to test and reset any expectations/performance SLAs. You’ll need new baselines and benchmarks. There is some irony here that it seems virtualized workloads will most likely take the biggest hit versus ones on physical deployments. Time will tell – no one knows yet.

This will have long-term ramifications.  We’ll deal with them like we’ve dealt with other issues in the past, but it does seem that, at least for now, there will be some performance hit from this.

Avoid Impersonation And The Trustworthy Flag

Solomon Rutzky explains how you can use module signing to avoid the security risks which come with impersonation and setting Trustworthy on:

Admittedly, using Cross-Database Ownership Chaining and/or Impersonation and/or TRUSTWORTHY are quicker and easier to implement than Module Signing. However, the relative simplicity in understanding and implementing these options comes at a cost: the security of your system.

  • Cross-DB Ownership Chaining:
    • security risk (can spoof User / DB-level)
    • db_ddladmin & db_owner users can create objects for other owners
    • Users with CREATE DATABASE permission can create new databases and attach existing databases
  • Impersonation:
    • If IMPERSONATE permission is required:
      • can be used any time
      • No granular control over permissions
    • Cross-DB operations need TRUSTWORTHY ON
    • Need to use ORIGINAL_LOGIN() for Auditing
    • Elevated permissions last until process / sub-process ends or REVERT
  • TRUSTWORTHY:
    • Bigger security risk
      • can also spoof Logins, such as “sa” !
      • If using SQLCLR Assemblies, no per-Assembly control over the ability to be marked as either EXTERNAL_ACCESS or UNSAFE: all Assemblies are eligible to be marked as either of those elevated permission sets.

The common theme across all three areas is no control, within a Database, over who or what can make use of the feature / option, or when it can be used.

Read the whole thing.
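
If you want a rough starting point for auditing your own instance for two of those items, queries along these lines (my sketch, not Solomon's) list databases with TRUSTWORTHY enabled and explicit IMPERSONATE grants at the server level:

    -- Databases with TRUSTWORTHY turned on (msdb is ON by design).
    SELECT name, is_trustworthy_on
    FROM sys.databases
    WHERE is_trustworthy_on = 1;

    -- Explicit IMPERSONATE grants on server logins.
    SELECT grantee.name AS grantee_login,
           target.name  AS impersonated_login,
           pe.state_desc
    FROM sys.server_permissions AS pe
        JOIN sys.server_principals AS grantee
            ON pe.grantee_principal_id = grantee.principal_id
        JOIN sys.server_principals AS target
            ON pe.major_id = target.principal_id
    WHERE pe.permission_name = 'IMPERSONATE';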

Considerations When Using HTTPS For TFS

Hamish Watson walks us through what to do when we want to start using a certificate to encrypt Team Foundation Server traffic:

I will assume that you already have TFS set up, are just using HTTP, and want to make things a bit more secure with HTTPS. I am also assuming that you will be using port 443 for HTTPS traffic.

To update TFS to use HTTPS you need to do a couple of things:

  1. Have a legitimate certificate installed on the server that you can bind to

  2. Have an IP address on the server and have firewall access setup to that IP address on port 443

But there are a few more steps as well, so click through to see them all.

Decrypting Always Encrypted Columns In SSMS

Monica Rathbun shows how to view Always Encrypted data within Management Studio:

Viewing decrypted data within SQL Server Management Studio (SSMS) is very easy. SSMS uses .NET 4.6 and the modern SQL Server client, so you can pass in the necessary encryption options. SSMS uses the connection string to access the Master Key and return the data in its decrypted format.

First, create a new SQL connection and click Options to expand the window.

Then go to the Additional Connection Parameters tab of the login window and simply type column encryption setting = enabled. Then choose Connect.

Click through to see the whole demo.
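
If you also want to confirm from the server side which columns are actually protected before connecting with that setting, a catalog query roughly like this one will list them on SQL Server 2016 and later; treat it as a generic sketch rather than part of Monica's demo:

    -- Columns protected by Always Encrypted, with their encryption type
    -- (DETERMINISTIC or RANDOMIZED) and the column encryption key in use.
    SELECT OBJECT_SCHEMA_NAME(c.object_id) AS schema_name,
           OBJECT_NAME(c.object_id)        AS table_name,
           c.name                          AS column_name,
           c.encryption_type_desc,
           cek.name                        AS column_encryption_key
    FROM sys.columns AS c
        JOIN sys.column_encryption_keys AS cek
            ON c.column_encryption_key_id = cek.column_encryption_key_id
    WHERE c.column_encryption_key_id IS NOT NULL;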

Protecting Sensitive Data In Docker

Jatin Demla shows how to create Docker secrets:

Managing passwords, access tokens, and private keys in an application is tedious, and any small mistake can accidentally expose all of that secret information. Even storing such things in Docker images leaves them easily accessible: one need only run the image as an interactive container, and all of your application code is available inside it. Docker provides secrets to protect all secret data.

This blog explains how secret data is stored at a low level as well as how to access Docker secrets securely. So, let's get started.

Read the whole thing, especially if you’ve gone container-happy.

API Security: Contrasting Token-Based With Key-Based Security

Vincent-Philippe Lauzon contrasts token-based security (like OAuth) with key-based security for locking down APIs:

Instead of giving a nice & neatly formatted pros & cons table where every pro has a corresponding con, let’s just discuss the major aspects: security & complexity.

Basically, in general, OAuth is more secure but more complex for both clients (i.e. consumer) and services.

Why is OAuth more secure? Relying parties never see credentials & secrets in an OAuth authentication scheme. They see a token. Tokens are revoked after a while; often minutes, at most a few hours.

Read on for more.  My preference is OAuth, but it’s not always trivial to set up.

SQL Vulnerability Assessment

Ronit Reger shows off the new SQL Vulnerability Assessment, available in SQL Server 2012 and later:

Not only does VA expose some of the possible security flaws you have in your database system, it also provides remediation scripts to resolve issues within a couple of mouse clicks. In addition, you can accept specific results as your approved baseline state, and the VA scan report will be customized accordingly to expect these values.

Ronit and Alan Yu also mention it being available via the latest version of Management Studio, 17.4:

The VA service runs a scan directly on your SQL database or server. VA employs a knowledge base of rules that flag security vulnerabilities and deviations from best practices, such as misconfigurations, excessive permissions, and exposed sensitive data. The rule base grows and evolves over time, to reflect the latest security best practices recommended by Microsoft.

Results of the assessment include actionable steps to resolve each issue and provide customized remediation scripts where applicable. An assessment report can be customized for each customer environment and tailored to specific requirements. This process is managed by defining a security Baseline for the assessment results, such that only deviations from the custom Baseline are reported.

VA is supported for SQL Server 2012 and later, and can also be run on Azure SQL Database.

This looks like a good reason to upgrade SSMS.

Data Breaches And Knowledge-Based Authentication

Jeff Mlakar summarizes Troy Hunt’s recent congressional testimony:

Lastly, there is a lack of accountability for the breaches. If you collect data about others you are responsible for it. Yet all too often organizations discover years later they suffered a massive data breach and then proclaim to the press that they were hacked by evil doers and caught unprepared.

Then they progress through the stages of data breach grief:

  1. OMG I just read the news and found out we’ve been hacked

  2. Turns out it was 4 years ago

  3. Blame evil hackers while proclaiming innocence as a naive victim

  4. The media turns up the heat – time to blame some systems administrator

  5. Offer your customers credit monitoring

  6. Acceptance

  7. Wait until the next hack then GOTO step #1

It will be interesting to see what (if anything) comes out of this.
