Press "Enter" to skip to content

Category: Security

Stopping SQL Injection

Wayne Sheffield has a post explaining what SQL injection is and discussing how to stop it:

Me: Umm, boss… Does this report allow users to enter in search criteria?

Boss: But of course!

Me: Well, I really hate to tell you this, but we have a SQL Injection problem.

And after a bit of back and forth where the developers were insisting that no way was there a SQL Injection problem, I sat down with the dev team lead and the boss and proved it to them. We created a dummy table in the database, went to the report criteria form, and I dropped the table.

Wayne: +1000

Development Team: -1000

Injection attacks are still the most common form of attack out there.  Sadly.
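
The fix is the boring, reliable one: stop concatenating user input into dynamic SQL and pass it as a parameter instead. Here is a minimal T-SQL sketch of both patterns (dbo.Products and dbo.DummyTable are hypothetical tables, not anything from Wayne’s report):

-- Vulnerable pattern: user input concatenated straight into dynamic SQL
DECLARE @SearchTerm nvarchar(100) = N'widget''; DROP TABLE dbo.DummyTable; --';
DECLARE @sql nvarchar(max) =
    N'SELECT * FROM dbo.Products WHERE Name LIKE ''%' + @SearchTerm + N'%''';
-- EXEC (@sql);  -- executing this would run the injected DROP TABLE

-- Safer pattern: the input goes to sp_executesql as a parameter,
-- so the injected text is treated as data, not as code
DECLARE @safeSql nvarchar(max) =
    N'SELECT * FROM dbo.Products WHERE Name LIKE ''%'' + @term + ''%''';
EXEC sys.sp_executesql @safeSql, N'@term nvarchar(100)', @term = @SearchTerm;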

Comments closed

Securing S3 Credentials In Spark Jobs

Jason Pohl shows how to protect credentials for connecting to Amazon Web Services S3 buckets when building Spark jobs:

Since Apache Spark separates compute from storage, every Spark Job requires a set of credentials to connect to disparate data sources. Storing those credentials in the clear can be a security risk if not stringently administered. To mitigate that risk, Databricks makes it easy and secure to connect to S3 with either Access Keys via DBFS or by using IAM Roles. For all other data sources (Kafka, Cassandra, RDBMS, etc.), the sensitive credentials must be managed by some other means.

This blog post will describe how to leverage an IAM Role to map to any set of credentials. It will leverage AWS’s Key Management Service (KMS) to encrypt and decrypt the credentials so that your credentials are never in the clear at rest or in flight. When a Databricks Cluster is created using the IAM Role, it will have privileges to both read the encrypted credentials from an S3 bucket and decrypt the ciphertext with a KMS key.

That’s only one data source, but an important one.
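
The post covers the Databricks-specific wiring; the underlying pattern, sketched roughly in Python with boto3 (the bucket, object key, and credential names here are placeholders, not the post’s code), is to read the ciphertext from S3 and ask KMS to decrypt it, so the plaintext only ever lives in memory:

import boto3

# The cluster's IAM role supplies permissions for both calls,
# so no access keys appear anywhere in the job code.
s3 = boto3.client("s3")
ciphertext = s3.get_object(
    Bucket="my-secrets-bucket",            # placeholder bucket
    Key="credentials/kafka-password.enc",  # placeholder object key
)["Body"].read()

kms = boto3.client("kms")
password = kms.decrypt(CiphertextBlob=ciphertext)["Plaintext"].decode("utf-8")

# Use the decrypted value directly as a connection option (JDBC, Kafka, etc.)
# rather than writing it to disk or into cluster configuration.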

Comments closed

Using dbatools To Back Up SQL Logins

Claudio Silva has a post showing how to use the Export-SqlLogin cmdlet to back up SQL Server logins on all databases on a set of instances:

With a database restore, the users are within a database and if their SID matches the SQL Login you are ready to go. But with the logins it is a different story!
If you have to reinstall the engine because your master database backup is corrupt, or someone has changed the login password and you want to put it back, or even – maybe the most common scenario – you want to keep track of the login permissions, you need to have them saved somewhere.

Imagine that you have to re-create a login and give it all the permissions it had; imagine that this login has multiple roles on different databases. Did you know that beforehand? Do you keep track of it? How much time would it take to gather all that information from the application owner? How much time will you have to spend resolving all the permission issues until everything is running smoothly? How many tickets will need to be raised? How many users will this be affecting?

Read on for Claudio’s easy solution.
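
If you want the short version of the approach, something like the following sketch does the job (the instance names and output path are hypothetical; the cmdlet has since been renamed Export-DbaLogin, and parameter names vary a bit between dbatools releases):

Import-Module dbatools

# Script out logins - SIDs, password hashes, and server roles included -
# for a set of instances, so they can be replayed after a rebuild.
$instances = 'SQL01', 'SQL02\PROD'
foreach ($instance in $instances) {
    $file = "C:\Backup\Logins_$($instance -replace '\\', '_').sql"
    Export-DbaLogin -SqlInstance $instance -FilePath $file
}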

Comments closed

TempDB Encryption With TDE

Arun Sirpal points out an oddity in sys.databases:

If you query sys.databases, such as:

SELECT is_encrypted, name, user_access_desc FROM sys.databases WHERE database_id = 2 OR database_id = 7

It “might” throw you off. Would you not expect to see is_encrypted set to 1 for TempDB?

I thought I remembered earlier editions of SQL Server showing is_encrypted = 1 for tempdb, and I definitely remember 2016 showing 0 even when the database is encrypted.
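
If you want a check that does not depend on that flag, sys.dm_database_encryption_keys reports the actual encryption state per database; a quick sketch:

-- encryption_state = 3 means encrypted; tempdb shows up here automatically
-- once any user database on the instance has TDE enabled.
SELECT DB_NAME(database_id) AS database_name,
       encryption_state,
       key_algorithm,
       key_length
FROM sys.dm_database_encryption_keys;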

Comments closed

Linked Servers And The Kerberos Double-Hop Problem

Jana Sattainathan shows how to set up Kerberos pass-through when dealing with linked servers:

Let us say you have SQLServer1 and you want to set up a linked server to SQLServer2 using “pass-through authentication”; a double-hop happens, as explained in the article below. Basically, the first hop is when the user authenticates to SQLServer1, and the second hop is when that authentication gets passed on from SQLServer1 to SQLServer2.

The below article is a must-read before you proceed:

The three nodes involved in the double-hop, as illustrated in the example, are:

  1. Client – The client PC from which the user is initiating connection to SQLServer1

  2. Middle server – SQLServer1

  3. Second server – SQLServer2

Dealing with the double-hop problem is far trickier than it should be; if you’ve had to deal with this, I recommend Jana’s guide.
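
One quick sanity check while working through the steps: confirm that the first hop is actually negotiating Kerberos rather than NTLM, because NTLM credentials cannot be delegated to the second server.

-- Run this on SQLServer1 from the client machine; if it returns NTLM,
-- delegation to SQLServer2 will fail no matter how the linked server is set up.
SELECT auth_scheme
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;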

Comments closed

Security And Zookeeper

Michael Han describes a few methods you can use to tighten up (or rather, introduce) security in ZooKeeper:

Four Letter Words (acronym as 4lw) is a very popular feature of the Apache ZooKeeper project. In a nutshell, 4lw is a set of commands that you can use to interact with a ZooKeeper ensemble through a shell interface. Because it’s simple and easy to use, lots of ZooKeeper monitoring solutions are built on top of 4lw.

The simplicity of 4lw comes at a cost: the design did not originally consider security, and there is no built-in support for authentication or access control. Any user that has access to the ZooKeeper client port can send commands to the ensemble. The 4lw commands are read-only commands: no actions can be performed. However, they can be compute-intensive, and sending too many of them would effectively create a DoS attack that prevents the ensemble’s normal operation.

Read on for details.
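
For the 4lw issue specifically, newer ZooKeeper releases (3.4.10 and later) let you whitelist only the commands your monitoring actually needs; a minimal zoo.cfg sketch, with an illustrative command list:

# zoo.cfg - answer only these four-letter-word commands and refuse the rest
4lw.commands.whitelist=ruok, stat, conf, isro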

Comments closed

Synchronizing Logins And Jobs

Ryan Adams enumerates several methods for synchronizing logins and SQL Agent jobs across mirrored instances or Availability Group replicas:

There is an awesome set of PowerShell cmdlets out there written by MVP Chrissy LeMaire.  This method is my personal choice.  It works great and is easy to automate.  You can run it with SQLAgent or you can just use Scheduled Tasks in the OS.  The scheduled tasks method is a little cleaner, but you don’t get to see it in SQL Server.  Also if you are on a cluster and running Windows 2012 you can cluster the task scheduler as an added benefit.

Chrissy wrote this with the intent of making migrations easier, and she succeeded.  In fact, I made it a point to thank her at MVP Summit last year because it made my life insanely easier.  The advantage here is that you can automate a lot more than just logins.  In fact you can migrate and automate pretty much anything at the server level.  Here is the link that I guarantee you are going to bookmark followed by a video demo where I show how to install and automate the syncing of logins using both the SQLAgent method and the Scheduled Tasks method.

https://dbatools.io/

DBATools would be my preference in this situation as well, but click through to see four other methods, as well as code.
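
For reference, the dbatools version of this is short enough to drop into an agent job or scheduled task. A sketch with hypothetical replica names (Copy-DbaLogin was Copy-SqlLogin in older releases, and parameter names may differ a bit between versions):

Import-Module dbatools

# Copy logins (with SIDs and password hashes) and SQL Agent jobs from the
# primary replica to a secondary; run this on a schedule and after failovers.
Copy-DbaLogin    -Source 'SQLAG01' -Destination 'SQLAG02' -ExcludeSystemLogins
Copy-DbaAgentJob -Source 'SQLAG01' -Destination 'SQLAG02'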

Comments closed

Discovering Orphaned Users

Adrian Buckman troubleshoots an access scenario:

First let’s check that the user does actually exist; we know the server login exists, otherwise the user would be complaining that they cannot connect to the SQL Server instance.

Sure enough – there is the user ‘SQLUndercoverUser’. Let’s check out the permissions:

No problems there – the user has [db_datareader], [db_datawriter] and [db_owner], so we know there is not a permissions issue. Let’s test this login by connecting to SQL Server with the user’s credentials:

Connected to the server with no issues; let’s open a new query against SQLUnderCoverDB:

Hmm – so despite having permissions to access the database, we are receiving this error. We know that the password is correct too, otherwise we wouldn’t be able to access the server at all…

Adrian does a nice job of walking through the troubleshooting process, going from the simple possibilities (does the user actually exist? does the user have permissions?) to the real cause, which was an orphaned SQL-authenticated user.  Read the whole thing.
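
For the record, both the detection and the fix are short. A T-SQL sketch, using the SQLUndercoverUser login from Adrian’s example:

-- Find SQL-authenticated database users whose SID no longer matches a login
-- (users created WITHOUT LOGIN will also show up here)
SELECT dp.name AS orphaned_user
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp
    ON dp.sid = sp.sid
WHERE dp.type = 'S'
  AND dp.principal_id > 4   -- skip dbo, guest, sys, INFORMATION_SCHEMA
  AND sp.sid IS NULL;

-- Re-map the orphaned user to the server login of the same name
ALTER USER SQLUndercoverUser WITH LOGIN = SQLUndercoverUser;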

Comments closed

Kerberos Constrained Delegation On Power BI Report Server

Regis Baccaro shows how to set up constrained delegation when connecting Power BI Report Server to a SQL Server instance or Analysis Services cube:

In many demo cases, you will have an all-in-one server where you have installed Power BI Report Server, SSAS (tabular or multidimensional) and SQL Server. In those cases you don’t need any form of credential delegation, since the Report Server is on the same box as the data source.

But there are scenarios where you have a distributed environment, like the one I have on my VMs demo domain, and for jumping between servers and passing credentials along, you need to set up Kerberos Constrained Delegation. Furthermore, you will need protocol transition for it to work in Power BI Report Server.

Read on for step-by-step instructions showing how to do this.
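
The core of the setup is registering the right SPNs and then allowing the Report Server’s service account to delegate to them; a rough sketch, where the host names and accounts are placeholders for your own environment:

rem Register an HTTP SPN for the Power BI Report Server service account
setspn -S HTTP/pbirs.contoso.com CONTOSO\svc-pbirs

rem Register the data source's SPN for the SQL Server service account
setspn -S MSSQLSvc/sqlprod.contoso.com:1433 CONTOSO\svc-sqlengine

rem In Active Directory, set svc-pbirs to "Trust this user for delegation to
rem specified services only" with "Use any authentication protocol" (protocol
rem transition), and add the MSSQLSvc SPN to its list of allowed services.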

Comments closed