Category: Security

Managing Azure SQL Database Firewall Rules

Cedric Charlier shows how to manage Azure SQL Database firewall rules from within Management Studio:

When you create a new Azure database, you usually need to open the firewall to remotely administer or query this database with SSMS. One option is to create rules from the Azure Portal. It’s surely a convenient way to do it when you create a database, but I prefer to keep my toolset to a minimum, and when the Azure portal is not already open, I’d rather not open it just to define a few firewall rules.

Opening the firewall with SSMS is a bit of a chicken-and-egg problem: to connect to your database/server, you need to open the firewall. Fortunately, SSMS has a great set of screens that call the underlying Azure Portal API and open the firewall for the computer running SSMS.

Cedric shows off sp_delete_firewall_rule, but there’s also a corresponding sp_set_firewall_rule.
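
Once you can actually connect, adding or removing a rule is a quick call to those procedures in the master database. Here’s a minimal sketch using Invoke-Sqlcmd from the SqlServer PowerShell module; the server name, credentials, rule name, and IP addresses are all placeholders:

    # Connect to the logical server's master database; sp_set_firewall_rule
    # and sp_delete_firewall_rule are server-level procedures that live there.
    $server = 'myserver.database.windows.net'   # placeholder logical server name

    # Create (or update) a server-level firewall rule for a single IP address.
    Invoke-Sqlcmd -ServerInstance $server -Database 'master' `
        -Username 'serveradmin' -Password 'StrongPasswordHere' `
        -Query "EXEC sp_set_firewall_rule
                    @name = N'HomeOffice',
                    @start_ip_address = '203.0.113.10',
                    @end_ip_address   = '203.0.113.10';"

    # Remove the rule once it is no longer needed.
    Invoke-Sqlcmd -ServerInstance $server -Database 'master' `
        -Username 'serveradmin' -Password 'StrongPasswordHere' `
        -Query "EXEC sp_delete_firewall_rule @name = N'HomeOffice';"

sp_set_firewall_rule creates the rule if the name doesn’t already exist and updates it if it does, so the call is safe to re-run.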

Comments closed

Always Encrypted Data And SSMS

Justin Goodwin shows how to view Always Encrypted data from within Management Studio:

I am now able to view the encrypted data. Why is this?

Because I am connecting to the database from the database server, I have access to the Encryption Certificate that was generated. I can verify this by opening certmgr.msc and browsing to Personal -> Certificates.

Justin then goes on to show what happens when you don’t have access to the certificate and how to provide access to another machine.
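
When the client is a different machine, the usual approach is to export that certificate (with its private key) from the server and import it into the user’s Personal store on the client. Justin walks through the specifics; as a rough sketch of the idea in PowerShell, with the thumbprint and file paths as placeholders:

    # On the database server: export the Always Encrypted certificate,
    # including the private key, protected by a password.
    $password = Read-Host -AsSecureString -Prompt 'PFX password'
    Export-PfxCertificate -Cert 'Cert:\CurrentUser\My\<certificate thumbprint>' `
        -FilePath 'C:\Temp\AlwaysEncryptedCMK.pfx' -Password $password

    # On the client machine: import it into the current user's Personal store,
    # which is where the driver looks for the column master key.
    Import-PfxCertificate -FilePath 'C:\Temp\AlwaysEncryptedCMK.pfx' `
        -CertStoreLocation 'Cert:\CurrentUser\My' -Password $password

From there, connecting with Column Encryption Setting=Enabled lets SSMS decrypt the columns on the client as well.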

Comments closed

Azure Disk Encryption

Melissa Coates configures Azure Disk Encryption for an already-existing Azure VM:

As I discussed in my previous blog post, I opted to use Azure Disk Encryption for my virtual machines in Azure, rather than Storage Service Encryption. Azure Disk Encryption utilizes BitLocker inside the VM. Enabling Azure Disk Encryption involves these Azure services:

  • Azure Active Directory for a service principal
  • Azure Key Vault for a KEK (key encryption key), which wraps around the BEK (BitLocker encryption key)
  • Azure Virtual Machine (IaaS)

Following are four scripts which configure encryption for an existing VM. I initially had it all as one single script, but I purposely separated them. Now that they are modular, if you already have a Service Principal and/or a Key Vault, you can skip those steps. I have my ‘real’ version of these scripts stored in an ARM Visual Studio project (same logic, just with actual names for the Azure services). These PowerShell templates go along with other ARM templates to serve as source control for our Azure infrastructure.

The PowerShell scripts are straightforward and clear, so check them out.
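
To give a rough idea of the final step, here’s a minimal sketch of enabling encryption on an existing VM with the AzureRM cmdlets of that era. It assumes the service principal and Key Vault already exist, and every name, ID, and secret below is a placeholder rather than anything from Melissa’s scripts:

    # Placeholders: resource group, VM, and AAD service principal details.
    $resourceGroup = 'my-rg'
    $vmName        = 'my-vm'
    $aadClientId   = '00000000-0000-0000-0000-000000000000'
    $aadSecret     = 'service-principal-secret'

    # Look up the Key Vault that will hold the BitLocker encryption key (BEK)
    # and the key encryption key (KEK) that wraps it.
    $keyVault = Get-AzureRmKeyVault -VaultName 'my-keyvault' -ResourceGroupName $resourceGroup
    $kek      = Get-AzureKeyVaultKey -VaultName 'my-keyvault' -Name 'my-kek'

    # Enable Azure Disk Encryption on the existing VM (this restarts the VM).
    Set-AzureRmVMDiskEncryptionExtension -ResourceGroupName $resourceGroup -VMName $vmName `
        -AadClientID $aadClientId -AadClientSecret $aadSecret `
        -DiskEncryptionKeyVaultUrl $keyVault.VaultUri -DiskEncryptionKeyVaultId $keyVault.ResourceId `
        -KeyEncryptionKeyUrl $kek.Key.Kid -KeyEncryptionKeyVaultId $keyVault.ResourceId `
        -VolumeType All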

Comments closed

Azure VM Encryption

Melissa Coates looks at different encryption methods available for Azure Virtual Machines:

Initially I opted for Storage Service Encryption due to its sheer simplicity. This is done by enabling encryption when you initially provision the storage account. After setting it up, I proceeded to other configuration items, one of which was setting up backups via the Azure Recovery Vault. Turns out that encrypted backups in the Recovery Vault are not (yet?) supported for VMs encrypted with Storage Service Encryption (as of Feb 2017).

Next I decided to investigate Disk Encryption because it supports encrypted backups in the Recovery Vault. It’s more complex to set up because you need a Service Principal in AAD, as well as Azure Key Vault integration. (More details on that in my next post.)

Click through for a point-by-point comparison between the two methods.
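
For a sense of the “sheer simplicity” side of that comparison, Storage Service Encryption was essentially one switch at provisioning time. Something along these lines with the AzureRM cmdlets of the day, where the resource group, account name, and location are placeholders:

    # Provision a storage account with Storage Service Encryption enabled for blobs.
    New-AzureRmStorageAccount -ResourceGroupName 'my-rg' -Name 'mystorageacct01' `
        -Location 'East US' -SkuName Standard_LRS -EnableEncryptionService Blob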

Comments closed

Replication Error When Listing Directory Contents

Andrew Peterson troubleshoots a replication issue:

You’re trying to set up SQL Server Replication on a server, and it fails. Looking through the error message, you find this:

        An exception occurred while executing a Transact-SQL statement or batch.
        (Microsoft.SqlServer.ConnectionInfo)

        Destination path ………….is not valid. Unable to list directory contents.
        Specify a valid destination path.
        Changed database context to ‘master’. (Microsoft SQL Server, Error: 14430)

Read on for the solution.

Comments closed

Encryption In ElasticMapReduce

Sai Sriparasa shows how to enable encryption in an ElasticMapReduce cluster:

In this post, I go through the process of setting up the encryption of data at multiple levels using security configurations with EMR. Before I dive deep into encryption, here are the different phases where data needs to be encrypted.

Data at rest

  • Data residing on Amazon S3—S3 client-side encryption with EMR
  • Data residing on disk—the Amazon EC2 instance store volumes (except boot volumes) and the attached Amazon EBS volumes of cluster instances are encrypted using Linux Unified Key Setup (LUKS)

Data in transit

  • Data in transit from EMR to S3, or vice versa—S3 client-side encryption with EMR

  • Data in transit between nodes in a cluster—in-transit encryption via Secure Sockets Layer (SSL) for MapReduce and Simple Authentication and Security Layer (SASL) for Spark shuffle encryption

  • Data being spilled to disk or cached during a shuffle phase—Spark shuffle encryption or LUKS encryption

Turns out this is rather straightforward.
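
All of those layers are driven by a single EMR security configuration that gets attached when the cluster is launched. As a hedged sketch of roughly what that looks like, built from PowerShell and registered through the AWS CLI (the KMS key ARN, bucket, and certificate archive are placeholders):

    # EMR security configuration covering at-rest encryption (S3 client-side
    # encryption with KMS, LUKS for local disks) and in-transit TLS encryption.
    $encryption = @{
        EncryptionConfiguration = @{
            EnableInTransitEncryption = $true
            EnableAtRestEncryption    = $true
            AtRestEncryptionConfiguration = @{
                S3EncryptionConfiguration = @{
                    EncryptionMode = 'CSE-KMS'
                    AwsKmsKey      = 'arn:aws:kms:us-east-1:111122223333:key/example-key-id'
                }
                LocalDiskEncryptionConfiguration = @{
                    EncryptionKeyProviderType = 'AwsKms'
                    AwsKmsKey                 = 'arn:aws:kms:us-east-1:111122223333:key/example-key-id'
                }
            }
            InTransitEncryptionConfiguration = @{
                TLSCertificateConfiguration = @{
                    CertificateProviderType = 'PEM'
                    S3Object                = 's3://example-bucket/certs/my-certs.zip'
                }
            }
        }
    }

    # Register the configuration so new clusters can reference it by name.
    $encryption | ConvertTo-Json -Depth 6 | Set-Content -Path .\emr-security-config.json
    aws emr create-security-configuration --name 'EncryptedCluster' `
        --security-configuration file://emr-security-config.json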

Comments closed

Diagnosing Execution Plan Oddities

Kendra Little digs into an oddly complex execution plan:

Aha! This is a definite clue. Some sort of security wizardry has been applied to this table, so that when I query it, a bunch of junk gets tacked onto my query.

I have no shame in admitting that I couldn’t remember at all what feature this was and how it works. A lot of security features were added in SQL Server 2016, and the whole point of a sample database like this is to kick the tires of the features.

Kendra’s post frames it as an impostor syndrome check, whereas I read it as a murder mystery.

Comments closed

Securing MapR

Mitesh Shah provides some high-level information on how to secure a MapR cluster:

  • Security Best Practice #2:  Require Authentication for All Services.  While it’s important for ports to be accessible exclusively from the network segment(s) that require access, you need to go a step further to ensure that only specific users are authorized to access the services running on these ports.  All MapR services — regardless of their accessibility — should require authentication.  A good way to enforce this for MapR platform components is by turning on security.  Note that MapR is the only big data platform that allows for username/password-based authentication with the user registry of your choice, obviating the need for Kerberos and all the complexities that Kerberos brings (e.g., setting up and managing a KDC). MapR supports Kerberos, too, so environments that already have it running can use it with MapR if preferred.

There’s nothing here that’s absolutely groundbreaking, but these are good practices.

Comments closed