Access Control Basics

Robert Sheldon gives an introductory overview of logins, users, roles, and permissions:

You can think of a role as a type of container for holding one or more logins, users, or other roles, similar to how a Windows group can hold multiple individual and group accounts. This can make managing multiple principals easier when those principals require the same type of access to SQL Server. You can configure each role with permissions to specific resources, adding or removing logins and users from these roles as needed.

SQL Server supports three types of roles: server, database, and application. Server roles share the same scope as logins, which means they operate at the server level and pertain to the database engine as a whole. As a result, you can add only server-level principals to the roles, and you can configure the roles with permissions only to server-level securables, not database-level securables.

These concepts form the foundation of a secure instance, so it’s vital to understand them.
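As a minimal sketch of how these pieces fit together (the login, role, and database names below are made up for illustration):

-- Server scope: a login, a user-defined server role, and a server-level permission.
CREATE LOGIN AppMonitor WITH PASSWORD = 'Str0ngP@sswordHere!';
CREATE SERVER ROLE MonitoringRole;
ALTER SERVER ROLE MonitoringRole ADD MEMBER AppMonitor;
GRANT VIEW SERVER STATE TO MonitoringRole;

-- Database scope: map the login to a user, put the user in a database role,
-- and grant the role a database-level permission.
USE SalesDb;
CREATE USER AppMonitor FOR LOGIN AppMonitor;
CREATE ROLE ReportingRole;
ALTER ROLE ReportingRole ADD MEMBER AppMonitor;
GRANT SELECT ON SCHEMA::dbo TO ReportingRole;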

vNet Peering Within An Azure Region

Denny Cherry reports that there is a public preview of a feature to allow vNet peering without setting up a site-to-site VPN connection:

Up until August 1st, if you had two vNets in the same Azure region (USWest, for example), you needed to create a site-to-site VPN between them in order for the VMs within each vNet to be able to see each other. I’m happy to report that this is no longer the case (though it is still the default configuration). On August 1st, 2016, Microsoft released a new version of the Azure portal which allows you to enable vNet peering between vNets within an account.

Now this feature is in public preview (a.k.a. beta), so you have to turn it on, which is done through Azure PowerShell. Thankfully it uses the Register-AzureRmProviderFeature cmdlet, so you don’t need to have the newest Azure PowerShell installed, just something fairly recent (I have 1.0.7 installed). To enable the feature, just request to be included in the beta like so (don’t forget to log in with Add-AzureRmAccount and then Select-AzureRmSubscription).

Read the whole thing for details on how to enroll in this feature and how to set it up.

Impersonation

Kenneth Fisher shows how to use impersonation to perform tasks without being explicitly granted permissions:

A developer wants to be able to truncate a table.

This isn’t an unreasonable request, right? She’s writing a piece of code that loads a bunch of data into a staging table. She wants to be able to truncate the table once the load is over. Unfortunately, the permission required to do this is ALTER on the table. That’s not just going to let her truncate the table; it’s going to let her change the structure of the table. Not acceptable in a production environment. A development environment, sure. Not a production one. So what do we do?

We use impersonation.

Check out the post to see how to do this.
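Kenneth walks through the details, but as a rough sketch of the general pattern (not necessarily his exact code; the table and principal names here are hypothetical), the developer gets IMPERSONATE on a loginless user that holds the ALTER permission and switches context only for the truncate:

-- A loginless user that holds just the permission we need.
CREATE USER TruncateProxy WITHOUT LOGIN;
GRANT ALTER ON OBJECT::dbo.StagingTable TO TruncateProxy;

-- The developer may impersonate that user, but gains no other rights.
GRANT IMPERSONATE ON USER::TruncateProxy TO DevUser;

-- At load time, switch context, truncate, and switch back.
EXECUTE AS USER = 'TruncateProxy';
TRUNCATE TABLE dbo.StagingTable;
REVERT;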

Securing The Data Plane

Michael Schiebel gives an overview of security architecture inside a data lake:

Existing platform-based Hadoop architectures make several implicit assumptions on how users interact with the platform, such as developmental research versus production applications. While this was perfectly good in a research mode, as we move to a modern data application architecture we need to bring back modern application concepts to the Hadoop ecosystem. For example, existing Hadoop architectures tightly couple the user interface with the source of data. This is done for good reasons that apply in a data discovery research context, but it causes significant issues in developing and maintaining a production application. We see this in some of the popular user interfaces such as Kibana, Banana, Grafana, etc. Each user interface is directly tied to a specific type of data lake and imposes schema choices on that data.

Read the whole thing.  Also, “Securing the data plane” sounds like a terrible ’90s action film.

Syncing LDAP With Ranger

Colm O hEigeartaigh shows how to load users and groups into Apache Ranger from LDAP:

For the purposes of this tutorial, we will use OpenDS as the LDAP server. It contains a domain called “dc=example,dc=com”, 5 users (alice/bob/dave/oscar/victor), and 2 groups (employee/manager). Victor, Oscar, and Bob are employees; Alice and Dave are managers. Here is a screenshot using Apache Directory Studio:

Colm’s scenario uses OpenDS, but you can integrate with Active Directory as well.

Always Encrypted And Temporal Tables

Raul Gonzalez wants to combine Always Encrypted with temporal tables in SQL Server 2016:

Again the wall. There is no way you can choose a temporal table and apply encryption to a column or columns using the wizard.

I then tried using PowerShell (after manually creating the keys), as this is truly the only way to encrypt existing columns, just in case.

Raul doesn’t stop there, though, and he does figure out a workaround.
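For context, here’s a minimal sketch of the syntax when both features are declared up front at table creation time (assuming a column encryption key named CEK_Auto1 already exists); the trickier case Raul works through is encrypting columns on a temporal table that already exists.

-- Hypothetical temporal table with an Always Encrypted column defined at create time.
CREATE TABLE dbo.Customer
(
    CustomerID int IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
    TaxID      char(11) COLLATE Latin1_General_BIN2
               ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = CEK_Auto1,
                               ENCRYPTION_TYPE = DETERMINISTIC,
                               ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NOT NULL,
    ValidFrom  datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo    datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomerHistory));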

Enterprise R Security


Ramkumar Chandrasekeran discusses DeployR and its enterprise security model for R:

DeployR Enterprise is designed to deliver analytics solutions at scale to whoever needs them: inside or outside the enterprise. It also guarantees secure delivery of your analytics via DeployR web services. These secure web services integrate seamlessly with existing enterprise security solutions: Single Sign-On, LDAP, Active Directory, PAM, and Basic Authentication. They can enforce access privileges already defined by your IT department for existing enterprise users and can also safely support anonymous users when needed.

There’s nothing groundbreaking here:  it’s TLS (to encrypt network transmissions) and LDAPS (to control authentication and authorization).  That there’s nothing groundbreaking is a good thing—that means companies will have most of the infrastructure in place to support this.

Securing Kafka Streams

Michael Noll shows security features of Kafka Streams:

First, which security features are available in Apache Kafka, and thus in Kafka Streams?  Kafka Streams supports all the client-side security features in Apache Kafka.  In this short blog post we cannot cover these client-side security features in full detail, so I recommend reading the Kafka Security chapter in the Confluent Platform documentation and our previous blog post Apache Kafka Security 101 to familiarize yourself with the security features that are currently available in Apache Kafka.

That said, let me highlight a couple of important Kafka security features that are essential for implementing robust data infrastructures, whether these are used for building horizontal services at larger companies, for multi-tenant infrastructures (e.g. microservices), or for shared platforms such as in the Internet of Things.  Later on I will then demonstrate an example application where we use some of these security features in Kafka Streams.

It’s important to secure sensitive data, even in “transient” media like Kafka (though the transience of Kafka is user-definable, so “It’ll go away soon” isn’t really a good argument).

SSIS Firewall Rules

Slava Murygin shows how to create a firewall rule to allow SSIS connections:

Recently I tried to connect to a remote SQL Server Integration Services instance directly from SSMS and got the following error:

TITLE: Connect to Server
——————————
Cannot connect to 10.1.32.66.
——————————
ADDITIONAL INFORMATION:
Failed to retrieve data for this request. (Microsoft.SqlServer.Management.Sdk.Sfc)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft%20SQL%20Server&LinkId=20476
——————————
The RPC server is unavailable. (Exception from HRESULT: 0x800706BA) (Microsoft.SqlServer.DTSRuntimeWrap)
——————————
The RPC server is unavailable. (Exception from HRESULT: 0x800706BA) (Microsoft.SqlServer.DTSRuntimeWrap)
——————————
BUTTONS:
OK
——————————

Slava then shows how to work around this.

Always Encrypted In Azure SQL Database

Jakub Szymaszek notes that Azure SQL Database now supports Always Encrypted:

I’m happy to announce Always Encrypted in Azure SQL Database is now generally available!

Always Encrypted is a feature designed to ensure sensitive data and its corresponding encryption keys are never revealed in plaintext to the database system. With Always Encrypted enabled, a SQL client driver encrypts and decrypts sensitive data inside client applications or application servers, using keys stored in a trusted key store, such as Azure Key Vault or the Windows Certificate Store on a client machine. As a result, even database administrators, other high-privilege users, or attackers gaining illegal access to Azure SQL Database cannot access the data.

To be honest, I’d much rather try Always Encrypted against an Azure SQL Database instance than an on-premises instance, mostly because if I hose Azure SQL Database that badly or the company decides that Always Encrypted isn’t a good fit, I can grab the data and dump the instance.  It’s a little harder to do that with physical hardware or even an on-prem VM.
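If you want to see what the server-side metadata looks like when the keys live in Azure Key Vault, here’s a minimal sketch (the vault URL is a made-up placeholder):

-- The column master key is metadata only: a pointer to a key in Azure Key Vault
-- that the database engine itself can never use to decrypt data.
CREATE COLUMN MASTER KEY CMK_AzureKeyVault
WITH (
    KEY_STORE_PROVIDER_NAME = N'AZURE_KEY_VAULT',
    KEY_PATH = N'https://contoso-vault.vault.azure.net/keys/AlwaysEncryptedCMK/abc123'
);
-- A column encryption key wrapped by this master key comes next (the SSMS wizard or
-- PowerShell generates its encrypted value), and client applications opt in by adding
-- Column Encryption Setting=Enabled to their connection strings.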
