Press "Enter" to skip to content

Category: Security

Your Power BI Administrator’s Privileges

Melissa Coates goes into exactly what it is that a Power BI admin can see and do:

I wrote about (and updated) this topic previously, but this is so important that it warrants revisiting. So let’s have a quick chat about what privileges a Power BI administrator has with respect to accessing data throughout the Power BI tenant.

All metadata throughout the tenant is available to the Power BI administrator (e.g., if they want to enumerate a list of workspaces, reports, dashboards, etc. using the APIs). So, metadata is easily discoverable, but, technically speaking, a Power BI administrator cannot access datasets in Power BI unless they have permission to that workspace. However…

Read the whole thing.
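To get a feel for what that metadata enumeration looks like, here is a minimal sketch against the Power BI admin REST API. It assumes you already hold an Azure AD access token for a tenant administrator (token acquisition is out of scope here), and the placeholder token and page size are purely illustrative:

```python
import requests

# Hypothetical placeholder -- acquiring the token (for example via MSAL) is not shown here.
token = "<access token for a Power BI administrator>"

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/admin/groups",
    params={"$top": 100},  # the admin endpoint requires $top
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

for workspace in resp.json()["value"]:
    # Metadata only: names and IDs, not the data in the underlying datasets.
    print(workspace["id"], workspace["name"])
```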


Errors with SQL Server TDE and Azure Key Vault

Amit Banerjee takes us through troubleshooting issues when using Azure Key Vault as the key storage mechanism for Transparent Data Encryption:

The first one was a 404 error. When I looked at the application event log, I saw the following error:

Operation: getKeyByName
Key Name: ContosoRSAKey0
Message: [error:112, info:404, state:0] The server responded 404, because the key name was not found. Please make sure the key name exists in your vault.

The simple reason for the above error is that I was using an incorrect key name or the key didn’t exist in my Azure Key Vault. So the remediation is to check if the key exists in your Azure Key Vault. If not, then create the key.

Read on for additional errors you might run into, as well as a link to an Azure Data Studio notebook to set this up yourself.
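If you want to rule out that 404 before touching SQL Server, a minimal sketch with the azure-identity and azure-keyvault-keys Python packages can confirm the key exists; the vault URL below is hypothetical, and the key name mirrors the post's ContosoRSAKey0 example:

```python
from azure.core.exceptions import ResourceNotFoundError
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

client = KeyClient(
    vault_url="https://contoso-vault.vault.azure.net",  # hypothetical vault
    credential=DefaultAzureCredential(),
)

try:
    key = client.get_key("ContosoRSAKey0")
    print(f"Found key {key.name} ({key.key_type})")
except ResourceNotFoundError:
    # Same condition as the 404 in the event log: the name is wrong or the key
    # was never created, so create it (or fix the name SQL Server is configured to use).
    key = client.create_rsa_key("ContosoRSAKey0", size=2048)
    print(f"Created key {key.name}")
```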


Securing Data on ElasticMapReduce

Duncan Chen takes us through data encryption options when using ElasticMapReduce:

Data encryption is an effective solution to bolster data security. You can make sure that only authorized users or applications read your sensitive data by encrypting your data and managing access to the encryption key. One of the main reasons that customers from regulated industries such as healthcare and finance choose Amazon EMR is because it provides them with a compliant environment to store and access data securely.

This post provides a detailed walkthrough of two new encryption options to help you secure your EMR cluster that handles sensitive data. The first option is native EBS encryption to encrypt volumes attached to EMR clusters. The second option is Amazon S3 encryption that allows you to use different encryption modes and customer master keys (CMKs) for individual S3 buckets with Amazon EMR.

Click through for more details on each.
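As a rough illustration of how those two options fit together, here is a sketch using boto3 to register an EMR security configuration. The KMS key ARNs and bucket name are hypothetical, and the nested JSON schema is worth double-checking against the EMR documentation before relying on it:

```python
import json
import boto3

# Turns on native EBS encryption for cluster volumes and sets a default S3
# encryption mode plus a per-bucket override (all key ARNs/buckets are made up).
encryption_config = {
    "EncryptionConfiguration": {
        "EnableInTransitEncryption": False,
        "EnableAtRestEncryption": True,
        "AtRestEncryptionConfiguration": {
            "S3EncryptionConfiguration": {
                "EncryptionMode": "SSE-KMS",
                "AwsKmsKey": "arn:aws:kms:us-east-1:111122223333:key/default-key",
                "Overrides": [
                    {
                        "BucketName": "finance-bucket",
                        "EncryptionMode": "SSE-KMS",
                        "AwsKmsKey": "arn:aws:kms:us-east-1:111122223333:key/finance-key",
                    }
                ],
            },
            "LocalDiskEncryptionConfiguration": {
                "EncryptionKeyProviderType": "AwsKms",
                "AwsKmsKey": "arn:aws:kms:us-east-1:111122223333:key/disk-key",
                "EnableEbsEncryption": True,
            },
        },
    }
}

emr = boto3.client("emr")
emr.create_security_configuration(
    Name="encrypted-emr-config",
    SecurityConfiguration=json.dumps(encryption_config),
)
```

You then reference the named security configuration when launching the cluster.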


Azure AD Logins for Managed Instances

Mirek Sztajno announces a new feature for Azure SQL Managed Instances:

We are happy to announce general availability (GA) of Azure AD server principals (Azure AD logins) for SQL managed instance (MI). This feature allows Azure AD users to create logins in the master database for MI, grant MI server-level permissions to these logins, and create Azure AD users with logins for individual MI databases.

Additionally, enabling Azure AD logins allows users to use many MI features supported for SQL logins (see the documentation at the end of this blog).

Read on to learn more about this feature.
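Here's a minimal sketch of what the feature enables, driven from Python with pyodbc. The instance name, login, and database are hypothetical, and it assumes ODBC Driver 17 or later and that you connect as the instance's Azure AD admin:

```python
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=my-mi.1234abcd.database.windows.net;"   # hypothetical instance
    "Database=master;"
    "Authentication=ActiveDirectoryInteractive;"
    "UID=aadadmin@contoso.com;",
    autocommit=True,
)
cur = conn.cursor()

# Create a server-level Azure AD login in master...
cur.execute("CREATE LOGIN [aaduser@contoso.com] FROM EXTERNAL PROVIDER;")
# ...grant it a server-level permission via a server role...
cur.execute("ALTER SERVER ROLE [dbcreator] ADD MEMBER [aaduser@contoso.com];")
# ...and map it to a user in an individual database (database name is made up).
cur.execute("USE [SalesDb];")
cur.execute("CREATE USER [aaduser@contoso.com] FROM LOGIN [aaduser@contoso.com];")
```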


Scripting Out Linked Servers with Actual Passwords

Ajay Dwivedi shows how you can script out a linked server creation statement which includes actual passwords:

For moving logins/users, Microsoft provided the revlogin script, which made it easy to migrate logins without needing to know the passwords. But there is no easy approach for migrating linked servers with the actual password. This is where the dbatools cmdlet Copy-DbaLinkedServer becomes very handy. But what about the situation where we have to script out linked servers beforehand?

For this reason, based on a blog post by Antti Rantasaari, and using his code as the base script, I have created a cmdlet, Get-LinkedServer, in the SQLDBATools module which accepts a SqlInstance name as a parameter along with a -ScriptOut switch, and gives drop/create statements for the linked servers present on that local or remote SqlInstance.

As a quick note, SQLDBATools is not the same as dbatools.


Azure AD Credential Passthrough and Databricks

Anna Shrestinian, et al., explain how Azure Databricks enables Azure Active Directory credential passthrough when working with Azure Data Lake Storage Gen2:

Azure Data Lake Storage (ADLS) Gen2, which became generally available earlier this year, is quickly becoming the standard for data storage in Azure for analytics consumption. ADLS Gen2 enables a hierarchical file system that extends Azure Blob Storage capabilities and provides enhanced manageability, security and performance.

The hierarchical file system provides granular access control to ADLS Gen2. Role-based access control (RBAC) can be used to grant role assignments to top-level resources, and POSIX-compliant access control lists (ACLs) allow for finer-grained permissions at the folder and file level. These features allow users to securely access their data within Azure Databricks using the Azure Blob File System (ABFS) driver, which is built into the Databricks Runtime.

There are some tradeoffs involved, particularly around using High Concurrency clusters (or limiting yourself to one user account), but it’s a nice bit of added value when you’re a heavy Azure user.
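In practice, once a High Concurrency cluster has credential passthrough turned on (the spark.databricks.passthrough.enabled cluster setting), reading from ADLS Gen2 in a notebook looks like any other ABFS read, authorized with the notebook user's own Azure AD identity. A minimal sketch, where the storage account, container, and path are hypothetical:

```python
# Runs in a Databricks notebook, where `spark` is predefined; no storage keys
# or service principal secrets appear anywhere in the code.
df = (spark.read
      .option("header", "true")
      .csv("abfss://sales@contosolake.dfs.core.windows.net/raw/orders.csv"))
df.show(5)
```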


skip-2.0 and SQL Server Security

K. Brian Kelley has the lowdown on skip-2.0:

Problem
I’ve read recently that there’s a new piece of malware that’s been named skip-2.0 and it targets SQL Server. What exactly is it, where did it come from, and how do I protect myself against it?

Solution
This new piece of malware, skip-2.0, does target SQL Server. Specifically, it targets SQL Server versions 11 and 12, which correspond to the names SQL Server 2012 and SQL Server 2014 respectively. Therefore, if you’re only running SQL Server 2016 or higher, you’re not affected by skip-2.0 (yet another reason to upgrade).

Read on to learn how it works and how you can protect against it.


JDBC Resource Pools and Kerberos

Guy Shilo has a tip for us around JDBC connectivity when your Hadoop cluster is configured for Kerberos:

This is a quick tip about connecting to Hive or Impala via JDBC.

Accessing Hive or Impala using their JDBC drivers is very convenient. Client programs like Beeline or JetBrains DataGrip use it as the main way of accessing Hive/Impala, and many people also use it in programs they write themselves.

Things get a little trickier when the cluster is kerberized. In this case you should add a few extra parameters to the JDBC connect string.

Read on to see what to do.
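For reference, the key addition on a kerberized cluster is the principal parameter in the JDBC URL, naming the Hive service principal. A sketch using the jaydebeapi Python package, assuming you already have a valid Kerberos ticket from kinit; the host, realm, and jar path are hypothetical:

```python
import jaydebeapi

# The extra Kerberos piece: the Hive service principal appended to the URL.
url = ("jdbc:hive2://hiveserver01.example.com:10000/default;"
       "principal=hive/_HOST@EXAMPLE.COM")

conn = jaydebeapi.connect(
    "org.apache.hive.jdbc.HiveDriver",
    url,
    jars="/opt/jars/hive-jdbc-standalone.jar",  # hypothetical driver path
)
curs = conn.cursor()
curs.execute("SHOW DATABASES")
print(curs.fetchall())
curs.close()
conn.close()
```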


PowerApps Security

Jason Bonello gives us some tips on PowerApps security:

Depending on how the backend is set up, the tables holding this sensitive data might be in the same database. For example, ERP solutions can have Company Accounts data, Customer-related data, and Inventory-related data all in the same database, maybe under different schemas, but still part of the same database.

Now let’s say we are about to create a PowerApps solution to maintain Customer information. However, as part of organizational policy, this information should not be shared with other departments beyond the intended users.

Read on for some ideas of how to limit the risk of data exposure.


Mapping Usernames within the Power BI Gateway

Jeff Pries takes us through a difficult scenario:

With some data sources, such as Analysis Services, you want to pass the username of the person running the report back to the server executing the query (such as in a Row Level Security configuration). Adam Saxton of Guy in a Cube does a great job of explaining how this works at a high level in the video linked here.

In the video, Adam mentions that if our Power BI login does not match a UPN in our local Active Directory, then the lookup will fail…which is a problem if you just don’t have the option of having your Power BI login match a UPN in your local Active Directory and don’t want to manage static user mappings over time. The following will cover a method of allowing the lookup to use a different Active Directory property, such as “mail” to perform the matching.

Read on for the steps.
