Category: Security

Which Groups can Set Permissions in Power BI

Gilbert Quevauvilliers walks us through the groups which can set permissions in Power BI:

As you can see from above, it is good to know which groups can be used to assign permissions in the Power BI Service.

If there is anything I have missed, or anything that is wrong or needs updating, please let me know via the comments section below.

Thanks for reading!

Gilbert has a nice matrix as well as lots of screenshots establishing the matrix’s veracity.

Azure AD Passthrough and Password Hash Authentication in SQL DB, DW, MI

Mirek Sztajno announces two new security pieces for Azure SQL Database, Azure Synapse Analytics, and Azure SQL Managed Instances:

We are announcing support for Azure AD pass-through and password hash authentication for Azure SQL DB (single database and database pools), Managed Instance, and Azure Synapse (formerly SQL DW).

Azure AD password hash authentication is the simplest way to enable authentication for on-premises Active Directory users in Azure AD. Users are synchronized with Azure AD and password validation occurs in the cloud using the same username and password that is used in on-premises environments. No additional infrastructure is required.

Azure AD pass-through authentication provides a password validation mechanism that validates users directly against on-premises Active Directory, outside the cloud. Pass-through authentication does not require ADFS or other third-party federation services.

Each of these authentication methods can be configured using Azure AD Connect, allowing you to provision users in the cloud.

Read on to see what this means for you.
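
For reference, once either sign-in method is in place, provisioning an Azure AD account inside the database itself is plain T-SQL. Here is a minimal sketch, with a hypothetical user principal name:

-- Run while connected as an Azure AD admin for the server.
-- The user principal name below is hypothetical.
CREATE USER [jane.doe@contoso.com] FROM EXTERNAL PROVIDER;

-- Grant whatever access the account needs, e.g. read-only.
ALTER ROLE db_datareader ADD MEMBER [jane.doe@contoso.com];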

Data Exfiltration Protection when Using Azure Databricks

Bhavin Kukadia, et al., explain how to prevent users from taking data from your Databricks cluster without authorization:

Solving for data exfiltration can become an unmanageable problem if the PaaS service requires you to store your data with them or it processes the data in the service provider’s network. But with Azure Databricks, our customers get to keep all data in their Azure subscription and process it in their own managed private virtual network(s), all while preserving the PaaS nature of the fastest growing Data & AI service on Azure. We’ve come up with a secure deployment architecture for the platform while working with some of our most security-conscious customers, and it’s time that we share it out broadly.

Click through for the architectural pattern.

A New PowerShell Module for SQL Server Security

Stuart Moore introduces dbaSecurityScan:

How easy is it to audit them? If someone asks you, the DBA, exactly who has access to object A, can you tell them? How do people get access to that object: is it via a role, a schema, or an explicit permission?

Is that information available in a form that is easy to read or manipulate?

How do you ensure that permissions persist between upgrades? I’ve certainly seen third-party upgrades that have reset database-level permissions. Do you have a mechanism to check every permission and put them back as they were?

We’re all doing the devops these days. Our database schema is source controlled, and we’re deploying it incrementally in pipelines and testing it. But are we doing that with our database security?

So in the classic open source way, I decided to scratch my own itch by writing something. That something is dbaSecurityScan, a PowerShell module that aims to offer a solution for all of the above.

Click through to see what dbaSecurityScan covers today, how to call it, and what you can do to get more info.
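
For the “exactly who has access to object A” question, a baseline query over the permission catalog views gets you part of the way there, independent of the module. A minimal sketch (the object name is hypothetical):

-- Enumerate explicit object-level permissions: who holds what on a given object.
SELECT pr.name      AS principal_name,
       pr.type_desc AS principal_type,
       pe.permission_name,
       pe.state_desc,
       o.name       AS object_name
FROM sys.database_permissions AS pe
JOIN sys.database_principals AS pr
    ON pr.principal_id = pe.grantee_principal_id
JOIN sys.objects AS o
    ON o.object_id = pe.major_id
WHERE pe.class = 1               -- object- or column-level permissions
  AND o.name = N'ObjectA';       -- hypothetical object name

Note that this only covers explicit grants; access gained through role membership or schema-level permissions won’t show up here, which is exactly the gap Stuart is getting at.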

Using T-SQL to Generate a Login Migration Script

Thomas Rushton shows us how to migrate logins from one server to another using just T-SQL:

There are three things to look at:

1. Creating Logins from Windows accounts
2. Creating logins that are authenticated by SQL Server
3. Assigning membership of the appropriate server roles

Note that we’re just creating a snapshot of what’s there – we’re not aiming to keep these things in true synchronisation between servers. Note also that this script will only create accounts that don’t exist. If there’s an account already there with the same name, the script generated will not recreate it with new options.

We get the final script as well as a nice walkthrough of each component.
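
Thomas covers all three pieces; as a rough illustration of just the first one (and not his exact code), generating CREATE LOGIN statements for Windows accounts from sys.server_principals looks something like this:

-- Build CREATE LOGIN statements for Windows users and groups, guarded so
-- accounts that already exist on the target server are left untouched.
SELECT 'IF NOT EXISTS (SELECT 1 FROM sys.server_principals WHERE name = N''' + name + ''')'
       + ' CREATE LOGIN ' + QUOTENAME(name)
       + ' FROM WINDOWS WITH DEFAULT_DATABASE = ' + QUOTENAME(ISNULL(default_database_name, N'master')) + ';'
FROM sys.server_principals
WHERE type IN ('U', 'G')        -- Windows logins and Windows groups
  AND name NOT LIKE 'NT %';     -- skip NT AUTHORITY / NT SERVICE accounts

SQL-authenticated logins are trickier, since recreating them faithfully means carrying over the original SID and password hash from sys.sql_logins, which is one reason the full walkthrough is worth reading.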

Using Azure Key Vault with Azure Databricks

Jason Bonello shows how easy it is to integrate Azure Key Vault into Azure Databricks:

In Azure Key Vault we will be adding secrets that we will be calling through Azure Databricks within the notebooks. First and foremost, this is for security purposes. It will ensure usernames and passwords are not hardcoded within the notebook cells and offer some type of control over access in case it needs to be reverted later on (assuming it is controlled by a different administrator). In addition to this, it will offer a better way of maintaining a solution, since if a password ever needs to be changed, it will only be changed in the Azure Key Vault without the need to go through any notebooks or logic.

If you don’t use Key Vault, Databricks does include its own secrets storage, so there’s really no reason to keep credentials in plaintext.

Authentication in Hadoop with Apache Ozone

Xiaoyu Yao explains how we can use Apache Ozone to perform service account authentication for a Hadoop cluster:

Like Hadoop delegation tokens, an Ozone security token has a token identifier along with a signed signature from the issuer. The Ozone Manager issues delegation tokens and block tokens for users or client applications authenticated with Kerberos. The signature of the token can be validated by token validators to verify the identity of the issuer. This way, a valid token holder can use the token to perform operations against the cluster services as if they held the issuer’s Kerberos tickets.

Read on for the high-level overview.

Network Security Changes Around Azure SQL DB

Rohit Nayak announces some changes to Azure SQL Database’s connectivity and network security:

Now in general availability, Private Link enables users to have private connectivity from a Microsoft Azure Virtual Network to Azure SQL Database.

This feature creates a private endpoint which maps a private IP Address from the Virtual Network to your Azure SQL Database.

From a security perspective, Private Link provides you with data exfiltration protection on the login path to SQL Database. Additionally, it does not require adding any IP addresses to the firewall on Azure SQL Database or changing your application’s connection string.

Private Link is built on best-of-class Software Defined Networking (SDN) functionality from the Azure Networking team. Clients can connect to the private endpoint from within the same Virtual Network, from a peered Virtual Network in the same region, or via a VNet-to-VNet connection across regions. Additionally, clients can connect from on-premises using ExpressRoute, private peering, or VPN tunneling. More information can be found here.

Click through to see what else they’ve been working on.
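
For contrast, the IP firewall management that Private Link lets you skip looks like this in T-SQL (the rule name and address range are hypothetical):

-- List the database-level firewall rules currently in place.
SELECT name, start_ip_address, end_ip_address
FROM sys.database_firewall_rules;

-- Without Private Link, client IP ranges have to be added explicitly.
EXECUTE sp_set_database_firewall_rule N'app-clients', '203.0.113.0', '203.0.113.255';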

Power BI Security Features

James Serra takes us through different ways to secure your Power BI dashboards and reports:

Row-Level Security: Row-level security (RLS) gives you the ability to publish a single report to your users but expose the data differently to each person. So instead of creating multiple copies of the same report in order to limit the data, you can create one report that will only show the data the logged-in user is allowed to see. This is done with filters, which restrict data access at the row level, and you define filters within roles. For example, you might create a role called “United States” that filters the data in a table where Region = “United States”. You then add the members (user, security group, or distribution list) who should only see data for the United States to the “United States” role (the assignment of members can only be done within the Power BI Service). If a user should not have access to a report at all, just don’t include that person in any of the roles for that report; they will then always see a blank report.

Click through for several more options and links to additional resources.

Sqoop Scheduling and Security

Jon Moirsi continues a series on Sqoop:

In previous articles, I’ve walked through using Sqoop to import data to HDFS.  I’ve also detailed how to perform full and incremental imports to Hive external and Hive managed tables.

In this article I’m going to show you how to automate execution of Sqoop jobs via Cron.

However, before we get to scheduling, we need to address security.  In prior examples I’ve used -P to prompt the user for login credentials interactively.  With a scheduled job, this isn’t going to work.  Fortunately, Sqoop provides us with the “password-alias” arg, which allows us to pass in passwords stored in a protected keystore.

That particular keystore tie-in works quite smoothly in my experience.
