Press "Enter" to skip to content

Category: Security

Cross-Availability Group Login Management

David Fowler walks us through a problem about orphaned users and Availability Groups:

Now, I’m pretty sure that most of us will have been in the position where, after a failover, we get inundated with calls, emails, Skype messages and carrier pigeon drops letting us know that so-and-so can no longer access the database.

When you look into it, you either find that the login never existed in the first place, so you create it, or that it was there but the database user has become orphaned from it (this happens when the login SID doesn’t match the SID of the database user; Adrian wrote about orphaned users in Dude, Where’s My Access?).

You remap the orphaned user and everything is good again… that is, until the next time you fail over, and once again you’ll be hit with the same orphaned user problem.

Click through for the explanation and a permanent fix for this issue.
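
For reference, here’s a minimal T-SQL sketch of the usual diagnosis and the SID-based fix; it isn’t David’s exact code, and the login and user names (AppLogin, AppUser) are placeholders:

  -- Find database users whose SID has no matching server login (orphaned users)
  SELECT dp.name AS database_user, dp.sid
  FROM sys.database_principals AS dp
  LEFT JOIN sys.server_principals AS sp
      ON dp.sid = sp.sid
  WHERE dp.type = 'S'           -- SQL-authenticated users
    AND dp.principal_id > 4     -- skip dbo, guest, INFORMATION_SCHEMA, sys
    AND sp.sid IS NULL;

  -- Quick fix: remap the orphaned user to the existing login
  ALTER USER AppUser WITH LOGIN = AppLogin;

  -- Longer-term fix: grab the login's SID on the primary...
  SELECT name, sid FROM sys.server_principals WHERE name = 'AppLogin';

  -- ...and create the login on each other replica with that exact SID,
  -- so the user-to-login mapping survives a failover
  CREATE LOGIN AppLogin
      WITH PASSWORD = 'StrongPasswordHere!1',
           SID = 0xA1B2C3D4E5F60718293A4B5C6D7E8F90;  -- placeholder; use the real SID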

Configuring Kafka Streams For Least Privilege

Gwen Shapira explains how we can assign minimal rights to Kafka Streams and KSQL users:

The principle of least privilege dictates that each user and application will have the minimal privileges required to do their job. When applied to Apache Kafka® and its Streams API, it usually means that each team and application will have read and write access only to a selected few relevant topics.

Organizations need to balance developer velocity and security, which means that each organization will likely have their own requirements and best practices for access control.

There are two simple patterns you can use to easily configure the right privileges for any Kafka Streams application—one provides tighter security, and the other is for more agile organizations. First, we’ll start with a bit of background on why configuring proper privileges for Kafka Streams applications was challenging in the past.

Read the whole thing; “granting everybody all rights” generally isn’t a good idea, no matter what your data platform of choice may be.

Working With Firewall Rules From Azure SQL Database

Arun Sirpal shows us how we can use T-SQL to set and modify firewall rules within Azure SQL Database:

For this post I want to actually show you the T-SQL code to do this; hopefully it will become a good reference point for the future. Before we step into the code, let’s understand the differences between database-level and server-level rules.

Server-level rules enable access to your entire Azure SQL server, that is, all the databases within the same logical server. These rules are stored in the master database. Database-level rules enable access to certain databases within the same logical server (yes, you could also run this within master); think of this as being more granular with access, since the rules are created within the user database in question.

Personally, I try to always use database-level rules; this is especially true when I work with failover groups.

Click through for instructions on how to work with both server and database level rules.
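
For a quick taste of the syntax (a sketch rather than Arun’s exact code, using made-up rule names and documentation IP ranges):

  -- Server-level rule: run in master; covers every database on the logical server
  EXECUTE sp_set_firewall_rule
      @name = N'OfficeRange',
      @start_ip_address = '203.0.113.1',
      @end_ip_address = '203.0.113.50';

  SELECT * FROM sys.firewall_rules;            -- server-level rules live in master

  -- Database-level rule: run in the user database; covers only that database
  EXECUTE sp_set_database_firewall_rule
      @name = N'AppServers',
      @start_ip_address = '198.51.100.10',
      @end_ip_address = '198.51.100.20';

  SELECT * FROM sys.database_firewall_rules;   -- database-level rules live in the database

  -- And to remove a database-level rule:
  EXECUTE sp_delete_database_firewall_rule @name = N'AppServers';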

Understanding Power BI Service Administrator Permissions

Melissa Coates walks us through Power BI permissions:

Based on the tests I’ve been doing, I’ve observed that users with membership in the Power BI administrator role have two sets of permissions that apply:

  • Activities which are scoped across the entire organization
  • Activities for which normal user permissions apply

Within the above 2 categories, I’m thinking there are 4 main types of activities:

  1. Manage tenant settings (always scoped to the organization)
  2. Compile inventory and metadata (can be scoped to the organization)
  3. Manage workspace users (can be scoped to the organization)
  4. Export content from a workspace (relies on user permissions)

There’s a fair amount to digest, but Melissa does a good job explaining the implications of specific permissions.

Connection Failed With Error 772

Jack Vamvas investigates an error when trying to connect to SQL Server 2016 on Windows Server 2016:

Question: I’ve upgraded an application with a built-in database API. When attempting to establish a SQL Server database connection, this error appears: Connection failed - SQL Server Error 772 - TCPIP Socket.

Upon investigation, the application was using the native drivers to attempt to connect to SQL Server 2016 on Windows Server 2016.

As part of the testing, I downloaded the ODBC 13.1 SQL Server drivers, independent of the application, and tested a DSN connection to the same SQL Server; it connected OK. I then created a DSN with the native drivers and the error reappeared.

What is going on? How can I fix this issue?

Read on for the solution and keep those drivers up to date.

Operating Management Studio With Multiple Active Directory Accounts

Kenneth Fisher shows how to use different Active Directory credentials when using SQL Server Management Studio:

To help promote the separation of duties, one of the things my company has done is to divide our permissions into two accounts. We have one account that is for our daily tasks: reading email, searching the internet, basic structure changes in a database, and so on. The other account is our admin account. It’s for remoting to servers, security tasks, really anything that requires sysadmin. I’m not going to argue the advisability of this because, honestly, I’m kind of on the fence. That said, I do have to deal with it, and there are a few tips in case you have to deal with it as well.

And if you’re not on the domain as well, runas /netonly /user:[domain\username] ssms.exe will do the job.

Kerberos Authentication In Apache Cassandra

Justin Cameron announces an open source Kerberos authenticator in Apache Cassandra:

In conjunction with the Cassandra authenticator, we have also published an open-source Kerberos authenticator plugin for the Cassandra Java driver.

The plugin supports multiple Kerberos quality of protection (QOP) levels, which may be specified directly when configuring the authenticator. The driver’s QOP level must match the QOP level configured for the server authenticator, and is only used during the authentication exchange. If confidentiality and/or integrity protection is required for all traffic between the client and Cassandra, it is recommended that Cassandra’s built-in SSL/TLS be used (note that TLS also protects the Kerberos authentication exchange, when enabled).

An (optional) SASL authorization ID is also supported. If provided, it specifies a Cassandra role that will be assumed once the Kerberos client principal has authenticated, provided the Cassandra user represented by the client principal has been granted permission to assume the role. Access to other roles may be granted using the GRANT ROLE CQL statement.

Click through for more details and check out the GitHub repo.
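
As a rough idea of what that last piece looks like in CQL (the role and principal names here are made up for illustration):

  -- Create a role with only the privileges the application needs
  CREATE ROLE analytics_reader WITH LOGIN = false;
  GRANT SELECT ON KEYSPACE analytics TO analytics_reader;

  -- Allow the role tied to the Kerberos client principal to assume it
  GRANT analytics_reader TO 'app_user@EXAMPLE.COM';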

SMO And Clear-Text Passwords

Cody Konior looks at a case where SMO can leak SQL authentication passwords:

SMO connects to SQL Server using the ADO.NET SqlClient library, which has 13+ years of features that help mask the passwords you pass in for SQL Authentication. SMO bypasses some of those features and often leaks the passwords in clear text.

We’ll prove it through repeatable tests that can be used to track if Microsoft fix the problem or not.

Read the whole thing.

Protecting Hadoop Clusters From Malware

Michael Yoder and Suraj Acharya remind us that Hadoop clusters are made up of computers on a network, which means people will try to install malicious software:

Roughly two years ago there were a spate of attacks against the open source database solution MongoDB, as well as Hadoop. These attacks were ransomware: the attacker wiped or encrypted data and then demanded money to restore that data. Just like the recent attacks, the only Hadoop clusters affected were those that were directly connected to the internet and had no security features enabled. Cloudera published a blog post about this threat in January 2017. That blog post laid out how to ensure that your Hadoop cluster is not directly connected to the internet and encouraged the reader to enable Cloudera’s security and governance features.

That blog post has renewed relevance today with the advent of XBash and DemonBot.

The origin story of XBash and DemonBot illustrates how security researchers view the Hadoop ecosystem and the lifecycle of a vulnerability. Back in 2016 at the Hack.lu conference in Luxembourg, two security researchers gave a talk entitled Hadoop Safari: Hunting for Vulnerabilities. They described Hadoop and its security model and then suggested some “attacks” against clusters that had no security features enabled. These attacks are akin to breaking into a house while the front door is wide open.

Their advice is simple, but simple is good here: it means you should be able to implement the advice without much trouble.

Signing Certificates For Multi-Database Access

David Fowler shows you how to create a signed certificate which allows for cross-database access:

Recently Manish Kumar asked an interesting question: what do you do if your proc accesses multiple or even all of the databases on the server?

So, instead of giving him a fuzzy answer in reply, I thought I’d write up exactly how you can deal with that sort of situation.

We’ve got two options and we’ll have a look at both of them. (I’m not going to go into details about how signing procs works; please see the post mentioned earlier for a basic overview. Here I’m going to look specifically at procs that access multiple databases.)

Click through to see both solutions.
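
If you want a feel for the moving parts before clicking through, here’s a hedged T-SQL sketch of the certificate-signing approach for one extra database. It isn’t David’s exact code, and the certificate, user, procedure, table and database names are all placeholders:

  USE SourceDB;

  -- Create a certificate and sign the procedure with it
  CREATE CERTIFICATE CrossDbSigningCert
      ENCRYPTION BY PASSWORD = 'StrongPasswordHere!1'
      WITH SUBJECT = 'Signs procedures that need cross-database access';

  ADD SIGNATURE TO dbo.SomeProc
      BY CERTIFICATE CrossDbSigningCert
      WITH PASSWORD = 'StrongPasswordHere!1';

  -- Copy the certificate's public key into the database the proc needs to reach,
  -- map a user to it there, and grant that user the required permissions
  DECLARE @publicKey varbinary(max) = CERTENCODED(CERT_ID('CrossDbSigningCert'));
  DECLARE @sql nvarchar(max) = N'
      CREATE CERTIFICATE CrossDbSigningCert FROM BINARY = '
      + CONVERT(nvarchar(max), @publicKey, 1) + N';
      CREATE USER CrossDbSigningUser FROM CERTIFICATE CrossDbSigningCert;
      GRANT SELECT ON dbo.SomeTable TO CrossDbSigningUser;';

  EXEC TargetDB.sys.sp_executesql @sql;

Repeat that last block for each additional database the procedure touches.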
