Press "Enter" to skip to content

Category: Security

SQL Injection Explained

Hugo Kornelis explains SQL injection and how to fix it:

Translating this back to technical terms, the root cause of actual SQL injection vulnerabilities is a front-end program (often, but not always, a web application) that allows the user to enter data, forms a query by concatenating that data into pre-made query strings, and sends the result to the backend database server.

Once you know the root cause, the fix becomes rather obvious.

When I explain SQL injection, I like to explain it using two concepts: code versus data and technology boundaries. Ideally, your code is separate from your data, and within a single technology (e.g., in that .NET web app), that’s typically the case. But when you start to traverse boundaries, it’s convenient (and wrong!) to combine code and data and pass everything across as a single stream of information. Parameterization is the way of keeping code and data separate as you cross those boundaries. Your data isn’t code and your code isn’t data, and conflating the two is how attackers can inject arbitrary code into your system.

Incidentally, technology boundaries can happen within a single product, too: dynamic SQL is an example of this.
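
As a concrete illustration, here's a minimal sketch in Python with pyodbc (the connection string, table, and input value are hypothetical, purely for demonstration) of the difference between concatenation and parameterization:

```python
import pyodbc

# Hypothetical connection string and table, for illustration only.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=myserver;Database=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()

user_input = "O'Brien'; DROP TABLE dbo.Customers; --"

# Vulnerable: user data is concatenated into the query text, so data becomes code.
vulnerable_sql = "SELECT * FROM dbo.Customers WHERE LastName = '" + user_input + "';"

# Safe: the query text (code) crosses the boundary separately from the value (data).
cursor.execute("SELECT * FROM dbo.Customers WHERE LastName = ?;", user_input)

# The same principle applies inside T-SQL itself: sp_executesql takes the dynamic
# SQL string and its parameter values as separate arguments, keeping them apart.
```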


SQL Server Permissions Manager Updates

Eric Cobb has updates to the SQL Server Permissions Manager project:

I recently made some updates to my SQL Server Permissions Manager project on GitHub. These updates mainly consisted of bug fixes and documentation updates, although I did also add the ability to snapshot server level permissions such as “sysadmin” or “securityadmin”.

Given how much we want to audit permissions and how difficult it is to do a really good job of that in SQL Server, there’s space for a good auditing tool.
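
For a sense of what a server-level snapshot has to capture, here's a minimal sketch (Python/pyodbc with a hypothetical connection string, not Eric's actual code) that pulls current server role membership; a real tool also stores the results over time so you can diff them:

```python
import pyodbc

# Hypothetical connection details, for illustration only.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=myserver;Database=master;Trusted_Connection=yes;"
)

# Who belongs to server-level roles such as sysadmin or securityadmin?
rows = conn.cursor().execute("""
    SELECT r.name AS role_name, m.name AS member_name
    FROM sys.server_role_members AS srm
    JOIN sys.server_principals AS r ON srm.role_principal_id = r.principal_id
    JOIN sys.server_principals AS m ON srm.member_principal_id = m.principal_id
    ORDER BY r.name, m.name;
""").fetchall()

for role_name, member_name in rows:
    print(f"{role_name}: {member_name}")
```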


Changing Red Hat’s SSH Port On An Azure VM

Paul Randal has a post showing you how to change the default SSH port on a Red Hat Enterprise Linux VM hosted in Azure:

The steps that need to be performed are:
– Allow the new port in the RHEL firewall
– Change the SSH daemon to listen on the new port
– Add an incoming rule in the VM network security group for the new port
– Remove the rule that allows port 22

The Ubuntu process will be pretty close to this as well.
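
The OS-side pieces look roughly like the sketch below (written in Python only to keep the examples in one language; the port number is arbitrary and this is not Paul's exact procedure – on RHEL with SELinux enforcing you'll also need to relabel the port). The network security group changes in steps three and four are done on the Azure side, for example with az network nsg rule create or in the portal.

```python
import subprocess

NEW_PORT = "2222"  # arbitrary example port; run this as root on the VM

# Step 1: allow the new port through the RHEL firewall (firewalld).
subprocess.run(["firewall-cmd", "--permanent", f"--add-port={NEW_PORT}/tcp"], check=True)
subprocess.run(["firewall-cmd", "--reload"], check=True)

# SELinux must also allow sshd to bind to the non-default port.
subprocess.run(["semanage", "port", "-a", "-t", "ssh_port_t", "-p", "tcp", NEW_PORT], check=True)

# Step 2: point the SSH daemon at the new port and restart it.
# (Simplified: appends a Port directive rather than editing sshd_config in place.)
with open("/etc/ssh/sshd_config", "a") as f:
    f.write(f"\nPort {NEW_PORT}\n")
subprocess.run(["systemctl", "restart", "sshd"], check=True)
```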


Password Protect Everything, Including Hadoop

George Leopold summarizes a recent Securonix report:

The malware spreads via brute-force attacks on weak passwords “or by exploiting one of three vulnerabilities found on Hadoop YARN Resource Manager, Redis [in-memory key-value store service] and ActiveMQ,” Securonix said. Once logged into database services, the malware can, for example, delete existing databases stored on a server and create another with a ransom note specifying a bitcoin payment.

The security analyst recommends continuous review of cloud-based services like Hadoop and YARN instances and their exposure to the Internet. Along with strong passwords, companies should “restrict access whenever possible to reduce the potential attack surface.”

It’s pretty standard advice: secure your data, password-protect your systems, and minimize the number of computers that get to touch your computers.


Authentication With Azure Managed Instances

Hamish Watson explains that Windows authentication is not available with Azure Managed Instances:

Azure SQL Managed Instances do not utilise Windows authentication – so your two methods of authenticating applications and users are:
– SQL Authentication: This authentication method uses a username and password.
– Azure Active Directory Authentication: This authentication method uses identities managed by Azure Active Directory and is supported for managed and integrated domains. Use Active Directory authentication (integrated security) whenever possible.

Hamish also elaborates on some of the trickier bits about Azure Active Directory for someone used to on-prem AD solutions.
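
As a quick illustration of the two options from a client application, here's a Python/pyodbc sketch; the server, database, and credential values are placeholders, and Azure AD authentication also requires a sufficiently recent ODBC driver:

```python
import pyodbc

server = "your-managed-instance-host"  # placeholder for the MI endpoint

# Option 1: SQL Authentication - a username and password stored in SQL Server.
sql_auth = pyodbc.connect(
    f"Driver={{ODBC Driver 17 for SQL Server}};Server={server};"
    "Database=mydb;UID=app_user;PWD=<password>;"
)

# Option 2: Azure Active Directory Authentication - identities managed in AAD.
aad_auth = pyodbc.connect(
    f"Driver={{ODBC Driver 17 for SQL Server}};Server={server};"
    "Database=mydb;Authentication=ActiveDirectoryPassword;"
    "UID=user@contoso.com;PWD=<password>;"
)
```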


Rotating Expired TDE Certificates

Chris Bell shows how you can quickly rotate TDE certificates, hopefully before they expire:

We have expired or expiring SQL TDE certificates! What now?
Well, the first thing we do is not panic. Even if our TDE certificate expires it won’t cause any issues. The SQL Server will continue to work normally. Even if we restore the DB elsewhere using the expired certificate we will just get a warning that the certificate is expired.
A warning is nice, and the system still working lets us breathe a little easier, but we know that an updated certificate is a much better thing to have. In fact, setting up a regular key rotation schedule is even better and a recommended practice.

Good information, and Chris shares scripts to make it easy.
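
The rotation itself is only a few statements. Here's a minimal sketch (run through Python only to keep these examples in one language – the T-SQL is what matters, and the certificate, database, path, and password values are hypothetical, not Chris's scripts):

```python
import pyodbc

# Hypothetical connection; BACKUP CERTIFICATE cannot run inside a user transaction,
# hence autocommit.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=myserver;Database=master;Trusted_Connection=yes;",
    autocommit=True,
)
cur = conn.cursor()

# 1. Create the replacement certificate in master.
cur.execute("""
    CREATE CERTIFICATE TDE_Cert_2025
    WITH SUBJECT = 'TDE certificate rotated 2025', EXPIRY_DATE = '2026-12-31';
""")

# 2. Back up the certificate and its private key immediately - restores elsewhere
#    depend on it.
cur.execute(r"""
    BACKUP CERTIFICATE TDE_Cert_2025
    TO FILE = 'D:\Backup\TDE_Cert_2025.cer'
    WITH PRIVATE KEY (
        FILE = 'D:\Backup\TDE_Cert_2025.pvk',
        ENCRYPTION BY PASSWORD = '<strong password>');
""")

# 3. Re-encrypt the database encryption key with the new certificate.
cur.execute("USE MyEncryptedDb; ALTER DATABASE ENCRYPTION KEY "
            "ENCRYPTION BY SERVER CERTIFICATE TDE_Cert_2025;")
```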


Azure Databricks And Active Directory

Tristan Robinson wraps up a two-parter on Azure Databricks security:

With the addition of Databricks runtime 5.1, which was released December 2018, comes the ability to use Azure AD credential pass-through. This is a huge step forward since there is no longer a need to control user permissions through Databricks Groups / Bash and then assigning these groups access to secrets to access Data Lake at runtime. As mentioned previously – with the lack of support for AAD within Databricks currently, ACL activities were done on an individual basis which was not ideal. By using this feature, you can now pass the authentication onto Data Lake, and as we know one of the advantages of Data Lake is the tight integration into Active Directory, so this simplifies things. It’s worth noting that this feature is currently in public preview but, having tested it thoroughly, I am happy with the implementation/limitations. The feature also requires a premium workspace and only works with high concurrency clusters – both of which you’d expect to use in this scenario.

It looks like this is the way to go forward with securing Azure Databricks. Read the whole thing.
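
Once pass-through is enabled on a premium, high-concurrency cluster, the notebook code itself gets simpler: no service principal or secret handling, just your own AAD identity. A hypothetical example (the storage account, container, and path are placeholders; spark and display are provided by the Databricks notebook environment):

```python
# With AAD credential pass-through enabled on the cluster, the notebook user's
# identity flows through to Azure Data Lake Storage, so this read succeeds or
# fails based on that user's ACLs - no keys or secrets in the notebook.
df = spark.read.csv(
    "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/raw/sales/",
    header=True,
    inferSchema=True,
)
display(df.limit(10))
```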


Azure Databricks Security

Tristan Robinson looks at what’s currently available in terms of security on Azure Databricks:

You’ll notice that as part of this I’m retrieving the secrets/GUIDS I need for the connection from somewhere else – namely the Databricks-backed secrets store. This avoids exposing those secrets in plain text in your notebook – again this would not be ideal. The secret access is then based on an ACL (access control list) so I can only connect to Data Lake if I’m granted access into the secrets. While it is also possible to connect Databricks up to the Azure Key Vault and use this as the secrets store instead, when I tried to configure this I was denied based on permissions. After research I was unable to overcome the issue. This would be more ideal to use but unfortunately there is limited support currently and the fact the error message contained spelling mistakes suggests to me the functionality is not yet mature.

To be charitable, there appears to be room for implementation improvement.
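
The secret retrieval Tristan describes looks roughly like this in a notebook (a sketch, not his code; the scope, key, and storage account names are placeholders, and dbutils is provided by the Databricks runtime):

```python
# Pull the service principal's credentials from a Databricks-backed secret scope
# rather than hard-coding them in the notebook. Access to the scope is governed
# by the secret ACLs mentioned above.
client_id = dbutils.secrets.get(scope="data-lake-scope", key="sp-client-id")
client_secret = dbutils.secrets.get(scope="data-lake-scope", key="sp-client-secret")
tenant_id = dbutils.secrets.get(scope="data-lake-scope", key="tenant-id")

# Configure OAuth access to ADLS Gen2 with those values (storage account name is
# a placeholder). Secret values are redacted if you try to print them.
acct = "mystorageaccount.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{acct}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{acct}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{acct}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{acct}", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{acct}",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")
```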


Minimal Rights For Bulk Inserts

Timothy Smith takes us through least privilege while allowing bulk insert operations:

While this file path serves as a useful location for us to load flat files, we should consider that the user account executing the underlying insert statement must be able to read (and possibly write to) that file location. The writing part of the equation comes in when logging is involved: even if the permissions on the written logging data are tied down strictly, the user doesn’t control what gets written, only that errors are written. At the least, we want to ensure that a separate folder with strict permissions exists for any flat file import to restrict the account’s access – notice that we’re not reading off the root drive, as we’ve seen that we can insert an entire file of data – think about using SQL bulk insert to view files through SQL Server by inserting the file’s data and reviewing it.

It’s more than just “check the box for the server-level role.”
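
As an illustration of the shape of this (a Python/pyodbc sketch with hypothetical login, table, folder, and file names – the point being that the import folder is dedicated and tightly permissioned, and the executing account holds only INSERT on the target table plus ADMINISTER BULK OPERATIONS):

```python
import pyodbc

# Hypothetical connection for a low-privilege import account.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=myserver;Database=ImportDb;UID=bulk_loader;PWD=<password>;"
)

# The account reads only from a dedicated, locked-down import folder -
# never from a broad location such as the root of a drive.
conn.cursor().execute(r"""
    BULK INSERT dbo.StagingSales
    FROM 'D:\ImportFiles\sales_20190101.csv'
    WITH (
        FIRSTROW = 2,
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        ERRORFILE = 'D:\ImportFiles\Errors\sales_20190101.err'  -- why write access may be needed
    );
""")
conn.commit()
```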
