Press "Enter" to skip to content

Author: Kevin Feasel

Always Encrypted In Azure SQL Database

Jakub Szymaszek notes that Azure SQL Database can now support Always Encrypted:

I’m happy to announce Always Encrypted in Azure SQL Database is now generally available!

Always Encrypted is a feature designed to ensure sensitive data and its corresponding encryption keys are never revealed in plaintext to the database system. With Always Encrypted enabled, a SQL client driver encrypts and decrypts sensitive data inside client applications or application servers, by using keys stored in a trusted key store, such as Azure Key Vault or Windows Certificate Store on a client machine. As a result, even database administrators, other high privilege users, or attackers gaining illegal access to Azure SQL Database, cannot access the data.

To be honest, I’d much rather try Always Encrypted against an Azure SQL Database instance than an on-premises instance, mostly because if I hose Azure SQL Database that badly or the company decides that Always Encrypted isn’t a good fit, I can grab the data and dump the instance.  It’s a little harder to do that with physical hardware or even an on-prem VM.
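
For a sense of what the setup looks like on the database side, here is a minimal T-SQL sketch; the key names, Key Vault path, and ENCRYPTED_VALUE bytes are placeholders (in practice, SSMS or PowerShell generates these statements, including the real encrypted key value, for you):

-- Column master key metadata pointing at a key in Azure Key Vault.
CREATE COLUMN MASTER KEY [CMK_Demo]
WITH (
    KEY_STORE_PROVIDER_NAME = 'AZURE_KEY_VAULT',
    KEY_PATH = 'https://myvault.vault.azure.net/keys/DemoKey/0123456789abcdef'
);

-- Column encryption key, stored encrypted by the column master key.
CREATE COLUMN ENCRYPTION KEY [CEK_Demo]
WITH VALUES (
    COLUMN_MASTER_KEY = [CMK_Demo],
    ALGORITHM = 'RSA_OAEP',
    ENCRYPTED_VALUE = 0x016E000001  -- placeholder bytes, not a real encrypted value
);

-- Table with an encrypted column; DETERMINISTIC encryption allows equality
-- lookups, which is why the column needs a BIN2 collation.
CREATE TABLE dbo.Customers
(
    CustomerID int IDENTITY(1,1) PRIMARY KEY,
    SSN char(11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (
            COLUMN_ENCRYPTION_KEY = [CEK_Demo],
            ENCRYPTION_TYPE = DETERMINISTIC,
            ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
        ) NOT NULL
);

The application side then just adds Column Encryption Setting=Enabled to its connection string, and the client driver handles encryption and decryption transparently.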


Quantified Comparison Predicates

Lukas Eder discusses the ALL, SOME, and ANY predicates:

Ultimately, you should always choose performance first, and then – most certainly – intuitiveness second (because some poor soul might need to maintain your query). But personally, I find these quantifiers quite elegant for three reasons:

  1. They express the quantification right where it belongs. With the comparison operator. Compare this with the solution using LIMIT, which may be far away, visually, from the greater-than operator. Quantifiers are much more concise, even than when using MAX() (in my opinion)

  2. They’re very set oriented. I like thinking in terms of sets when I work with SQL. Whenever I can omit the ORDER BY clause, I will. If only to avoid potentially slow operations (in case the database doesn’t optimise this, and a full O(N log N) sort operation is invoked)

  3. Quantified comparison predicates work on rows too, not just on single values.

I’ve known about these, but could probably count on one hand the number of times I’ve ever used one.
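
To make the comparison concrete, here is a sketch with made-up table and column names, showing the quantified form next to the MAX() rewrite:

-- Quantified comparison: the quantifier sits directly on the operator.
SELECT p.name, p.price
FROM products AS p
WHERE p.price >= ALL (SELECT p2.price
                      FROM products AS p2
                      WHERE p2.category_id = 42);

-- The MAX() rewrite most of us would reach for instead.
SELECT p.name, p.price
FROM products AS p
WHERE p.price >= (SELECT MAX(p2.price)
                  FROM products AS p2
                  WHERE p2.category_id = 42);

One subtlety worth knowing: the two are not identical when the subquery returns no rows.  >= ALL is true against an empty set, while >= against a NULL MAX() filters every row out.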


Understanding Estimation Variance

Aaron Bertrand discusses drift between estimates and actuals in execution plans:

Now, inaccurate estimates won’t always be a problem, but it can cause issues with inefficient plan choices at the two extremes. A single plan might not be optimal when the chosen range will yield a very small or very large percentage of the table or index, and this can get very hard for SQL Server to predict when the data distribution is uneven. Joseph Sack outlined the more typical things bad estimates can affect in his post, “Ten Common Threats to Execution Plan Quality:”

Read the whole thing.
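
If you want to peek at the numbers the optimizer is actually working from, the statistics histogram is the place to look.  A quick sketch, with hypothetical object and index names:

-- Show the histogram steps behind a statistics object.  RANGE_HI_KEY,
-- EQ_ROWS, and AVG_RANGE_ROWS are what row estimates for a range
-- predicate are derived from.
DBCC SHOW_STATISTICS ('dbo.Orders', 'IX_Orders_OrderDate') WITH HISTOGRAM;

When the distribution is skewed, comparing these steps against the estimated and actual row counts in the plan makes the source of the drift much easier to see.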


Azure Automation

Steph Locke looks at Azure Automation:

Azure Automation is essentially a hosted PowerShell script execution service. It seems to be aimed primarily at managing Azure resources, particularly via Desired State Configurations.

It is, however, a general PowerShell powerhouse, with scheduling capabilities and a bunch of useful features for the safe storage of credentials etc. This makes it an excellent tool if you’re looking to do something with PowerShell on a regular basis and need to interact with Azure.

Read the whole thing.


SSIS Stream Reading And Writing

Jorge Novo has a Script task to read from a file as a stream and write to a different file as a stream:

How many times do you need to stream reading from a file and writing into another?  Well, here it is… Here we are using special characters char(30) and char(31) as delimiters, so a quick replace using regex did it. It’s Simple, Quick, and Dirty 🙂

The last ten lines are the important ones.  You’d use this stream read+write pattern when dealing with files larger than your total amount of available RAM.


Hadoop For .NET Developers

Elton Stoneman is working on a new Pluralsight course:

Actually I’m talking about .NET Core. That Docker image is built on Debian Jessie, and it installs Hadoop and .NET Core. You can run compiled .NET Core assemblies just by running dotnet /path/to/my/assembly.dll. The same image can run as Hadoop master and worker nodes, so all workers have .NET Core installed and can run .NET Core apps.

Which means you can run .NET Core apps as mappers or reducers in Hadoop jobs, thanks to the Hadoop Streaming interface. Hadoop Streaming lets you shell out to any program the host can execute, to run the tasks for your job. Your code integrates with Hadoop at a basic level – through the standard input and output streams.

Elton has some good courses on other Hadoop-and-.NET topics (like his HDInsight deep dive course).  It looks like running containerized Hadoop + .NET Core is getting closer, but still has room for improvement.


Dealing With Azure Service Disruptions

Vincent-Philippe Lauzon discusses various potential solutions to Azure service disruptions:

What does happen in Azure is Service disruption.  Sometimes those are due to some hardware failures, but most of the time, they are a software problem:  an upgrade in Azure software (Microsoft’s software) gone wrong.  They happen occasionally, are typically short lived but if business continuity is a hard requirement, those are the ones you should protect your solution against.

In this article I’ll cover resilience for virtual machines.  I’ll assume that you are using Azure Traffic Manager or other DNS service to fail over the traffic from a primary to a secondary region.

He covers “stateless” VMs (like web servers or app servers) as well as “stateful” VMs (like database servers) and explains disaster recovery options for each.


DBCC CLONEDATABASE

Kennie Pontoppidan talks about a new DBCC command:

This command enables you to take a copy of the schema and the statistics for an existing database into a new database. According to the documentation, DBCC CLONEDATABASE

  • Creates a new destination database that uses the same file layout as the source but with default file sizes as the model database.

  • Creates an internal snapshot of the source database.

  • Copies the system metadata from the source to the destination database.

  • Copies all schema for all objects from the source to the destination database.

  • Copies statistics for all indexes from the source to the destination database.

I knew there were ways of scripting out statistics, but DBCC CLONEDATABASE looks like a new and interesting beast.
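
Usage is a one-liner; the database names below are just examples:

-- Clone the schema and statistics (but no data) of a source database
-- into a new, read-only database.
DBCC CLONEDATABASE (SourceDB, SourceDB_Clone);

Because the clone carries the statistics but none of the data, you can reproduce a query’s estimated plan against it without hauling the data around, which makes it handy for troubleshooting optimizer issues.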


Database Mail Requires TLS 1.0

Ryan Adams discovered that Database Mail cannot use TLS 1.2 at this time:

You may recall something called the POODLE attack that revealed a vulnerability in SSL 3.0 and TLS 1.0.  This particular server had SSL 3.0, TLS 1.0, and TLS 1.1 disabled in the registry.  Also note that TLS 1.2 was NOT disabled.  The server was running Windows 2012 R2.  These protocols were disabled to prevent the possibility of a POODLE attack.  If you are wondering how to disable these protocols on your servers then look at Microsoft Security Advisory 3009008.  To disable them for the whole OS scroll down to the Suggested Actions section and look under the heading “Disable SSL 3.0 in Windows For Server Software”.

I also want to note that the PCI Security Standards Council pushed back the date for getting off of SSL and TLS 1.0 to June 30th, 2018.  In addition to that, it should also be noted that Microsoft’s Schannel implementation of TLS 1.0 is patched against all known vulnerabilities.

The root cause is interesting:  it’s because Database Mail requires .NET Framework 3.5.  Ryan has more details, including a fix, so read on.
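
Once the protocols are sorted out on a given server, a quick smoke test from T-SQL will confirm mail is flowing again; the profile name and address below are placeholders:

-- Send a test message through an existing Database Mail profile.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'DBA Mail Profile',   -- placeholder profile
    @recipients   = N'dba@example.com',    -- placeholder address
    @subject      = N'Database Mail TLS test',
    @body         = N'If this arrives, Database Mail negotiated TLS successfully.';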


SSIS Deployment Models

Ginger Grant argues in favor of the project deployment model in SSIS:

Prior to SQL Server 2012, there was no project deployment. SSIS code was all deployed as packages. These packages could be stored within MSDB or they could be stored and run from the file system. In disorganized places like the one where I worked, they were deployed in both. Assuming nothing much changed since I left, they have all versions of SQL Server which were released prior to the day the new IT Director started in 2012. There was DTS on SQL Server 97, 2000 and SSIS on 2005, 2008 and 2008 R2. No reason to upgrade anything which still worked was their motto. When space was a problem, one could always go build another server. I think the LAN administrator was happiest when he was able to justify building a new server as he could spend hours shopping for parts on the internet and building the latest server.

I believe that, in the business, we call this a “nightmare scenario.”  Read on for ways in which a project model would help.
