Press "Enter" to skip to content

Author: Kevin Feasel

SSMS And Azure SQL Data Warehouse

Rob Farley reports that we can now use Management Studio to connect to Azure SQL Data Warehouse:

One of the biggest frustrations that people find with SQL DW is that you need (or rather, needed) to use SSDT to connect to it. You couldn’t use SSMS. And let’s face it – while the ‘recommended’ approach may be to use SSDT for all database development, most people I come across tend to use SSMS.

But now with the July 2016 update of SSMS, you can finally connect to SQL DW using SQL Server Management Studio. Hurrah!

…except that it’s perhaps not quite that easy. There’s a few gotchas to be conscious of, plus a couple of things that caused me frustrations perhaps more than I’d’ve liked.

Yes, it’s never quite that easy…  Read the whole thing.

A Wish For Shutdown Procedures

Dave Mason wants shutdown procedures, analogous to the startup procedures we can already configure:

If ‘shutdown’ was valid for the @OptionName parameter, I know exactly what I’d do: configure sp_procoption to execute a stored proc that sends an email alert. It would be great to know when a SQL instance goes down, wouldn’t it? Looking at the log file, we can see entries indicating the instance was shut down:

Dave does mention potential failure scenarios, but I agree:  it’d be nice to have the ability to run procedures at shutdown to perform certain actions.
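
For reference, here’s roughly how the existing startup hook is wired up with sp_procoption (the procedure names are hypothetical); the commented-out call is the shutdown equivalent Dave is wishing for, which doesn’t exist today:

    USE master;
    GO

    -- Real, supported today: flag a parameterless procedure in master to run at startup.
    EXEC sys.sp_procoption
        @ProcName    = N'dbo.NotifyOnStartup',   -- hypothetical proc that sends a Database Mail alert
        @OptionName  = 'startup',
        @OptionValue = 'on';
    GO

    -- Dave's wish (not valid today: 'shutdown' is not an accepted @OptionName value):
    -- EXEC sys.sp_procoption
    --     @ProcName    = N'dbo.NotifyOnShutdown',
    --     @OptionName  = 'shutdown',
    --     @OptionValue = 'on';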

When Servers Behave Differently

Grant Fritchey tells a tale of two servers which behave quite differently when running the same query:

Now what?

Well obviously something somewhere is different. Start by comparing everything on both servers and both databases down to… hang on, here, we’ll write a PowerShell script and then….

Wait, wait, wait!

You have the execution plans? Before we start digging through all the properties everywhere and comparing everything to everything, updating statistics 14 times, and all the rest, what if we look at the execution plans. They’re different, so we should start looking at scans & indexes & statistics &….

Wait!

Sometimes, it’s the little things that matter.

Using Biml To Load From Flat Files

Ben Weissman shows how to use Biml to load data from flat files:

To make use of flat files, Biml will require one or more flat file formats as well as the corresponding flat file connections. A flat file connection is nothing but a link between a flat file format and the path to a flat file. While in real life, you might rather want to loop across multiple files using the same format, we’ll keep it simple for now and assume that we’ll have one file per format (which is the case here anyway due to our AdventureWorks sample data).
For reasons of readability and also the ability to easily reuse our code, we’ll make use of a new extension method called GetFlatFileFormatfromXML. If you have not read our post on creating your own extension methods, you may want to do that first.

Getting the extension method in place is about 90% of the solution.  Read on to see the whole thing.

Changing AG Endpoint Ports

Matt Gordon walks through changing endpoint ports:

Since a communication failure within an AG is usually a “hair on fire” kind of event, a quick fix may be desired. The quickest fix I’ve found is to change the port on which the AG endpoint is listening. While the Microsoft help pages on this contain enough information to put together the right script eventually, the first time this happened to me I really would have liked a blog post specific to this issue that gave me the right script to use.

In keeping with this T-SQL Tuesday’s theme, below is my SQL Server present. Please note that I used 5023 as an example but your choice can be flexible depending on what else is consuming ports on your machine. Hopefully this helps somebody (or me if I manage to travel back in time and encounter this same issue)

In an ideal world, you’d want a bit more control over what’s running on various ports to prevent this scenario, but that’s probably wishful thinking.
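
Matt’s actual script is in his post; as a rough sketch of the kind of change involved (the endpoint, availability group, and server names below are placeholders, and 5023 is just his example port), it looks something like this:

    -- On each affected replica: move the database mirroring endpoint to the new port.
    ALTER ENDPOINT [Hadr_endpoint]
        AS TCP (LISTENER_PORT = 5023);

    -- On the primary: point the AG at the replica's new endpoint URL.
    ALTER AVAILABILITY GROUP [MyAG]
        MODIFY REPLICA ON N'SQLNODE01'
        WITH (ENDPOINT_URL = N'TCP://sqlnode01.contoso.com:5023');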

Always Encrypted In Azure SQL Database

Jakub Szymaszek notes that Azure SQL Database now supports Always Encrypted:

I’m happy to announce Always Encrypted in Azure SQL Database is now generally available!

Always Encrypted is a feature designed to ensure sensitive data and its corresponding encryption keys are never revealed in plaintext to the database system. With Always Encrypted enabled, a SQL client driver encrypts and decrypts sensitive data inside client applications or application servers, by using keys stored in a trusted key store, such as Azure Key Vault or Windows Certificate Store on a client machine. As a result, even database administrators, other high privilege users, or attackers gaining illegal access to Azure SQL Database, cannot access the data.

To be honest, I’d much rather try Always Encrypted against an Azure SQL Database instance than an on-premises instance, mostly because if I hose Azure SQL Database that badly or the company decides that Always Encrypted isn’t a good fit, I can grab the data and dump the instance.  It’s a little harder to do that with physical hardware or even an on-prem VM.
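
As a rough illustration of what the feature looks like from the T-SQL side (the table is invented, and it assumes a column encryption key named CEK_Auto1 has already been created against a column master key in Azure Key Vault or a certificate store):

    -- Sketch only: the SSN column is encrypted client-side with Always Encrypted.
    -- Assumes a column encryption key named CEK_Auto1 already exists in the database.
    CREATE TABLE dbo.Patients
    (
        PatientID int IDENTITY(1,1) PRIMARY KEY,
        SSN char(11) COLLATE Latin1_General_BIN2
            ENCRYPTED WITH
            (
                COLUMN_ENCRYPTION_KEY = CEK_Auto1,
                ENCRYPTION_TYPE = DETERMINISTIC,      -- deterministic allows equality lookups
                ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
            ),
        FullName nvarchar(100) NOT NULL               -- left in plaintext
    );

Encryption and decryption then happen in the client driver (with Column Encryption Setting=Enabled in the connection string), so plaintext never reaches the database engine.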

Quantified Comparison Predicates

Lukas Eder discusses the ALL, SOME, and ANY predicates:

Ultimately, you should always choose performance first, and then – most certainly – intuitiveness second (because some poor soul might need to maintain your query). But personally, I find these quantifiers quite elegant for three reasons:

  1. They express the quantification right where it belongs. With the comparison operator. Compare this with the solution using LIMIT, which may be far away, visually, from the greater-than operator. Quantifiers are much more concise, even than when using MAX() (in my opinion)

  2. They’re very set oriented. I like thinking in terms of sets when I work with SQL. Whenever I can omit the ORDER BY clause, I will. If only to avoid potentially slow operations (in case the database doesn’t optimise this, and a full O(N log N) sort operation is invoked)

  3. Quantified comparison predicates work on rows too, not just on single values.

I’ve known about these, but could probably count on one hand the number of times I’ve ever used one.
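
For anyone who hasn’t bumped into them, here’s a minimal sketch (invented table and column names) of a quantified predicate next to its aggregate equivalent:

    -- Employees earning at least as much as everyone in department 10,
    -- expressed with a quantified comparison predicate...
    SELECT e.EmployeeID, e.Salary
    FROM dbo.Employees AS e
    WHERE e.Salary >= ALL (SELECT d.Salary
                           FROM dbo.Employees AS d
                           WHERE d.DepartmentID = 10);

    -- ...and the more common formulation, where the quantification is pushed
    -- into an aggregate instead of sitting next to the comparison operator.
    SELECT e.EmployeeID, e.Salary
    FROM dbo.Employees AS e
    WHERE e.Salary >= (SELECT MAX(d.Salary)
                       FROM dbo.Employees AS d
                       WHERE d.DepartmentID = 10);

The two aren’t perfectly interchangeable at the edges, though: against an empty subquery, >= ALL is true for every row, while the MAX() version returns NULL and filters everything out.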

Understanding Estimation Variance

Aaron Bertrand discusses drift between estimates and actuals in execution plans:

Now, inaccurate estimates won’t always be a problem, but it can cause issues with inefficient plan choices at the two extremes. A single plan might not be optimal when the chosen range will yield a very small or very large percentage of the table or index, and this can get very hard for SQL Server to predict when the data distribution is uneven. Joseph Sack outlined the more typical things bad estimates can affect in his post, “Ten Common Threats to Execution Plan Quality:”

Read the whole thing.
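
As a very rough sketch of the general problem (the table, the skew, and the parameter values are invented; this is not Aaron’s demo), a single cached plan can carry a row estimate that fits only one end of a lopsided distribution:

    -- Hypothetical skewed table: 1,000 customers with one order each,
    -- plus one customer (42) with 100,000 orders.
    CREATE TABLE dbo.Orders
    (
        OrderID    int IDENTITY(1,1) PRIMARY KEY,
        CustomerID int NOT NULL
    );
    CREATE INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);

    INSERT INTO dbo.Orders (CustomerID)
    SELECT TOP (1000) ROW_NUMBER() OVER (ORDER BY object_id)
    FROM sys.all_objects;

    INSERT INTO dbo.Orders (CustomerID)
    SELECT TOP (100000) 42
    FROM sys.all_objects AS a
    CROSS JOIN sys.all_objects AS b;

    UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

    -- First execution compiles and caches a plan for a narrow range: tiny estimate.
    EXEC sys.sp_executesql
        N'SELECT COUNT(*) FROM dbo.Orders WHERE CustomerID BETWEEN @lo AND @hi;',
        N'@lo int, @hi int',
        @lo = 5, @hi = 10;

    -- Second execution reuses that plan for a range covering most of the table;
    -- compare estimated vs. actual rows in the two actual execution plans.
    EXEC sys.sp_executesql
        N'SELECT COUNT(*) FROM dbo.Orders WHERE CustomerID BETWEEN @lo AND @hi;',
        N'@lo int, @hi int',
        @lo = 1, @hi = 1000;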

Azure Automation

Steph Locke looks at Azure Automation:

Azure Automation is essentially a hosted PowerShell script execution service. It seems to be aimed primarily at managing Azure resources, particularly via Desired State Configurations.

It is, however, a general PowerShell powerhouse, with scheduling capabilities and a bunch of useful features for the safe storage of credentials etc. This makes it an excellent tool if you’re looking to do something with PowerShell on a regular basis and need to interact with Azure.

Read the whole thing.

SSIS Stream Reading And Writing

Jorge Novo has a Script task which reads from one file as a stream and writes to another:

How many times do you need to stream-read from one file and write into another? Well, here it is… We’re using the special characters char(30) and char(31) as delimiters, so a quick replace using regex did it. It’s simple, quick, and dirty 🙂

The last ten lines are the important ones.  You’d use this stream read+write pattern when dealing with files larger than your total amount of available RAM.
