Azure Automation

Steph Locke looks at Azure Automation:

Azure Automation is essentially a hosted PowerShell script execution service. It seems to be aimed primarily at managing Azure resources, particularly via Desired State Configurations.

It is, however, a general PowerShell powerhouse, with scheduling capabilities and a bunch of useful features for the safe storage of credentials etc. This makes it an excellent tool if you’re looking to do something with PowerShell on a regular basis and need to interact with Azure.
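
As a concrete (if minimal) sketch of the pattern, a plain PowerShell runbook can pull a stored credential asset and use it without a password ever appearing in the script.  The asset name “AzureRunAccount” below is hypothetical:

    # Minimal Azure Automation runbook sketch.
    # "AzureRunAccount" is a hypothetical credential asset stored in the Automation account.
    $cred = Get-AutomationPSCredential -Name 'AzureRunAccount'

    # Authenticate to Azure with the stored credential, then do a trivial task.
    Add-AzureRmAccount -Credential $cred
    Get-AzureRmVM | Select-Object Name, ResourceGroupName

Attach the runbook to a schedule asset and it runs unattended on whatever cadence you define.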

Read the whole thing.

SSIS Stream Reading And Writing

Jorge Novo has a Script task to read from a file as a stream and write to a different file as a stream:

How many times do you need to stream reading from one file and writing into another? Well, here it is… Here we are using the special characters char(30) and char(31) as delimiters, so a quick replace using regex did it. It’s simple, quick, and dirty 🙂

The last ten lines are the important ones.  You’d use this stream read+write pattern when dealing with files larger than your total amount of available RAM.
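
If you want to prototype the same pattern outside of an SSIS Script task, here is a rough PowerShell equivalent (the file paths and the pipe replacement are hypothetical).  It reads one line at a time, swaps the char(30)/char(31) delimiters via regex, and never loads the whole file into memory:

    # Stream from one file to another, replacing char(30)/char(31) delimiters.
    $reader = New-Object System.IO.StreamReader 'C:\temp\input.dat'
    $writer = New-Object System.IO.StreamWriter 'C:\temp\output.dat'
    try {
        while (($line = $reader.ReadLine()) -ne $null) {
            # -replace is regex-based; chars 30 and 31 are the record/unit separators.
            $writer.WriteLine(($line -replace [char]31, '|' -replace [char]30, ''))
        }
    }
    finally {
        $reader.Dispose()
        $writer.Dispose()
    }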

Hadoop For .NET Developers

Elton Stoneman is working on a new Pluralsight course:

Actually I’m talking about .NET Core. That Docker image is built on Debian Jessie, and it installs Hadoop and .NET Core. You can run compiled .NET Core assemblies just by running dotnet /path/to/my/assembly.dll. The same image can run as Hadoop master and worker nodes, so all workers have .NET Core installed and can run .NET Core apps.

Which means you can run .NET Core apps as mappers or reducers in Hadoop jobs, thanks to the Hadoop Streaming interface. Hadoop Streaming lets you shell out to any program the host can execute, to run the tasks for your job. Your code integrates with Hadoop at a basic level – through the standard input and output streams.

Elton has some good courses on other Hadoop-and-.NET topics (like his HDInsight deep dive course).  It looks like running containerized Hadoop + .NET Core is getting closer, but still has room for improvement.
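
The contract Hadoop Streaming imposes really is that basic: read lines from standard input, write tab-separated key/value pairs to standard output.  Any executable will do, so a word-count mapper can be sketched in a few lines of PowerShell (the quoted post uses .NET Core for the same job):

    # Minimal Hadoop Streaming mapper sketch: word count.
    # Reads lines from stdin and emits "word<TAB>1" pairs on stdout.
    while (($line = [Console]::In.ReadLine()) -ne $null) {
        foreach ($word in ($line -split '\s+' | Where-Object { $_ })) {
            [Console]::Out.WriteLine("$word`t1")
        }
    }

Hadoop then shuffles the pairs by key and hands them to whatever reducer you register for the job.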

Dealing With Azure Service Disruptions

Vincent-Philippe Lauzon discusses various potential solutions to Azure service disruptions:

What does happen in Azure is service disruption.  Sometimes those are due to hardware failures, but most of the time they are a software problem:  an upgrade in Azure software (Microsoft’s software) gone wrong.  They happen occasionally and are typically short-lived, but if business continuity is a hard requirement, those are the ones you should protect your solution against.

In this article I’ll cover resilience for virtual machines.  I’ll assume that you are using Azure Traffic Manager or another DNS service to fail over the traffic from a primary to a secondary region.

He covers “stateless” VMs (like web servers or app servers) as well as “stateful” VMs (like database servers) and explains disaster recovery options for each.
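
As a hedged sketch of the failover mechanics on the Traffic Manager side: with the AzureRM PowerShell module you can drain traffic from a failed region by disabling its endpoint, after which DNS answers point clients at the secondary.  The profile, endpoint, and resource group names here are hypothetical, and in practice the profile’s health probes usually handle this automatically:

    # Manually fail over by disabling the primary region's endpoint (sketch).
    Disable-AzureRmTrafficManagerEndpoint -Name 'primary-endpoint' `
        -Type AzureEndpoints `
        -ProfileName 'my-tm-profile' `
        -ResourceGroupName 'my-resource-group' `
        -Force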

DBCC CLONEDATABASE

Kennie Pontoppidan talks about a new DBCC command:

This command enables you to take a copy of the schema and the statistics for an existing database into a new database. According to the documentation, DBCC CLONEDATABASE:

  • Creates a new destination database that uses the same file layout as the source, but with default file sizes taken from the model database.

  • Creates an internal snapshot of the source database.

  • Copies the system metadata from the source to the destination database.

  • Copies all schema for all objects from the source to the destination database.

  • Copies statistics for all indexes from the source to the destination database.

I knew there were ways of scripting out statistics, but DBCC CLONEDATABASE looks like a new and interesting beast.
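
The syntax is as simple as the feature list suggests.  A quick sketch with hypothetical database names, run here through Invoke-Sqlcmd (plain T-SQL in SSMS works just as well):

    # Clone schema and statistics (no data) into a new database.
    # The target database must not already exist.
    Invoke-Sqlcmd -ServerInstance 'localhost' `
        -Query 'DBCC CLONEDATABASE (AdventureWorks2014, AdventureWorks2014_Clone);'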

Database Mail Requires TLS 1.0

Ryan Adams discovered that Database Mail cannot use TLS 1.2 at this time:

You may recall something called the POODLE attack that revealed a vulnerability in SSL 3.0 and TLS 1.0.  This particular server had SSL 3.0, TLS 1.0, and TLS 1.1 disabled in the registry.  Also note that TLS 1.2 was NOT disabled.  The server was running Windows 2012 R2.  These protocols were disabled to prevent the possibility of a POODLE attack.  If you are wondering how to disable these protocols on your servers, then look at Microsoft Security Advisory 3009008.  To disable them for the whole OS, scroll down to the Suggested Actions section and look under the heading “Disable SSL 3.0 in Windows For Server Software”.

I also want to note that the PCI Security Standards Council pushed back the date for getting off of SSL and TLS 1.0 to June 30th, 2018.  In addition to that, it should also be noted that Microsoft’s Schannel implementation of TLS 1.0 is patched against all known vulnerabilities.

The root cause is interesting:  it’s because Database Mail requires .NET Framework 3.5.  Ryan has more details, including a fix, so read on.
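
If you want to check how a given server is configured, the switches the advisory describes live under the Schannel registry key.  A quick read-only sketch (key layout per Microsoft Security Advisory 3009008; a missing subkey just means the OS default applies):

    # Report Schannel protocol settings; absent subkeys fall back to OS defaults.
    $base = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols'
    foreach ($proto in 'SSL 3.0', 'TLS 1.0', 'TLS 1.1', 'TLS 1.2') {
        foreach ($role in 'Server', 'Client') {
            $key = Join-Path $base "$proto\$role"
            if (Test-Path $key) {
                $v = Get-ItemProperty -Path $key
                "$proto ($role): Enabled=$($v.Enabled) DisabledByDefault=$($v.DisabledByDefault)"
            }
        }
    }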

SSIS Deployment Models

Ginger Grant argues in favor of the project deployment model in SSIS:

Prior to SQL Server 2012, there was no project deployment. SSIS code was all deployed as packages. These packages could be stored within MSDB or they could be stored and run from the file system. In disorganized places like the one where I worked, they were deployed in both. Assuming nothing much has changed since I left, they have every version of SQL Server released prior to the day the new IT Director started in 2012. There was DTS on SQL Server 7.0 and 2000, and SSIS on 2005, 2008, and 2008 R2. “No reason to upgrade anything which still worked” was their motto. When space was a problem, one could always go build another server. I think the LAN administrator was happiest when he was able to justify building a new server, as he could spend hours shopping for parts on the internet and building the latest server.

I believe that, in the business, we call this a “nightmare scenario.”  Read on for ways in which a project model would help.

Introduction To R

Paul Hernandez has an introduction to using the R client and RODBC to connect to SQL Server:

The first step is to load the RevoScaleR library. This is an amazing library that allows you to create scalable and performant applications with R.

Then a connection string is defined, in my case using Windows Authentication. If you want to use SQL Server authentication, the user name and password are needed.

We define a local folder as the compute context.

RxInSQLServer: generates a SQL Server compute context using SQL Server R Services – documentation

Sample query: I already prepared the dataset in the view; this is a best practice to reduce the size of the query in the R code, and for me it is also easier to maintain.

I think there’s a lot of value in learning R, regardless of whether you have “data analyst” in your role or job title.

MDS Installation Pre-Requisites

Cody Konior explains pre-requisites for installing Master Data Services for SQL Server 2016:

Microsoft’s list of MDS prerequisites doesn’t match their PowerShell installation script “sample”.

  • The sample installs the Application Server component which is not listed in the requirements.

  • The sample installs the Application Server NET 4.5 component even though it’s not in the requirements (it lists .NET Framework 4.5 Advanced Services, but a look through other versions of the OS would indicate that this is the plain .NET Framework 4.5 Features category; I suspect someone misread this as “Application Services”).

  • The sample installs additional “Application Development” components that are not listed in the requirements.

  • The sample doesn’t install the recommended Dynamic Content Compression component.

Basically, the sample needs to be updated and the documentation fixed.  Click through if you’re planning to use MDS.
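
Until that happens, here is a rough sketch of an installation closer to the documented requirements.  The exact feature list is an assumption pieced together from the requirements page (names follow Windows Server 2012 R2 conventions), so verify it before relying on it:

    # Install the IIS/.NET features MDS requires (sketch; verify the list
    # against the current MDS prerequisites documentation).
    Install-WindowsFeature -Name Web-Server, Web-Windows-Auth, Web-Asp-Net45,
        NET-Framework-45-Features, Web-Mgmt-Console, Web-Dyn-Compression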

Excel MDX Performance Improvements

Chris Webb notes performance improvements in Excel 2016 for PivotTables which connect to Analysis Services:

In that post, Microsoft point out that how much of a performance increase you get will depend on a number of factors. I guess they have to do this to manage expectations. However, I’m going to come right out and say that these changes are probably the most important thing that has happened in the last five years for Analysis Services or Power Pivot query performance, and if you are using Excel PivotTables with SSAS or Power Pivot for reporting, you should upgrade to Excel 2016 (you’ll need the Office 365 click-to-run version) just for these changes.

The key improvement that Microsoft have made here is to stop Excel bringing back unnecessary subtotal values in the MDX queries that it generates. This has been a gigantic problem for years and several people have blogged about it: I did here, and among other people Rui Quintino and Richard Lees have posts on the subject that are worth reading. A lot of my consultancy work involves SSAS and MDX performance tuning and I think I see around five to ten customers every year who are seriously affected by this – and I’m sure this is just the tip of the iceberg. Even if you don’t think you are affected, you may have users who are putting up with slow refresh times for Excel-based reports without your knowledge.

That’s a pretty big statement.  Read the whole thing.
