Author: Kevin Feasel

DAX and Leap Days

Erik Svensen shows us what happens in DAX functions when February 29th rolls around:

Yesterday I visited a client and was asked – how do the time intelligence functions handle the fact that February has 29 days in 2020?

Well – in fact there were a few surprises depending on what you select from your date table.

Let’s look at some examples – I will use the following Internet Sales Amount from the years 2011-2013 from the Adventure Works database, where February 2012 has 29 days.

Click through for those surprises. They aren’t bugs, as the functions have to do something and the possible set of valid somethings is greater than one.

Creating a New Database in Azure Data Studio

Dave Bland shows how you can create a new database using Azure Data Studio:

Regardless of what tool we are using, SQL Server Management Studio or Azure Data Studio, the need to create new databases is always present. Using Transact-SQL is an option in both tools. What is not an option in both tools is to right-click and go “New Database”. This has been in SSMS for many years; however, it is NOT present in Azure Data Studio.

If you really do want to create a new database using a GUI in Azure Data Studio, Dave shows you the extension you need. It’s not as fully-featured as the wizard in Management Studio, so it would make sense to understand what that wizard is doing and learn the T-SQL yourself.
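To give a sense of what that T-SQL looks like, here is a minimal sketch (the database name, file paths, and sizes are invented for illustration) of the sort of CREATE DATABASE statement the SSMS wizard would script for you:

-- Hypothetical name, paths, and sizes; adjust to your environment.
CREATE DATABASE [SalesDemo]
ON PRIMARY
(
    NAME = N'SalesDemo',
    FILENAME = N'D:\SQLData\SalesDemo.mdf',
    SIZE = 256MB,
    FILEGROWTH = 64MB
)
LOG ON
(
    NAME = N'SalesDemo_log',
    FILENAME = N'L:\SQLLogs\SalesDemo_log.ldf',
    SIZE = 128MB,
    FILEGROWTH = 64MB
);

A bare CREATE DATABASE [SalesDemo]; works as well and simply inherits its file settings from the model database.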

Queueing Event Notifications with Service Broker

Max Vernon ties event notifications to Service Broker:

My previous post shows how to configure an Event Notification to fire whenever a login event occurs. The post uses Service Broker to receive those Event Notifications into a queue, which is then processed by a stored procedure and saved into a standard SQL Server database. This post provides a quick+dirty VB.Net command line monitor that shows how full a Service Broker queue is.

The following code should be pasted into a blank Visual Studio VB.Net console project. It is trivially easy to translate this into C#, but I like VB – what can I say.

Click through for the code. No F# translation from me, however, as I am lazy.
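If all you need is a quick look at queue depth from T-SQL rather than a console monitor, a couple of queries will do (the queue name below is hypothetical); Service Broker queues can be queried like tables, and sys.dm_broker_queue_monitors reports activation status:

-- Hypothetical queue name; a Service Broker queue can be SELECTed from directly.
SELECT COUNT(*) AS messages_waiting
FROM dbo.EventNotificationQueue WITH (NOLOCK);

-- Activation state and waiting tasks for monitored queues in the current database.
SELECT q.name, qm.state, qm.tasks_waiting
FROM sys.dm_broker_queue_monitors AS qm
JOIN sys.service_queues AS q
    ON qm.queue_id = q.object_id
WHERE qm.database_id = DB_ID();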

Resource Limitations with Azure Data Factory

Paul Andrew has a public service announcement for us:

As far as I can tell Microsoft do an excellent job at managing data centre capacity so I completely understand the reason for having limitations on resources in place. There is no such thing as a limitless cloud platform.

Note: in a lot of cases (as you’ll see in the table below for Data Factory) the MAX limitations are only soft restrictions that can easily be lifted via a support ticket. Please check before raising alerts and project risks.

Click through for the limits, and “contact support” definitely is good advice if you’re expecting to push past those limits.

Data Lake File Formats and Security

Ashish Kumar and Jorge Villamariona continue a series on data lakes:

People from a traditional RDBMS background are often surprised at the extraordinary amount of control that data lake architects have over how datasets can be stored. Data Lake Architects, as opposed to Relational Database Administrators, get to determine an array of elements such as file sizes, type of storage (row vs. columnar), degrees of compression, indexing, schemas, and block sizes. These are related to the big data-oriented ecosystem of formats commonly used for storing and accessing information in a data lake.

It is a bit of a different world and it comes with trade-offs. The whole thing is worth reading.

Tracing a Session with Extended Events

Jason Brimhall shows how you can trace a specific session using Extended Events:

The ability to quickly and easily trace a query is important to database professionals. This script provides one useful alternative to trace a specific spid similar to the method of using the context menu to create the trace within SSMS and Profiler.

This is yet another tool in the ever-popular and constantly growing library of Extended Events. Are you still stuck on Profiler? Try one of these articles to help remedy that problem (here and here).

Read on to see how.
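As a rough sketch of the technique (the session name and the session_id of 53 are made up for illustration), an event session filtered to a single spid looks something like this:

-- Hypothetical session name and spid; change the session_id predicate to the spid you want to trace.
CREATE EVENT SESSION [TraceSpid53] ON SERVER
ADD EVENT sqlserver.sql_batch_completed
    (WHERE (sqlserver.session_id = 53)),
ADD EVENT sqlserver.rpc_completed
    (WHERE (sqlserver.session_id = 53))
ADD TARGET package0.event_file (SET filename = N'TraceSpid53');
GO

ALTER EVENT SESSION [TraceSpid53] ON SERVER STATE = START;

This is just the skeleton; Jason's post has the full script.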

Parameterizing a Data Factory Linked Service to a REST API

Meagan Longoria had to parameterize a linked service connecting to a REST API recently:

In order to pass dynamic values to a linked service, we need to parameterize the linked service, the dataset, and the activity.

I have a pipeline where I log the pipeline start to a database with a stored procedure, lookup a username in Key Vault, copy data from a REST API to data lake storage, and log the end of the pipeline with a stored procedure. My username and password are stored in separate secrets in Key Vault, so I had to do a lookup with a web activity to get the username. The password is retrieved using Key Vault inside the linked service. Data Factory doesn’t currently support retrieving the username from Key Vault so I had to roll my own Key Vault lookup there.

Click through for the instructions.

Upgrading a SQL Server Big Data Cluster

Mohammad Darab shows how to upgrade an existing Big Data Cluster:

The above scenario was updating a Big Data Cluster from a supported release. Microsoft officially supports BDCs starting from SQL Server 2019 GDR1. But what if you have a previous version of BDCs, say a CTP or release candidate? In that case, you’ll have to back up any data you have, delete your cluster, uninstall azdata, install the updated azdata, and deploy your Big Data Cluster anew. A little cumbersome, but that’s how it is. In fact, no one should be running an unsupported release of Big Data Clusters anyway!

Click through for the instructions.

Moving Data Around in Azure Synapse Analytics

Niko Neugebauer looks at some techniques for copying data into a table in an Azure Synapse Analytics SQL Pool:

First of all, let us list some of them (and I am not even attempting to provide all of them, of course):
– INSERT INTO … SELECT FROM … (the most well-known one)
– SELECT INTO … FROM … (the most well-known to perform well, since it will create a HEAP while copying most of the properties from the original table(s))
– CREATE TABLE … AS SELECT … (the old way, which must be like 10 years old on PDW/APS & Azure SQL DW, but that has never gotten into a Box Product or Azure SQL Database)
– Polybase (that will use the External Tables & externally allocated data to transfer into Azure SQL DW)
– BCP (good old tested friend that will give you a pain in the neck until you dominate it)
– OPENROWSET / BULK INSERT (some very good and very old friends with complicated histories – who remembers all the code pages? – settings, and an uncertain future, mostly because of their original restrictions, I guess)
– COPY INTO … (the brand-new command that will allow you, under very neat privileges, to copy data from external storage accounts, much like BULK INSERT)

In this blog post I will simply focus on those features that have not been ported (hopefully just yet): CTAS & COPY INTO.

Read on to see how these two work. Also, I too have wanted CTAS in on-premises SQL Server for years.
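For a rough feel of the two before you click over (table names, the distribution choice, and the storage path below are all invented), CTAS creates and loads a table in a single statement, while COPY INTO pulls files straight from external storage into an existing table:

-- CTAS: create and populate a distributed table in one shot (names and options are hypothetical).
CREATE TABLE dbo.FactSale_Copy
WITH (DISTRIBUTION = HASH(CustomerKey), CLUSTERED COLUMNSTORE INDEX)
AS
SELECT *
FROM dbo.FactSale;

-- COPY INTO: load CSV files from a storage account into an existing table.
COPY INTO dbo.FactSale
FROM 'https://mystorageaccount.blob.core.windows.net/sales/2020/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW = 2,
    CREDENTIAL = (IDENTITY = 'Managed Identity')
);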

Cloudera and R

Ian Cook shows us how Cloudera works for R users:

Our customers love it when they can use familiar syntax to work with data regardless of its size or its source. The popularity of sparklyr is a case in point: it enables R users to use either SQL or dplyr—both familiar to most R users—to work with large-scale data using Apache Spark. Two R packages developed at Cloudera—implyr and tidyquery—aim to provide this same choice of either SQL or dplyr when querying tables with Apache Impala and when manipulating R data frames.

The implyr package is new to me, but looks interesting.
