Cached Azure Analysis Services Logins

Chris Webb shows how to log into Azure Analysis Services from Management Studio as a different user:

When Azure Analysis Services was announced I had to try it out right away. Of course I didn’t read the instructions properly so when I tried to log in to my Azure Analysis Services instance from SQL Server Management Studio, like an idiot I logged in with the wrong username. The problem is that once you’ve done this, with current versions of SQL Server Management Studio there’s no way of logging out and logging in as a different user. Luckily Igor Uzhviev of Microsoft had a solution for me and I thought I’d share it for anyone else who’s made the same mistake. Here’s what you need to do:

This seems a bit much, but should just be a temporary workaround.

Semantic Layers

Melissa Coates explains the relevance of Analysis Services as a semantic layer:

Part 1: Why a Semantic Layer Like Azure Analysis Services is Relevant {you are here}

Part 2: Where Azure Analysis Services Fits Into BI & Analytics Architecture {coming soon}

Fundamentally, Analysis Services serves as a semantic layer (see below for further discussion of a semantic layer). Because the business intelligence industry now embraces an array of technology choices, sometimes it seems like a semantic layer is no longer valued like it once was. Well, my opinion is that for many businesses, a semantic layer is tremendously important to support the majority of business users who do *not* want to do their own data wrangling, data prep, and data modeling activities.

We (I) spend so much time thinking about the Brave New World of massive blobs of semi-structured data that it’s a good idea to step back every once in a while and remember that yes, there is a need for sanitized, easy-to-consume data which answers known business questions.  The percentage of people at a company willing to create an R or Python notebook or run a MapReduce job is typically well under 5%.

Testing Analysis Services Cubes

Jens Vestergaard shows how to test Analysis Services cubes using a Visual Studio test project:

Unit testing in Visual Studio is actually not that hard and can save you a lot of pain down the road. The testing framework in Visual Studio offers extensive ways of executing batches of tests. You can group tests by Class, Duration, Outcome, Trait or Project.

When you right-click a test, you get the option to select how you want the tests in the Test Explorer to be grouped.

If you have an Analysis Services cube, definitely read this—testing is a vital part of software development, and automating tests can save you significant time later.

Analysis Services In Azure

Chris Webb looks at SSAS in Azure:

Support for multidimensional models will be considered for a future release, based on customer demand.

I’m pretty sure there will be plenty of demand for Multidimensional support given the installed base that’s out there.

I hope so. Lack of Multidimensional support isn’t a deal-killer, but it’s a deal-harmer.

Nested Variables In DAX

Chris Webb shows how to create nested variables inside DAX:

There aren’t any performance benefits to doing this, although of course it helps with code readability and organisation (thanks to Marius for confirming this).

Even so, click through to see an example of how to do this.
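For a quick feel for the syntax (Chris’s post has the fuller worked example), here’s a minimal sketch: the expression assigned to an outer variable can open its own VAR … RETURN block, and the inner variable is scoped to that expression alone. The values here are arbitrary placeholders.

    EVALUATE
    ROW (
        "Result",
        // Outer variable whose expression contains its own nested VAR/RETURN block
        VAR OuterValue =
            VAR InnerValue = 10    // inner variable, visible only inside this expression
            RETURN InnerValue * 2
        RETURN OuterValue + 5      // returns 25
    )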

Processing 2016 Tabular From SSIS 2014

Meagan Longoria shows how to process a Tabular Model with a compatibility level of 1200 in SQL Server Integration Services 2014:

Attempting to use the AS Processing Task results in the following error: “[Analysis Services Execute DDL Task] Error: This command cannot be executed on database ‘MySSASDB’ because it has been defined with StorageEngineUsed set to TabularMetadata. For databases in this mode, you must use Tabular APIs to administer the database”

The reason for keeping SSAS processing in an SSIS package was that it kept consistent logging throughout their data refresh process. So we set out to find another solution.

Read on for the explanation and the solution.

DAX Variables

Chris Webb shows how to define variables in DAX:

Variables are the best thing to happen to DAX since, well, forever – they are so cool I’m almost ready to like DAX as much as I like MDX. There are already several good articles and blog posts out there describing how to use them (see here and here), but I was looking at a Profiler trace the other day and saw something I hadn’t yet realised about them: you can declare and use variables in the DEFINE clause of a DAX query. Since my series of posts on DAX queries still gets a fair amount of traffic, I thought it would be worth writing a brief post showing how this works.

There are some limitations, but Chris shows a way of getting around one of them.
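As a rough illustration of the DEFINE syntax (the table name is a placeholder, not taken from Chris’s post), a query-scoped variable is declared once and can then be referenced in the EVALUATE statement:

    DEFINE
        // Query-scoped variable, evaluated once for the whole query
        VAR TotalSalesRows = COUNTROWS ( FactInternetSales )
    EVALUATE
    ROW ( "Sales Row Count", TotalSalesRows )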

Power BI Row-Level Security With External Users

Patrick LeBlanc shows how to implement row-level security within Power BI for people without direct access to an underlying Analysis Services cube:

Before I explain how to fix this, let’s take a look at what’s happening behind the scenes.

  1. When the user opens the dashboard, a connection string is created including the EffectiveUserName property, which is expected behavior.

  2. The value specified for this property is the UPN (email address) of the user viewing the report.

  3. The connection string, including the queries, is sent via the on-premises gateway to the SSAS server that hosts the data needed to view the report.

  4. Once the connection is established, using the username and password specified in the Data Source settings, all queries are executed.

Read on for the solution.
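To make the mechanism a bit more concrete, here’s a minimal sketch of the sort of dynamic row filter that makes EffectiveUserName matter in the first place. The table and column names are hypothetical, and this isn’t Patrick’s fix, just an illustration of why the value passed through the gateway has to line up with what the model expects.

    // A generic dynamic row filter on a hypothetical DimEmployee table in an SSAS role.
    // USERNAME() returns the identity passed in via the EffectiveUserName property,
    // so only rows matching that user's email address survive the filter.
    DimEmployee[EmailAddress] = USERNAME()

If the identity passed in doesn’t line up with anything in that column, the user sees no data (or gets an error), which is the kind of mismatch external users run into.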

Continuous Delivery With SSAS

Jens Vestergaard shows how to implement continuous delivery with Analysis Services cubes:

None of the above-mentioned scenarios appeals to Team Foundation Server (TFS), and in order to get into the no-sweat zone during release time, we need to build our deployments around TFS, the obvious choice when working with Microsoft.

Natively, Visual Studio, or more precisely MSBuild, does not support .dwproj files, which are used for Analysis Services (SSAS) projects. So obviously this has to involve some kind of magic. But as it turns out, it’s not all that magic. However, there is not much documentation on this particular scenario out there, but I managed to find one good resource, which is this. It gave me just enough assistance to complete the task.

This is a long post, but well worth reading.

Navigating Complex Tabular Models

Bill Anton has a method for understanding large tabular models on legacy platforms:

Unfortunately, the Tabular Model Explorer is only available for 2016 (compatibility 1200) tabular models – which many folks haven’t moved over to just yet (despite the overwhelming list of reasons why SQL 2016 is one of the best releases in a very long time).

Those of us stuck with 2012/2014 environments have no other option than to comb through the diagram view for that one table we’re looking for…or scan the unordered list of tables/columns in grid view, or arrow-key through a bunch of cells in the calculation pane to find a particular measure… or so I thought up until a few weeks ago when I discovered a better way!

At least 60-70% of the DBA population would chafe at the idea that SQL Server 2014 is a “legacy” platform, so maybe I’m overstepping a little by calling 2014 legacy. But seriously, 2016 is a huge improvement, well worth the jump.

