TMSCHEMA DMVs

Meagan Longoria wants Azure Analysis Services documentation:

It would be great to get the DMVs documented similar to the MDSCHEMA DMVs as they are quite useful for tasks like documenting your tabular model.  Since the TMSCHEMA DMVs work in Azure Analysis Services as well, I have logged this request on the Azure AS User Voice for that. Please lend me a vote so we can make this information more easily available.

Please vote on this.
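As a rough sketch of the documentation use case Meagan describes, queries like the following (run one at a time) work against a compatibility level 1200 model from an MDX or DAX query window in SSMS. The DMV names are standard; the column lists below follow the Tabular object model and may vary slightly by version.

    -- Tables in the deployed model
    SELECT [Name], [Description], [IsHidden]
    FROM $SYSTEM.TMSCHEMA_TABLES

    -- Measures and their definitions; TableID has to be matched back to
    -- $SYSTEM.TMSCHEMA_TABLES manually, since DMV queries do not support joins
    SELECT [Name], [TableID], [Expression], [Description]
    FROM $SYSTEM.TMSCHEMA_MEASURES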

Capturing SSAS Query Activity

Bill Anton explains why and how he captures query activity by user in SSAS:

In most environments, it is trivial to obtain the name of the user who ran each query… all you have to do is capture the [QueryEnd] event in a profiler/xevent trace and pull the information from the [NTUserName] field. However, in environments involving Power BI and the Enterprise On-Premise Data Gateway, there’s a bit more to it.

The main issue is how authentication is handled in this type of architecture. When working with Power BI reports connected to an on-premise data source via the On-Premise Data Gateway, the account of the user running the report is passed as the “EffectiveUsername”. The implication here is that the value shown in the [NTUserName] field of the xevent/profiler trace is going to be the Data Gateway account – NOT the account of the user who actually generated the activity.

Read on for the full answer.
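For a rough sense of the mechanism Bill describes: the gateway authenticates to SSAS under its own service account and passes the report user along via the EffectiveUserName connection string property, along the lines of this illustrative (made-up) example:

    Provider=MSOLAP;Data Source=MySSASServer;Initial Catalog=SalesModel;EffectiveUserName=CONTOSO\report.user

SSAS authorizes the queries as the EffectiveUserName account, but the [NTUserName] field in the trace still reflects the authenticated gateway account, which is why the real user has to be recovered another way.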

Details On Azure SSAS

James Serra breaks down what Azure Analysis Services has to offer:

  • Developers can use SQL Server Data Tools (SSDT) in Visual Studio for creating models and deploying them to the service.  Administrators can manage the models using SQL Server Management Studio (SSMS) and investigate issues using SQL Server Profiler

  • Business users can consume the models in any major BI tool.  Supported Microsoft tools include Power BI, Excel, and SQL Server Reporting Services.  Other MDX compliant BI tools can also be used, after downloading and installing the latest drivers

  • The service currently supports tabular models (compatibility level 1200 only).  Support for multidimensional models will be considered for a future release, based on customer demand

Between tabular-only support and the max size being 100 GB (if I’m reading this correctly), they’re not yet ready to push the product hard.  Given that it just came out, that makes sense, and hopefully the training wheels come off.

Cached Azure Analysis Services Logins

Chris Webb shows how to log into Azure Analysis Services from Management Studio as a different user:

When Azure Analysis Services was announced I had to try it out right away. Of course I didn’t read the instructions properly so when I tried to log in to my Azure Analysis Services instance from SQL Server Management Studio, like an idiot I logged in with the wrong username. The problem is that once you’ve done this, with current versions of SQL Server Management Studio there’s no way of logging out and logging in as a different user. Luckily Igor Uzhviev of Microsoft had a solution for me and I thought I’d share it for anyone else who’s made the same mistake. Here’s what you need to do:

This seems a bit much, but should just be a temporary workaround.

Semantic Layers

Melissa Coates explains the relevance of Analysis Services as a semantic layer:

Part 1: Why a Semantic Layer Like Azure Analysis Services is Relevant {you are here}

Part 2: Where Azure Analysis Services Fits Into BI & Analytics Architecture {coming soon}

Fundamentally, Analysis Services serves as a semantic layer (see below for further discussion of a semantic layer). Because the business intelligence industry now embraces an array of technology choices, sometimes it seems like a semantic layer is no longer valued like it once was. Well, my opinion is that for many businesses, a semantic layer is tremendously important to support the majority of business users who do *not* want to do their own data wrangling, data prep, and data modeling activities.

We (I) spend so much time thinking about the Brave New World of massive blobs of semi-structured data that it’s a good idea to step back every once in a while and remember that yes, there is a need for sanitized, easy-to-consume data which answers known business questions.  The percentage of people at a company willing to create an R or Python notebook or run a MapReduce job is typically well under 5%.

Testing Analysis Services Cubes

Jens Vestergaard shows how to test Analysis Services cubes using a Visual Studio test project:

Unit testing in Visual Studio is actually not that hard and can save you a lot of pain down the road. The testing framework in Visual Studio offers extensive ways of executing batches of tests. You can group tests by Class, Duration, Outcome, Trait or Project.

When you right-click a test, you get the option to select how you want the tests in the Test Explorer to be grouped.

If you have an Analysis Services cube, definitely read this—testing is a vital part of software development, and automating tests can save you significant time later.

Analysis Services In Azure

Chris Webb looks at SSAS in Azure:

Support for multidimensional models will be considered for a future release, based on customer demand.

I’m pretty sure there will be plenty of demand for Multidimensional support given the installed base that’s out there.

I hope so.  Lack of multidimensional support isn’t a deal-killer, but it’s a deal-harmer.

Nested Variables In DAX

Chris Webb shows how to create nested variables inside DAX:

There aren’t any performance benefits to doing this, although of course it helps with code readability and organisation (thanks to Marius for confirming this).

Even so, click through to see an example of how to do this.
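For a picture of what nesting looks like, a variable's expression can itself contain a VAR/RETURN block. This is just an illustrative measure with made-up table and column names, not Chris's example:

    Sales Growth % :=
    VAR CurrentSales =
        SUM ( Sales[Amount] )
    VAR PriorSales =
        -- a variable nested inside another variable's expression
        VAR PriorPeriod = SAMEPERIODLASTYEAR ( 'Date'[Date] )
        RETURN
            CALCULATE ( SUM ( Sales[Amount] ), PriorPeriod )
    RETURN
        DIVIDE ( CurrentSales - PriorSales, PriorSales )

The inner variable is only visible inside the outer variable's expression, so the nesting is purely about organization, which matches Chris's point that there is no performance difference.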

Processing 2016 Tabular From SSIS 2014

Meagan Longoria shows how to process a Tabular Model with a compatibility level of 1200 in SQL Server Integration Services 2014:

Attempting to use the AS Processing Task results in the following error: “[Analysis Services Execute DDL Task] Error: This command cannot be executed on database ‘MySSASDB’ because it has been defined with StorageEngineUsed set to TabularMetadata. For databases in this mode, you must use Tabular APIs to administer the database”

The reason for keeping SSAS processing in an SSIS package was that it kept consistent logging throughout their data refresh process. So we set out to find another solution.

Read on for the explanation and the solution.
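For reference, the "Tabular APIs" named in that error are TMSL (the JSON-based scripting language for 1200-level models) and the TOM .NET library. A full-database refresh in TMSL looks roughly like this, reusing the database name from the error message; this is background on the terminology rather than necessarily the exact approach Meagan lands on:

    {
      "refresh": {
        "type": "full",
        "objects": [
          {
            "database": "MySSASDB"
          }
        ]
      }
    }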

DAX Variables

Chris Webb shows how to define variables in DAX:

Variables are the best thing to happen to DAX since, well, forever – they are so cool I’m almost ready to like DAX as much as I like MDX. There are already several good articles and blog posts out there describing how to use them (see here and here), but I was looking at a Profiler trace the other day and saw something I hadn’t yet realised about them: you can declare and use variables in the DEFINE clause of a DAX query. Since my series of posts on DAX queries still gets a fair amount of traffic, I thought it would be worth writing a brief post showing how this works.

There are some limitations, but Chris shows a way of getting around one of them.
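The DEFINE-clause trick Chris describes looks roughly like this; the table and column names are made up for illustration:

    DEFINE
        -- a query-scoped variable, usable in the EVALUATE statement below
        VAR MinAmount = 100
    EVALUATE
        FILTER ( Sales, Sales[Amount] >= MinAmount )

Running that in an SSMS DAX query window against a tabular model returns the filtered rows.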
