This past weekend, after one of my presentations on Analysis Services at SQL Saturday 520, an attendee described a performance issue in his environment that he thought could be related to locking and blocking, and asked if there was a way to confirm whether this was the cause. The short answer is yes, absolutely!
While I wouldn’t say this is a common issue across *all* Analysis Services environments, it’s definitely common amongst the subset of Analysis Services environments that do intra-day processing – which is when the SSAS database is being processed during hours of the day when users might be simultaneously querying it.
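One quick way to check for locking and blocking on an SSAS instance is the locks DMV, which you can query from an MDX window in SSMS. A minimal sketch (column semantics per the DMV documentation – verify against your version):

```sql
-- Run in an MDX query window connected to the SSAS instance.
-- Returns one row per lock; a LOCK_STATUS of 0 indicates a lock that has been
-- requested but not yet granted, i.e. a blocked request.
SELECT *
FROM $SYSTEM.DISCOVER_LOCKS
```

Cross-referencing the SPIDs returned here with `$SYSTEM.DISCOVER_SESSIONS` lets you see which user queries are waiting behind processing operations.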
This is a very interesting article. I’d never thought about SSAS taking locks, but it makes perfect sense.
The small font size may be difficult to read for some people. The easiest and fastest way to change the font size is to click somewhere in the formula bar and use your mouse scroll wheel while holding down the Ctrl key on your keyboard. However, the font size will go back to the default size the next time you open SSDT.
If you wish to change the font size permanently, you can use the Tools menu. Click on Tools –> Options –> Environment –> Fonts and Colors and choose “Business Intelligence Designers” from the drop-down menu. Here you can change the font size. Your changes will not take effect until you restart SSDT.
Unfortunately, the surrounding chrome doesn’t change size, so there are hard limits on how far you can scale up.
Now, imagine that you want a report with the Sales Amount measure on columns and Years on rows, and you want to filter the data so that you only see values for Mondays in July or Wednesdays in September. Using the Fields, Items and Sets functionality you could filter the data to only show the day/month combinations you need for each year, but since you can’t put a named set into the Filter area of a PivotTable, you would have to use Excel formulas to sum up the combinations to get the totals you need.
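The Excel-formula approach described above would lean on the cube functions, which query the model directly from a cell. A hedged sketch of what summing two day/month combinations might look like – the connection name and hierarchy names here are hypothetical and would need to match your own model:

```
=CUBEVALUE("MyCubeConnection", "[Measures].[Sales Amount]",
           "[Date].[Day Name].&[Monday]",    "[Date].[Month].&[July]")
+CUBEVALUE("MyCubeConnection", "[Measures].[Sales Amount]",
           "[Date].[Day Name].&[Wednesday]", "[Date].[Month].&[September]")
```

Each `CUBEVALUE` call evaluates the measure at the intersection of the member expressions you pass in, so adding the calls together gives you the total across the combinations a PivotTable filter can’t express.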
I’ve never used Excel’s Cube Functions. Looks like something to add to the to-learn pile.
The fiscal calendar assumes calendar months with the fiscal calendar shifted by some number of months. It also assumes that the fiscal year is ahead of the calendar year. That is, fiscal year 2017 starts on the first day of some month in 2016. In my example script, the fiscal year starts in October. If you have some 4-4-5, 4-5-4, or other calendar, the fiscal calendar calculations in this script won’t help you.
To add this date dimension to your SSAS Tabular project:
Right click on the Model.bim file in the Solution Explorer and choose View Code.
If you have at least one table in the model already, locate the end of the definition of the previous table, inside the table array. Add a comma after the closing curly brace for the previous table.
Paste the JSON/TMSL from my Gist.
Save and close the file.
Right click on Model.bim and choose View Designer.
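Steps 2 and 3 above amount to appending one more table object to the `tables` array inside the model’s JSON. An abridged sketch of where the pasted content lands (table names here are placeholders, and the full table definitions from the Gist are elided):

```json
{
  "compatibilityLevel": 1200,
  "model": {
    "tables": [
      { "name": "ExistingTable" },
      { "name": "DateDimension" }
    ]
  }
}
```

The comma you add in step 2 is the one separating the existing table’s closing brace from the new table object you paste in step 3.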
Click through for the script and additional explanation.
One of the more time-consuming pieces in this process is step 4 – setting up a solution that can generate a concurrent query workload. Bob Duffy (b | t) has written blog posts and delivered presentations on the topic of Analysis Services load testing. However, when it comes to tooling (unless something changed recently) I believe he’s still using a custom .NET Visual Studio Test solution for the test harness. And unless you know .NET, you’re going to have a pretty difficult time getting set up… which is why I was so happy when, earlier this year, Chris Schmidt (b | t) wrote a post over on MSDN demonstrating a method for load testing an Analysis Services database using PowerShell.
This weekend I finally had some time to expand upon Chris’ code sample and add several new features… e.g. parameters (target server/database, level of concurrency, queries per batch, etc) and the ability to retrieve MDX (or DAX) queries from a SQL table. In my experience, it’s quite a bit easier to generate a table of MDX/DAX queries rather than a bunch of txt files with a query in each file.
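The core of a harness like this is fanning a batch of queries out across concurrent workers. The linked solution is PowerShell; the sketch below shows the same shape in Python, with `execute_query` as a stub standing in for a real ADOMD.NET call and the query list standing in for the SQL table of MDX/DAX queries:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def execute_query(query: str) -> str:
    """Stub for a real call against the target SSAS server/database."""
    return f"executed: {query}"

def run_load_test(queries, concurrency=4, queries_per_batch=10):
    """Fire queries_per_batch randomly chosen queries across `concurrency` workers.

    Mirrors the parameters described above: level of concurrency,
    queries per batch, and a pool of queries pulled from a table."""
    batch = [random.choice(queries) for _ in range(queries_per_batch)]
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(execute_query, batch))

results = run_load_test(["SELECT ... FROM [Cube1]", "EVALUATE ..."],
                        concurrency=2, queries_per_batch=5)
print(len(results))  # 5
```

In the real PowerShell version, `execute_query` would open an ADOMD connection and time the query, and the batch would be read from the MDX/DAX query table rather than an in-memory list.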
My first thought was “Well, that doesn’t seem too complicated.” Which means that Bill did a great job.
So how do we handle the scenario where the server is rebooted?
- Option 1: always remember to restart the trace after server reboots
- Option 2: create a SQL Agent job to poll for the SSAS service status and start the xEvent trace if it’s not already running
- Option 3: write a custom .NET watchdog service to poll for the SSAS service status and start the xEvents trace if it’s not already running
Those are the options I’ve used or seen used in the past… and to be sure, all of them have their drawbacks in reliability and/or complexity.
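The polling logic behind options 2 and 3 is the same either way. A minimal sketch of the per-tick decision, with the service-status check and the xEvents-trace check both stubbed out as boolean inputs (in a real watchdog they’d come from a service query and a DMV query):

```python
def watchdog_tick(service_running: bool, trace_running: bool) -> str:
    """Decide what a polling watchdog should do on each tick."""
    if not service_running:
        return "wait"         # SSAS is down; nothing to restart yet
    if not trace_running:
        return "start-trace"  # service is up but the trace isn't: restart it
    return "ok"               # everything running; check again next tick

print(watchdog_tick(service_running=True, trace_running=False))  # start-trace
```

The drawback mentioned above lives in this loop: the watchdog itself is now a thing that can fail, which is exactly what makes a built-in option attractive.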
…which is why I was so excited when it was brought to my attention that there is an “AutoRestart” option for SSAS xEvents!
Do read the whole thing.
Recently someone got in touch with me about one of my earliest posts. I realised I had made a mistake in that post and had updated it. I also realised that the script itself is actually part of a PowerShell module. So, wanting to make things as easy as possible, and being the helpful guy that I am, I have altered that example into a script, which I am posting here.
It’s worth stressing that the errors that output to the console are fairly generic, and that there is a log file that will contain all the information if there are any errors.
Sadly, you still need to write/tweak the big glob of XML, it seems.
Last week I posted some of my perfmon graphs from an SSAS server. I want to model the work happening on an SSAS server during multidimensional cube processing – both dimension and fact processing.
SSAS Multidimensional Cube Processing – What Kind of Work is Happening?
There was a window at the end of the observation period where data was still flowing across the network, CPU utilization was still high enough to indicate activity, but the counters for rows read/converted/written/created per second were all zero. The index rows/sec counter was also zero.
Check it out.
OK. CPU utilization tracks pretty well with rows read/converted outside of the mystery range following 9:30. But now three additional interesting time periods are evident, in the red boxes. CPU utilization is much higher in the red boxes than would be predicted by rows read/converted. What’s the CPU doing at about 8:05, 8:45, and 9:30? (Not forgetting the original question about 9:40 to about 10:15.) Maybe there are other kinds of work I haven’t included in these graphs?
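The eyeball test of “tracks pretty well with” can be put on firmer footing by correlating the counter series over the same intervals. A sketch with made-up sample values standing in for perfmon samples – a high Pearson coefficient inside the well-behaved range versus a low one inside a red box would quantify the mismatch:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-interval samples: CPU % and rows read/sec (rescaled).
cpu       = [20, 45, 80, 85, 30, 25]
rows_read = [18, 50, 78, 90, 28, 22]

print(round(pearson(cpu, rows_read), 2))
```

With real data you’d export the perfmon log to CSV and run this per time window, which turns “the counters don’t explain the CPU after 9:30” into a number you can compare across windows.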
It’s interesting to see how various metrics tie together (or, as the case may be, don’t).
Some time ago I blogged about the deprecated and discontinued functionality in SSAS 2014, so I thought it would be a good idea to follow my last post on what’s new in SSAS 2016 Multidimensional with a discussion of what’s going or gone from it.
The same page that I linked to last time has been updated for 2016, and there are four more subpages with all the details. There’s nothing much interesting to say about the breaking changes (basically AMO has been rejigged) or behaviour changes (there’s no in-place upgrade for Tabular models using DirectQuery – you have to open the project and edit some settings) but the other two pages do have some news worthy of comment:
It looks like there are some potentially interesting features on the chopping block. My Analysis Services experience is extremely limited, so I’ve never used any of them, but it looks like the list might be worth checking out if you have production cubes.