Press "Enter" to skip to content

Month: November 2015

DAX Problem Solving

Matt Allington walks us through solving a problem using DAX:

Average of best 8 scores from last 20 rounds

So the requirement is to find the average of the best 8 scores from the last 20 rounds of golf for each player.  So if you think about that problem, there are quite a few layers to it – perfect for a blog on how to break a problem into pieces so you can solve it in DAX.

Even if you do nothing with DAX, read over the post because the problem-solving technique Allington uses is generally applicable.
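
If you'd like to see the shape of the calculation before digging into the DAX, here is a rough T-SQL sketch of the same requirement. The dbo.Rounds table and its columns are made up for illustration, and Allington's post solves this in DAX rather than T-SQL:

-- "Best" means lowest in golf, so rank each player's most recent 20 rounds
-- by score ascending and average the top 8.
WITH Last20 AS
(
    SELECT
        r.PlayerID,
        r.Score,
        ROW_NUMBER() OVER (PARTITION BY r.PlayerID ORDER BY r.RoundDate DESC) AS RoundRecency
    FROM dbo.Rounds AS r
), Best8 AS
(
    SELECT
        l.PlayerID,
        l.Score,
        ROW_NUMBER() OVER (PARTITION BY l.PlayerID ORDER BY l.Score ASC) AS ScoreRank
    FROM Last20 AS l
    WHERE l.RoundRecency <= 20
)
SELECT
    b.PlayerID,
    AVG(1.0 * b.Score) AS AvgOfBest8
FROM Best8 AS b
WHERE b.ScoreRank <= 8
GROUP BY b.PlayerID;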


Query Store Works With Recompile

Kendra Little shows us something awesome about Query Store and OPTION(RECOMPILE):

YAY! For all my queries that were run with RECOMPILE hints, I can see information about how many times they were run, execution stats, their query text and plan, and even information about compilation.

And yes, I have the execution plans, too — the “CAST(qsp.query_plan AS XML) AS mah_query_plan” totally works.

This is great news.  Query Store is going to be a big feature for DBAs.
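
For a sense of what that looks like, here is a minimal sketch of the kind of query you can run against the Query Store catalog views (this is not Kendra's exact script):

-- Pull query text, compile and execution stats, and the plan cast to XML
-- for anything whose text includes a RECOMPILE hint.
SELECT
    qsqt.query_sql_text,
    qsq.count_compiles,
    qsrs.count_executions,
    qsrs.avg_duration,
    CAST(qsp.query_plan AS XML) AS mah_query_plan
FROM sys.query_store_query AS qsq
JOIN sys.query_store_query_text AS qsqt
    ON qsqt.query_text_id = qsq.query_text_id
JOIN sys.query_store_plan AS qsp
    ON qsp.query_id = qsq.query_id
JOIN sys.query_store_runtime_stats AS qsrs
    ON qsrs.plan_id = qsp.plan_id
WHERE qsqt.query_sql_text LIKE N'%RECOMPILE%';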


Increased Delta Store Size For NCIs

Niko Neugebauer, in part 73 of his ongoing columnstore index series:

This result was observed right after the finish of the loading script, where we can clearly see 4 Delta-Stores for 10 Million Rows. 3 of the Delta-Stores are Closed and 1 Delta-Store is Open, which is an absolutely impossible combination if we think about Clustered Columnstore Indexes, where one would expect to have 10 Compressed Row Groups or 10 Delta-Stores (9 Closed & 1 Open).
If you take a more detailed look at the associated sizes of the closed Delta-Stores, you will see that they increase each time a new Delta-Store is being used. For example, the first one is capped at 1.048.567 Rows, the second one is capped at 2.097.152 and the last closed Delta-Store is set to 4.193.904 Rows – meaning that the size is being constantly doubled.

I’d like to see this as the first step toward expanded sizes for compressed rowgroups.
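
If you want to see the same behavior on your own tables, here is a quick sketch using sys.column_store_row_groups; dbo.FactTable is a placeholder name:

-- One row per row group: compressed row groups plus open/closed delta-stores,
-- with their row counts and sizes.
SELECT
    rg.row_group_id,
    rg.state_description,
    rg.delta_store_hobt_id,
    rg.total_rows,
    rg.size_in_bytes
FROM sys.column_store_row_groups AS rg
WHERE rg.object_id = OBJECT_ID(N'dbo.FactTable')
ORDER BY rg.row_group_id;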


Inline Non-Clustered Indexes

If you’re using SQL Server 2014, you get the benefit of writing inline non-clustered indexes.  Denny Cherry has more:

As for the syntax it’s pretty straight forward. Below is a sample table with a couple of indexes, one created on the column c2 and one created on C1 and C2. Now sadly include columns aren’t supported with inline indexes, but hopefully that’ll show up in a future version of SQL Server.

This was added for In-Memory OLTP support, and I like it.  For more on Denny’s comment about tempdb performance, check out a slide deck Eddie Wuerch used to teach people (including me) about temp table reuse.
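
Here is a quick sketch of the inline syntax, using a made-up table rather than Denny's example:

-- Two inline nonclustered indexes declared as part of CREATE TABLE
-- (SQL Server 2014 and later); INCLUDE columns are not supported here.
CREATE TABLE dbo.InlineIndexDemo
(
    c1 INT NOT NULL,
    c2 INT NOT NULL,
    c3 VARCHAR(50) NULL,
    CONSTRAINT PK_InlineIndexDemo PRIMARY KEY CLUSTERED (c1),
    INDEX IX_InlineIndexDemo_c2 NONCLUSTERED (c2),
    INDEX IX_InlineIndexDemo_c1_c2 NONCLUSTERED (c1, c2)
);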


Parameter Tables In Power BI

Chris Webb on using parameter tables in Power BI:

And there you have it: a parameter table in PowerBI.com. To be honest, I think there are slightly too many fiddly steps for users to follow in this technique for me to be happy recommending its use unconditionally, but it should be useful in some scenarios. Hopefully there will be an easier way of accomplishing the same thing in Power BI in future…

Sounds like it’s not as easy to do as in Power Query, but Chris does provide nice step-by-step instructions.


3D Maps In Power BI

Reza Rad gives us an introduction to 3D visualization using Power BI Desktop:

Power BI team did a great step forward with adding 3D map visual in Power BI Desktop. Thank you Microsoft Power BI team because of that! It is really useful for some scenarios that users need to see visualization on 3D map. However this feature is far behind Power Map features for story telling, creating tours, play axis, and many other features. I believe that soon many of these features will be added into Power BI. So I say Power BI said hi to 3D maps, but please be quick on that Microsoft team because Power Map raised expectations of our clients to a very high level! They will be looking for same features (at least) in Power BI desktop.

Looks like there’s still some work yet to be done.


Build Your Own Statistics

Dan Holmes shows how to build custom statistics:

There is a performance benefit to imported stats. The cost to compute the stats are on an “offline” table. The only downtime for the production table is the duration of the stream import.

This process does use undocumented features and it looks like it could be dangerous, but remember there is an easy undo: the update statistics statement. If something goes wrong, the statistics can always be updated using standard T-SQL.

Scheduling this code to run regularly can greatly help the optimizer produce better plans given a data set that changes over the tipping point but not enough to trigger a statistics update.

This feels like the type of thing you want to know because it’ll come in handy once, but if you feel the need to use it frequently, that may not be the best choice.
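
For reference, that undo is nothing more than the documented UPDATE STATISTICS statement; here is a quick sketch against a made-up table and statistics object:

-- Rebuild one statistics object, or all of them, from the actual data.
UPDATE STATISTICS dbo.SalesOrders IX_SalesOrders_OrderDate WITH FULLSCAN;
UPDATE STATISTICS dbo.SalesOrders WITH FULLSCAN;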


Use The Alt Key

Andy Mallon talks Alt key:

Everyone knows you can use SHIFT + [Left/Right Arrow] to highlight text. But you can also use ALT + SHIFT + [Up/Down/Left/Right Arrow] to select a block of text or even make a vertical selection to insert a block of text on multiple lines. Or you can use ALT + [Mouse Drag] to make a block selection with your cursor.

Notepad++ works the same way.  Every once in a while, I’ll run into a tool that doesn’t implement Alt key functionality (especially certain non-Microsoft database products), and it hurts a little bit inside.


Error Handling

Continuing my “classics” series: Erland Sommarskog has a three-part series (with three appendices) on error handling that will take up your entire weekend:

Why do we have error handling in our code? There are many reasons. In a forms application we validate the user input and inform the users of their mistakes. These user mistakes are anticipated errors. But we also need to handle unanticipated errors. That is, errors that occur because we overlooked something when we wrote our code. A simple strategy is to abort execution or at least revert to a point where we know that we have full control. Whatever we do, simply ignoring an unanticipated error is something we should never permit us. This can have grave consequences and cause the application to present incorrect information to the user or even worse to persist incorrect data in the database. It is also important to communicate that an error has occurred, lest that the user thinks that the operation went fine, when your code in fact performed nothing at all.

Error handling is a crucial part of development.  And given that SQL Server has…peculiarities…when it comes to error handling, I highly recommend reading this series.
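
If you want a taste of the basic pattern before committing your weekend, here is a minimal sketch in the spirit of what the series covers; the series goes much deeper, and the procedure and table below are made up:

-- SET XACT_ABORT ON plus TRY/CATCH: roll back on failure and re-raise
-- the error so the caller knows something went wrong.
CREATE PROCEDURE dbo.AddOrder
    @CustomerID INT,
    @Amount     MONEY
AS
SET XACT_ABORT, NOCOUNT ON;
BEGIN TRY
    BEGIN TRANSACTION;

    INSERT INTO dbo.Orders (CustomerID, Amount)
    VALUES (@CustomerID, @Amount);

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    THROW;  -- re-raise the original error (SQL Server 2012 and later)
END CATCH;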


Videography Gear

Doug Lane has an outstanding post on the types of equipment you should look at if you’re getting into technical presentations:

Your choice(s) here will drive what equipment you need to buy in order to produce a high-quality video on a reasonable budget. You don’t want to overspend on lighting and camera equipment if you’re never going to be shown bigger than a 240×180 pocket in the corner. Likewise, you may not need a USB microphone if you’re going to shoot mostly studio video.

As someone thinking about getting into podcasting and webcasting, I found this to be a top-notch set of advice.
