Press "Enter" to skip to content

Month: February 2016

More On Widening Identity Columns

Aaron Bertrand has part 3 in his series on identity columns:

This post investigated two potential workarounds to either buy you time before changing your existing IDENTITY column, or abandoning IDENTITY altogether right now in favor of a SEQUENCE. If neither of these workarounds are acceptable to you, please watch for part 4, where we’ll tackle this problem head-on.

This is your weekly reminder to plan for appropriate data sizes.
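If you are building new tables today, here is a rough sketch of what planning ahead might look like (the table and sequence names are made up): start the surrogate key as BIGINT, either as an IDENTITY or backed by a SEQUENCE, and skip the painful widening later.

    -- Hypothetical table and sequence names; the point is choosing BIGINT up front.
    CREATE TABLE dbo.Orders
    (
        OrderID   BIGINT IDENTITY(1, 1) NOT NULL
            CONSTRAINT PK_Orders PRIMARY KEY,
        OrderDate DATETIME2(0) NOT NULL
    );

    -- Or back the key with a SEQUENCE instead of IDENTITY:
    CREATE SEQUENCE dbo.OrderSeq AS BIGINT START WITH 1 INCREMENT BY 1;

    CREATE TABLE dbo.OrdersWithSequence
    (
        OrderID   BIGINT NOT NULL
            CONSTRAINT DF_OrdersWithSequence_OrderID DEFAULT (NEXT VALUE FOR dbo.OrderSeq)
            CONSTRAINT PK_OrdersWithSequence PRIMARY KEY,
        OrderDate DATETIME2(0) NOT NULL
    );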


Getting Into Power BI

Jorge Segarra jumps into Power BI:

For my contribution to this contest I’ve decided to share with you a work in progress. If you know me, I’m a huge lover of Policy-Based Management. In fact, I’m actually part of the Enterprise Policy Management Framework (EPMF) project on Codeplex. T-SQL Tuesday event is normally a DBA-centric event so I figured I’d help the DBA crowd wrap their heads around how a BI solution can help them in their day to day.

What I did to kick start this effort was to create this Power BI report that allows you to explore the database repository that contains the EPMF policy evaluation results. The current EPMF project uses Reporting Services to deliver its reports. This won’t change. If anything I’ll be exploring new capabilities with SQL Server 2016 and R-integration. Here’s a screenshot of what the SSRS dashboard report looks like:

I like this post because most Power BI examples tend to be personal (Fitbit stats, etc.) or business-y.  This is a good example of using Power BI for back-office database administration.


T-SQL Medians

Daniel Hutmacher has a post showing how to calculate medians and percentiles in T-SQL:

Medians as a concept are simple enough. If you have a large number of values, like a range of statistical values, you want to pick the middle one. The median, as opposed to the average is useful for a number of reasons, one of them that you can reduce the effect of so-called outlier values.

The fact that SQL Server doesn’t have a fast, built-in median function surprises me, to be honest.  The best alternative I’ve found is a CLR function in SQL#.
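For comparison, here is the built-in route that Daniel’s alternatives are meant to beat: PERCENTILE_CONT as a window function (available since SQL Server 2012), shown against a made-up table.  It works, but it tends to be slow on larger sets.

    -- Hypothetical table and columns; DISTINCT collapses the per-row window output.
    SELECT DISTINCT
        GroupID,
        PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY Amount)
            OVER (PARTITION BY GroupID) AS MedianAmount
    FROM dbo.Sales;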


Logical Data Models

Kevin Kline discusses logical data modeling:

In a recent blog post entitled Is Logical Data Modeling Dead?, Karen Lopez (b | t) comments on the trends in the data modeling discipline and shares her own processes and preferences for logical data modeling (LDM). Her key point is that LDMs are on the decline primarily because they (and their creators) have failed to adapt to changing development processes and trends.

I love all things data modeling. I found data models to be a soothing and reassuring roadmap that underpinned the requirements analysis and spec writing of the Dev team, as well as a supremely informative artifact of the Dev process which I would constantly refer to when writing new T-SQL code and performing maintenance. However, as time has passed, I have been surprised by how far it has fallen out of favor.

This is an interesting discussion.  I’m not sure I’ve ever created a true logical data model.  I’ve worked with systems that could have taken advantage of one, but it never hit the top of the priority list.


Choosing Between Optimistic Concurrency Levels

Kendra Little has a cheat sheet for comparing the two optimistic concurrency levels:

SQL Server offers two flavors of optimistic locking for traditional disk-based tables: Read Committed Snapshot Isolation (RCSI), and Snapshot Isolation. They are each great tools to reduce blocking and make applications faster, particularly in databases that have lots of tiny reads and writes that need to be quick, or mixed-use patterns (lots of little reads and writes + larger reporting queries).

Both isolation levels are available in SQL Server Standard Edition and Web Edition as well as Enterprise. That’s right, they’re cheap and easy. Each of them are controlled by database level settings, and default to disabled when you install SQL Server.

Both of these are awesome, both have potential drawbacks, and both need testing.  I’ve had good experiences with RCSI, but even then, maybe 1% of procedures needed specific locking hints (either NOLOCK or an explicit lock) to maintain previous application behavior and to deal with the problem Kendra brings up.  The moral of the story:  test, test, test.
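As a quick reference (the database name below is a placeholder), both settings are flipped at the database level, and Snapshot Isolation additionally requires each session to opt in:

    -- Both settings default to OFF.
    ALTER DATABASE YourDatabase SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;
    ALTER DATABASE YourDatabase SET ALLOW_SNAPSHOT_ISOLATION ON;

    -- RCSI changes the default behavior of read committed; Snapshot Isolation
    -- only kicks in when a session explicitly asks for it:
    SET TRANSACTION ISOLATION LEVEL SNAPSHOT;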


Power BI With SSAS

Jens Vestergaard’s T-SQL Tuesday entry involves Power BI feeding from Analysis Services:

My story with this half-baked product (the Dashboard you are about to see), is that I needed some way of tracking performance on a couple of Analysis Services (SSAS) query servers. There are a lot of good posts and talks about how to collect and store performance counters and SSAS logs out there, and I suggest you look into this, this or that, if you need inspiration.

The current data set is about 200K rows, as I am sampling each server every 5th minute.

Both of these are valuable tools in a Microsoft BI environment.


Altering Columns

Kenneth Fisher points out that there are defaults when altering columns:

So here is the thing. When you change one you change them all. That means if you don’t specify a precision when you can then you get the default. That’s not exactly a common problem though. Usually what you are changing is the precision (or possibly the datatype). What is a common mistake is not specifying the nullability.

When writing DDL to alter a column, make sure the definition you specify is consistent and complete.
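Here is a small sketch of the trap, using a made-up table: widening the precision without restating NOT NULL leaves you with a nullable column.

    -- Hypothetical table: the column starts life as DECIMAL(10, 2) NOT NULL.
    CREATE TABLE dbo.AlterDemo (Amount DECIMAL(10, 2) NOT NULL);

    -- Widening the precision but omitting NOT NULL...
    ALTER TABLE dbo.AlterDemo ALTER COLUMN Amount DECIMAL(18, 2);
    -- ...quietly makes the column nullable (under the default ANSI null settings).

    -- State the full, intended definition instead:
    ALTER TABLE dbo.AlterDemo ALTER COLUMN Amount DECIMAL(18, 2) NOT NULL;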


Concurrency Simulator

Michael J. Swart has come up with a concurrency simulator:

Capacity planning is difficult for DBAs who expect growth. Will there be enough CPU, Memory or I/O to serve the anticipated load? One category falls outside those three, logical contention.

Logical contention is a problem where excessive blocking causes throughput to suffer. It would be great to get advanced warning. One essential strategy is to make use of the blocked process report. The problem is that blocked process reports are an alarm metric, not a gauge metric. In other words, the blocked process report can indicate when there is a problem, but it is poor at giving advanced notice.

This is a nice visual tool to begin to understand the topic.  The same principles apply to road traffic, water flow, etc.
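If you do want the blocked process report in play, this is roughly how to switch it on (the five-second threshold is just an example; the reports themselves are then captured with an Extended Events session or a server-side trace):

    -- 'blocked process threshold (s)' is an advanced option, so expose it first.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'blocked process threshold (s)', 5;
    RECONFIGURE;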


Conditional Processing

Bill Fellows shows options for handling date-based conditional processing:

Do you see the problem? Really, there are two but the one I’m focused on is the use of GETDATE to determine which branch of logic is executed. Today is Monday and I need to test the logic that runs on Friday. Yes, I can run these steps in isolation and given that I’m not updating the logic that fiddles with the branches, my change shouldn’t have an adverse effect but by golly, that sucks from a testing perspective. It’s also really hard to develop unit tests when your input data is server date. What are you going to do, allocate 5 to 7 days for testing or change the server clock? I believe the answer is No and OH HELL NAH!

This isn’t just an SSIS thing, either. I’ve seen the above logic in TSQL as well. If you pin your logic to getdate/current_timestamp calls, then your testing is going to be painful.

I liken this to solving dependency injection problems in general:  make the caller define the date or date part.  That way, your test callers can define other dates and the “smarts” around which branch to take move up to a more swappable layer.
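Here is a rough sketch of that idea in T-SQL, with hypothetical procedure and branch names: the caller owns the date, and GETDATE() is only a fallback when nothing is passed in.

    CREATE PROCEDURE dbo.RunDailyLoad
        @RunDate DATE = NULL
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Fall back to today only when the caller supplies nothing.
        SET @RunDate = ISNULL(@RunDate, CAST(GETDATE() AS DATE));

        IF DATENAME(WEEKDAY, @RunDate) = N'Friday'
            EXEC dbo.RunFridayBranch @RunDate = @RunDate;   -- hypothetical Friday-only logic
        ELSE
            EXEC dbo.RunStandardBranch @RunDate = @RunDate; -- hypothetical everyday logic
    END;

    -- A test can now exercise the Friday branch on a Monday:
    -- EXEC dbo.RunDailyLoad @RunDate = '2016-02-05';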


Getting Running SQL Agent Jobs

Andy Mallon introduces us to xp_sqlagent_enum_jobs, tells us not to use it, and gives us an alternative:

If you landed on this post, I’m guessing you know the answer to this, so I’ll be quick. xp_sqlagent_enum_jobs is an undocumented (and thus, unsupported) extended procedure that gives information on whether a job is currently running, when it last ran, when it runs next, etc.

The supported alternative is to use sp_help_job (which calls xp_sqlagent_enum_jobs), but sp_help_job is difficult to use in automation because trying to do INSERT #temp EXEC sp_help_job will result in an error due to nested INSERT…EXEC statements. As a result, people often turn to calling the unsupported xp directly.

I never actually knew about this procedure; I always just wrote queries against the msdb tables.  Andy’s solution is more elegant than what I normally come up with, though.
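For what it’s worth, here is roughly the sort of msdb query I mean: a job with a start time but no stop time in the current Agent session is treated as currently running.

    SELECT j.name,
           ja.start_execution_date
    FROM msdb.dbo.sysjobactivity AS ja
        INNER JOIN msdb.dbo.sysjobs AS j
            ON j.job_id = ja.job_id
    WHERE ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions)
        AND ja.start_execution_date IS NOT NULL
        AND ja.stop_execution_date IS NULL;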
