Press "Enter" to skip to content

Author: Kevin Feasel

Capturing SQL Server Perfmon Counters

Andy Galbraith shows how to collect and store Perfmon counters:

As you can see, Page Life Expectancy (PLE) on this graph dips, gradually climbs, and then dips again.  With a collection every five minutes you may not catch the exact peak – all you know is that the PLE was 50,000 at 12:55am and then only 100 at 1:00am on 03/13.  It may have climbed higher than that before it dipped, but by 1:00am it had dipped down to around 100 (coincidentally at 1am the CheckDB job had kicked off on a large database).

If you really need to know (in this example) exactly how high PLE gets before it dips, or exactly how low it dips, or at what specific time it valleys or dips, you need to actively watch or set up a collector with a more frequent collection.  You will find that in most cases this absolute value isn’t important – it is sufficient to know that a certain item peaks/valleys in a certain five minute interval, or that during a certain five minute interval (“The server was slow last night at 3am”) a value was in an acceptable/unacceptable range.

Andy also shares the set of counters he collects by default and shows how to set up automated counter collection.  Integrating that into an administrator's workflow is left to the reader.
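
For a quick spot check between collection intervals, the same counter is also visible from inside SQL Server through sys.dm_os_performance_counters.  This is just a convenience query, not part of Andy's collector:

    -- Current Page Life Expectancy, in seconds, straight from the DMV.
    -- Loosen the object_name filter if you also want the per-NUMA-node
    -- (Buffer Node) values.
    SELECT [object_name],
           counter_name,
           cntr_value AS ple_seconds
    FROM sys.dm_os_performance_counters
    WHERE counter_name = N'Page life expectancy'
      AND [object_name] LIKE N'%Buffer Manager%';

Logging that value to a table on a schedule gives you a rough in-engine complement to the Perfmon collector, though it's no substitute for the full set of counters Andy recommends.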

Analysis Services Performance Monitoring

Bill Anton has a cheat sheet for SSAS performance monitoring extended events:

In this post we ran through the list of Extended Events that you’ll want to collect along with an explanation of the type of performance-related information you can derive from them. Depending on the type of instance you’re monitoring, we’re really only talking about 5 (multidimensional) or 6 (tabular) events that need to be captured.

If you are in charge of administering Analysis Services cubes, Bill’s series is an excellent way of making sure you’re keeping up on good monitoring practices.

Azure SQL Database Security

James Serra has a number of links to Azure SQL Database security features:

Life would be so much easier if we could just trust everyone, but since we can’t, we need solid security for our databases.  Azure SQL Database has many security features to help you sleep well at night:

Most of these work exactly the same as in the on-premises product (at least the SQL Server 2016 version), but it goes to show that Azure SQL Database has grown up quite a bit.
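
As one illustration of that overlap, Row-Level Security is configured identically in Azure SQL Database and SQL Server 2016.  Here is a minimal sketch with a made-up Sales table and SalesRep column; none of this comes from James's post:

    -- Hypothetical example: each sales rep sees only their own rows.
    CREATE TABLE dbo.Sales
    (
        OrderId  int IDENTITY PRIMARY KEY,
        SalesRep sysname NOT NULL,
        Amount   money NOT NULL
    );
    GO
    CREATE SCHEMA Security;
    GO
    CREATE FUNCTION Security.fn_SalesRepFilter (@SalesRep sysname)
    RETURNS TABLE
    WITH SCHEMABINDING
    AS
    RETURN
        SELECT 1 AS AllowAccess
        WHERE @SalesRep = USER_NAME()
           OR USER_NAME() = N'SalesManager';  -- assumed manager account
    GO
    CREATE SECURITY POLICY Security.SalesRepPolicy
        ADD FILTER PREDICATE Security.fn_SalesRepFilter(SalesRep) ON dbo.Sales
        WITH (STATE = ON);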

Documenting Permissions

Chris Bell has a new script to audit SQL Server permissions:

I wanted the script to do a few things. Tell me who is in an AD group that was granted rights. It is one thing to see the group name, but that doesn’t really tell me who has access. I also wanted the output to be a little more user-readable, so I formatted the output some. There are other things I did too, but you can have fun reading through the code and comparing the 2 sources.

Knowing who’s allowed to do what is key to a successful security posture.  This script won’t tell you object-level permissions, but it at least gives you an idea of role and group membership.
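
If you only need the quick-and-dirty version of those two pieces, expanding a Windows group and listing database role membership, something along these lines works; the group name is a placeholder, and Chris's script goes well beyond this:

    -- Expand the members of a Windows group that has a SQL Server login
    -- (group name below is made up).
    EXEC master..xp_logininfo @acctname = N'CONTOSO\SQLAdmins', @option = 'members';

    -- Database role membership in the current database.
    SELECT r.name AS role_name,
           m.name AS member_name,
           m.type_desc
    FROM sys.database_role_members AS drm
    JOIN sys.database_principals AS r ON r.principal_id = drm.role_principal_id
    JOIN sys.database_principals AS m ON m.principal_id = drm.member_principal_id
    ORDER BY r.name, m.name;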

CSVs With Line Breaks In Power BI

Chris Webb talks about one of the banes of my existence:  flat files with “rogue” line breaks:

But what if you can’t fix the source data? I was asked this question the other week, and since I had been asked about it before and not come up with a good answer, I decided to spend some time researching the problem.

What I found was that it was relatively easy to write some M code that gave me the correct results, but very hard to write code that performed acceptably well on a large data set (I was testing on a CSV file containing almost half a million rows). Here’s the code for the function I ended up with:

It’s nice to see that Power Query & Power BI have methods to get around this sort of issue, but it sounds like even those methods are limited in value.

Database Maintenance

SQLWayne describes his maintenance routines:

The most critical thing as a SQL Server DBA is to ensure that your databases can be restored in the event of the loss of a server for whatever reason: disk crash, fire in the server room, tribble invasion, whatever.  To do this, not only do you have to back up your databases, you also have to test restores!  Create a database and restore the backups of your production DB to it.  It’s the safest way to make sure that everything works.  This test restore can be automated to run every night, but that’s outside the scope of what I want to talk about right now.

There are lots of places where problems can creep in; this is just one part of how you’ll need to monitor your systems.  This is how I’ve done things for a number of years, and thus far it has served me well.

Depending upon your instance count, average database size, maintenance windows, and so on, some of these things may or may not work for you, but the principle is the same: protect the data, and automate the processes that protect it.  This is a good article to read for ideas; from there, dig into other administrative blog posts, videos, and books to become better versed in the tools and techniques available to protect your data.
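
The test restore he mentions is straightforward to script.  A minimal sketch, with made-up paths and logical file names, might look like this:

    -- Restore last night's full backup under a throwaway name, then check it.
    -- Paths, file names, and logical file names below are assumptions.
    RESTORE DATABASE ProdDB_RestoreTest
    FROM DISK = N'\\BackupShare\ProdDB\ProdDB_Full.bak'
    WITH MOVE N'ProdDB'     TO N'D:\Data\ProdDB_RestoreTest.mdf',
         MOVE N'ProdDB_log' TO N'L:\Log\ProdDB_RestoreTest.ldf',
         REPLACE, STATS = 10;

    -- Quick sanity check that the restored copy is consistent.
    DBCC CHECKDB (ProdDB_RestoreTest) WITH NO_INFOMSGS;

Wrap that in an Agent job and you have the automated nightly test restore described above.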

Luhn Testing In T-SQL

Phil Factor shows us the Luhn algorithm, a quick test to determine if a credit card number is potentially valid:

There are many ways of doing it in SQL (and Rosetta Code is a good place to view solutions in various other languages). I believe that Peter Larsson holds the record for the fastest calculation of the Luhn test for a sixteen-digit credit card, with this code. As it stands, it isn’t a general solution, but it can be modified for different lengths of bank card.

Phil has two interesting T-SQL functions in the code and wants to find more.
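
For reference, here is a plain (and deliberately un-optimized) T-SQL take on the check; this is the textbook algorithm rather than Peter Larsson's code: working from the rightmost digit, double every second digit, subtract nine from any result over nine, and test whether the total is divisible by ten.

    -- A simple scalar-function Luhn check; expects digits only, any length.
    CREATE FUNCTION dbo.IsLuhnValid (@CardNumber varchar(25))
    RETURNS bit
    AS
    BEGIN
        DECLARE @sum int = 0,
                @pos int = LEN(@CardNumber),
                @double bit = 0,  -- the rightmost digit is not doubled
                @digit int;

        WHILE @pos >= 1
        BEGIN
            SET @digit = CAST(SUBSTRING(@CardNumber, @pos, 1) AS int);
            IF @double = 1
            BEGIN
                SET @digit = @digit * 2;
                IF @digit > 9 SET @digit = @digit - 9;
            END;
            SET @sum = @sum + @digit;
            SET @double = 1 - @double;
            SET @pos = @pos - 1;
        END;

        RETURN CASE WHEN @sum % 10 = 0 THEN 1 ELSE 0 END;
    END;
    GO
    -- 4111111111111111 is a well-known test number that passes the check.
    SELECT dbo.IsLuhnValid('4111111111111111') AS is_valid;  -- returns 1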

Using The Import-Export Wizard

James Anderson uses the Import-Export wizard to import data:

Whilst working on an upcoming post I realised that I needed a decent data set on which to test some of the new visualisations in SSRS 2016. I remembered this post on open data sets by Kendra Little that mentioned Data.gov. There are a lot of different data sets here but I settled on a set describing social media usage by organisations in New York.

The Import-Export wizard is nice for these types of one-off data loads—it’s the gateway into SSIS.

Unit Testing A Function

Steve Jones walks through a practical example of unit testing T-SQL with tSQLt:

However I wanted to add some tests. Does this really work? What if I don’t have a backslash? I thought the best way to do this was with a few tSQLt tests, which I quickly built. The entire process was 5-10 minutes, which isn’t a lot longer than if I had been running random tests myself with a variety of strings.

The advantage of tests is that if I come up with a new case, or another potential bug, I copy the test over, change the string and I have a new test, plus all the regressions. I’m not depending on my memory to run the test cases.

I first put the code in a function, which makes it easier to test.

tSQLt is a great tool for database unit testing.
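
The pattern is pleasantly small.  A sketch of what such tests look like, with dbo.GetFileName standing in as a hypothetical name for the function under test:

    -- dbo.GetFileName is a stand-in name for the function being tested.
    EXEC tSQLt.NewTestClass 'StringTests';
    GO
    CREATE PROCEDURE StringTests.[test returns text after the last backslash]
    AS
    BEGIN
        DECLARE @actual varchar(260) = dbo.GetFileName('C:\Backups\db.bak');
        EXEC tSQLt.AssertEqualsString @Expected = 'db.bak', @Actual = @actual;
    END;
    GO
    CREATE PROCEDURE StringTests.[test returns the whole string when no backslash exists]
    AS
    BEGIN
        DECLARE @actual varchar(260) = dbo.GetFileName('db.bak');
        EXEC tSQLt.AssertEqualsString @Expected = 'db.bak', @Actual = @actual;
    END;
    GO
    EXEC tSQLt.Run 'StringTests';

Each new edge case is a copy of one of these procedures with a different input string, which is exactly the workflow Steve describes.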
