Press "Enter" to skip to content

Day: October 22, 2018

Using cdata To Create Faceted Plots

Nina Zumel shows how to use the cdata package to create faceted ggplot2 plots:

First, load the packages and data:

library("ggplot2")
library("cdata")

iris <- data.frame(iris)

Now define the data-shaping transform, or control table. The control table is basically a picture that sketches out the final data shape that I want. I want to specify the x and y columns of the plot (call these the value columns of the data frame) and the column that I am faceting by (call this the key column of the data frame). And I also need to specify how the key and value columns relate to the existing columns of the original data frame.
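
To make that concrete, here is a minimal sketch of the idea, building on the packages and data loaded above (an illustration, not necessarily Nina’s exact code): the first column of the control table supplies the faceting key, the remaining column names become the plot’s value columns, and each cell names the iris column those values come from.

controlTable <- data.frame(
  flower_part = c("Petal", "Sepal"),
  Length      = c("Petal.Length", "Sepal.Length"),
  Width       = c("Petal.Width", "Sepal.Width"),
  stringsAsFactors = FALSE)

# reshape to one row per (flower, flower_part), carrying Species along
iris_facets <- rowrecs_to_blocks(iris, controlTable, columnsToCopy = "Species")

# facet on the key column produced by the transform
ggplot(iris_facets, aes(x = Length, y = Width, color = Species)) +
  geom_point() +
  facet_wrap(~flower_part, scales = "free")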

Read on to see how you can use cdata to tie together different faceted plots.

Generating E-Mail Alerts From Perfmon

Dave Bermingham shows how to send e-mail alerts based on Perfmon counter values:

The first thing that you need to do is write a PowerShell script that, when run, can send an email. While researching this I discovered many ways to accomplish this task, so what I’m about to show you is just one way, but feel free to experiment and use what is right for your environment.

In my lab I do not run my own SMTP server, so I had to write a script that could leverage my Gmail account. You will see in my PowerShell script that the password to the email account that authenticates to the SMTP server is in plain text. If you are concerned that someone may have access to your script and discover your password, then you will want to encrypt your credentials. Gmail requires an SSL connection, so your password should be safe on the wire, just like with any other email client.

It’s an interesting use of built-in Windows functionality to perform alerting.

Baselining Modern Versions Of SQL Server

Erin Stellato goes back over an older baselining article and gives us some updates on what we should consider for more recent versions:

Last week I got an email from a community member who had read this older article of mine on baselining, and asked if there were any updates related to SQL Server 2016, SQL Server 2017, or vNext (SQL Server 2019). It was a really good question. I haven’t visited that article in a while and so I took the time to re-read it. I’m rather proud to say that what I said then still holds up today.

The fundamentals of baselining are the same as they were back in 2012 when that article was first published. What is different about today? First, there are a lot more metrics in the current release of SQL Server that you can baseline (e.g. more events in Extended Events, new DMVs, new PerfMon counters, sp_server_diagnostics_component_results). Second, options for capturing baselines have changed. In the article I mostly talked about rolling your own scripts for baselining. If you’re looking to establish baselines for your servers you still have the option to develop your own scripts, but you can also use a third-party tool, and if you’re running SQL Server 2016+ or Azure SQL Database, you can use Query Store.
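
As a small illustration of the roll-your-own option (not from Erin’s article; the connection details and table name below are made up), a scheduled script can snapshot a DMV into a history table so you can compare values over time, for example from R:

library("DBI")

# hypothetical connection; substitute your own driver, server, and database
con <- dbConnect(odbc::odbc(),
                 Driver   = "ODBC Driver 17 for SQL Server",
                 Server   = "localhost",
                 Database = "BaselineDB",
                 Trusted_Connection = "Yes")

# capture a point-in-time snapshot of wait statistics
waits <- dbGetQuery(con, "SELECT wait_type, waiting_tasks_count, wait_time_ms
                          FROM sys.dm_os_wait_stats;")
waits$capture_time <- Sys.time()

# append the snapshot to a baseline history table
if (dbExistsTable(con, "WaitStatsBaseline")) {
  dbWriteTable(con, "WaitStatsBaseline", waits, append = TRUE)
} else {
  dbWriteTable(con, "WaitStatsBaseline", waits)
}
dbDisconnect(con)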

Read on for more details.

Calling Power BI REST API From Microsoft Flow

Chris Webb has started a series on calling Power BI’s REST API from Microsoft Flow. In Part 1, he creates a custom connector:

Playing around with Microsoft Flow recently, I was reminded of the following blog post from a few months ago by Konstantinos Ioannou about using Flow to call the Power BI REST API to refresh a dataset:

https://medium.com/@Konstantinos_Ioannou/refresh-powerbi-dataset-with-microsoft-flow-73836c727c33

I was impressed by this post when I read it, but don’t think I understood quite how many exciting possibilities this technique opens up for Power BI users until I started to use it myself. The Power BI dev team are making a big investment in the API, yet most Power BI users, myself included, are not developers and can’t easily write code (or PowerShell scripts) to call the API. With Flow, however, you can use the API without writing any code at all and solve a whole series of common problems easily. In this series of blog posts I’m going to show a few examples of this.
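
For reference, the API call that the custom connector wraps is a single HTTP POST to the dataset’s refreshes endpoint. As a rough sketch outside of Flow (the dataset ID and access token below are placeholders, and acquiring the Azure AD token is out of scope here), the same call could be issued from R with the httr package:

library("httr")

# placeholders: a real Azure AD access token and dataset GUID are required
access_token <- "<Azure AD access token>"
dataset_id   <- "<dataset GUID>"

# POST to the refreshes endpoint to queue a dataset refresh
response <- POST(
  paste0("https://api.powerbi.com/v1.0/myorg/datasets/", dataset_id, "/refreshes"),
  add_headers(Authorization = paste("Bearer", access_token)))

status_code(response)  # 202 means the refresh request was accepted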

In Part 2, Chris shows us how to automate data refreshes when source data changes:

For a while now I’ve had an idea stuck in my head: wouldn’t it be cool to build a Power BI solution where a user could enter data into an Excel workbook and then, as soon as they had done so, they could see their new data in a Power BI report? It would be really useful for planning/budgeting applications and what-if analysis. I had hoped that a DirectQuery model using the CData Excel custom connector (mentioned here) might work but the performance wasn’t good enough; using Flow with the Power BI REST API (see Part 1 of this series for details on how to get this set up) gets me closer to my goal, even if there’s still one major problem with the approach. Here’s how…

Read on for the approach as well as the major problem.

Reminder: Windows Server Still Exists

Allan Hirt reminds us that Windows Server is still an important product:

If you’ve been in hibernation, today you woke up to a world where Microsoft has embraced open source and Linux. What was once unthinkable is now happening. What is going on? Why am I even talking about this?

Since the introduction of SQL Server 2017 and the support for Linux-based deployments, I’ve had a steady stream of questions from C-levels on down to DBAs asking in essence this: “Do I need to abandon SQL Server on Windows Server and learn Linux?” I would use something stronger if this was a casual conversation, but the answer is an emphatic “NO!” SQL Server still runs just fine and is supported on Windows Server (including Windows Server 2019, which was just released). Support is not ending any time soon. Linux is just another option and there may be enhancements specific to each platform because of their differences. It’s not an “either/or” thing. So breathe, OK? If you have a use case for Linux, by all means deploy SQL Server on it.

I am on the SQL on Linux bandwagon and enjoy the path that Microsoft is forging, but Allan provides us with a much-needed tonic here.

Polling For File Existence In SQL Agent

Bill Fellows gives us an example of polling for a file’s existence using SQL Agent:

A fun question over on StackOverflow asked about using SQL Agent with SSIS to poll for a file’s existence. As the comments indicate, there’s a non-zero startup time associated with SSIS (it must validate the metadata associated with the sources and destinations), but there is a faster, lighter-weight alternative. Putting together a host of T-SQL ingredients, including undocumented extended stored procedures, the following recipe could be used as a SQL Agent job step.

If you’re going to limit yourself to just SQL Server-based solutions and running on a frequent schedule isn’t enough, this is one of the better options.
