Day: April 29, 2019

Data Layout in R with cdata

John Mount takes us through a few sample problems and how to reshape data with cdata:

This may seem like a lot of steps, but it is only because we are taking the problems very slowly. The important point is that we want to minimize additional problem solving when applying the cdata methodology. Usually when you need to transform data you are in the middle of some other more important task, so you want to delegate the details of how the layout transform is implemented. With cdata the user is not asked to perform additional puzzle solving to guess a sequence of operators that may implement the desired data layout transform. The cdata solution pattern is always the same, which can help in mastering it.

With cdata, record layout transforms are simple R objects with detailed print() methods, so they are convenient to alter, save, and re-use later. The record layout transform also documents the expected columns and constants of the incoming data.
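To make the pattern concrete, here is a minimal sketch of a cdata layout transform. This is my own illustration rather than anything from John's post: the example data frame and column names are invented, and the argument names follow the cdata documentation as best I recall, so check the package help if they have drifted.

library(cdata)

# Wide ("row record") data: one row per student.
d <- data.frame(
  student = c("alice", "bob"),
  math    = c(90, 75),
  reading = c(85, 95)
)

# Move to a tall ("block record") layout: one row per student/subject pair.
tall <- unpivot_to_blocks(
  d,
  nameForNewKeyColumn   = "subject",
  nameForNewValueColumn = "score",
  columnsToTakeFrom     = c("math", "reading")
)

# And back again to the wide layout.
wide <- pivot_to_rowrecs(
  tall,
  columnToTakeKeysFrom   = "subject",
  columnToTakeValuesFrom = "score",
  rowKeyColumns          = "student"
)

John's larger point is that the transform itself can be kept as a reusable, printable R object rather than re-derived each time you need it.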

Check it out.

Exploratory Analysis of Earthquake Data

Giorgio Garziano walks us through an earthquake data set:

Boxplots for each quantitative variable are shown. We take advantage of the quantitative variable names (quantitative_vars) determined before to apply a ggplot2-based boxplot function. The Y axis labeling and title are determined by the variable to be plotted. Further, the legend is not displayed, and we adopt the coordinate flip option for improved readability.
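As a rough illustration of that approach (my sketch, not Giorgio's code), the snippet below loops over the quantitative variable names and builds one flipped, legend-free boxplot per variable. The data frame and quantitative_vars vector here are stand-ins for the article's earthquake data.

library(ggplot2)

# Stand-in data; the article works with an earthquake data set and derives
# quantitative_vars from it.
quakes_df <- data.frame(
  magnitude = rnorm(100, mean = 4.5, sd = 0.6),
  depth     = runif(100, min = 0, max = 700)
)
quantitative_vars <- c("magnitude", "depth")

# One boxplot per quantitative variable: title and axis label come from the
# variable name, the legend is suppressed, and coord_flip() turns it sideways.
boxplot_for <- function(df, var) {
  ggplot(df, aes(x = "", y = .data[[var]])) +
    geom_boxplot(fill = "steelblue", show.legend = FALSE) +
    coord_flip() +
    labs(title = paste("Boxplot of", var), x = NULL, y = var)
}

plots <- lapply(quantitative_vars, function(v) boxplot_for(quakes_df, v))
# print(plots[[1]]) to display any one of them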

Check it out to get an idea of how to do exploratory data analysis.

Azure SQL Linux VM Configuration with dbatools

Rob Sewell walks us through configuring SQL Server on an Azure VM running Linux, installing PowerShell, and using dbatools:

I had set the Network security rules to accept connections only from my static IP using variables in the Build Pipeline. I use MobaXterm as my SSH client. It’s a free download. I click on Sessions.

There wasn’t much I could excerpt here, but this is a heavily screenshot-driven tutorial.

Defining Data Egress Charges with Azure Query Editor

Dave Bland looks into what constitutes data egress with Azure and looks at a specific example:

If you have attempted to calculate the price of your Azure environment, you know that the pricing can be complex, taking into account a number of factors. These factors include data egress, compute, and storage. The intent of this blog post is not to outline all the billing factors of Azure; its purpose is to answer one question.

When I use the Azure SQL Query Editor, does that count as part of the data egress charges?

I completed a blog post on the Azure Query Editor a few weeks back. Here is a link, if you would like to check it out.

Read on to see what Dave learned.

Cancelling Power Query Refreshes

Imke Feldmann shares how you can stop a data refresh in Power Query without losing the work you’ve done in the designer:

If you’re working with large data or complex queries that take a long time to refresh, cancelling one of those refreshes can take even longer, especially if the query has already been running for quite some time.

Luckily, there is an easy trick to cancel the refresh without losing the work you’ve done already.

Read on to see how.

Linked Servers and Remote Insertion

Max Vernon recommends a pull rather than push model when you need to insert cross-server:

Linked Servers offer a great way to connect two SQL Servers together, allowing remote querying and DML operations. Frequently, this is used to copy data from production to reporting. However, the temptation is to run the copy operation on the production, or source side. If you do that, even with a single INSERT INTO statement, SQL Server will process each individual row as a discrete INSERT INTO statement via a cursor operation. This makes for very slow inserts across a linked server. Running the operation from the destination server means SQL Server can simply query the remote source table for all the rows, inserting them as a set into the destination table. The difference in speed can be eye-watering.
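For illustration, here is a minimal sketch of the pull pattern Max describes, written to run on the destination (reporting) server. The linked server, database, and table names are placeholders, not taken from his post.

-- Run on the DESTINATION server: the remote table is read as a set over the
-- linked server and inserted locally, instead of the source side pushing
-- rows one at a time through a cursor.
INSERT INTO dbo.SalesOrders_Reporting (OrderID, OrderDate, Amount)
SELECT OrderID, OrderDate, Amount
FROM [ProdLinkedServer].SalesDB.dbo.SalesOrders;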

Click through for a slightly creepy picture and a less creepy example.
