Real-Time Power BI Dashboards

Reza Rad builds a real-time dashboard with Stream Analytics and Power BI:

IoT devices or applications can pass their data to Azure Event Hub, and Azure Event Hub can be used as an input to Azure Stream Analytics (a data streaming service in Azure). Azure Stream Analytics can then pass the data from the input to outputs, based on queries. If Power BI is used as an output, then a dataset will be generated in Power BI that can be used for a real-time dashboard.

As a result, any time a new data point from an application or IoT device comes through Event Hubs and then Stream Analytics, the Power BI dashboard will automatically update with the new information.
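
As a minimal sketch of the Stream Analytics piece, a query like the following would read from an Event Hub input and write a windowed aggregate to a Power BI output (the input/output aliases and the DeviceId and Temperature fields here are hypothetical):

    -- Read from an Event Hub input alias and push a rolling aggregate
    -- to a Power BI output alias (all names are placeholders).
    SELECT
        DeviceId,
        AVG(Temperature) AS AvgTemperature,
        System.Timestamp AS WindowEnd
    INTO
        [PowerBIOutput]
    FROM
        [EventHubInput]
    GROUP BY
        DeviceId,
        TumblingWindow(second, 10)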

This is a pretty nice weekend project.

Lubridate Updates

Hadley Wickham reports on a Lubridate update:

  • Date-time rounding (with round_date(), floor_date(), and ceiling_date()) now supports unit multipliers, like “3 days” or “2 months”:

    ceiling_date(ymd_hms("2016-09-12 17:10:00"), unit = "5 minutes")
    #> [1] "2016-09-12 17:10:00 UTC"

If you handle date and time data in R, Lubridate is a tremendous asset.

Notebook Practices

Jonathan Whitmore has good practices for managing Jupyter notebooks:

Here’s an example of how we use git and GitHub. One beautiful new feature of GitHub is that it now renders Jupyter Notebooks automatically in repositories.

When we do our analysis, we do internal reviews of our code and our data science output, using a traditional pull-request approach. When issuing pull requests, however, the differences between updated .ipynb files are not rendered in a helpful way. One solution people tend to recommend is to commit the conversion to .py instead. This is great for seeing the differences in the input code (while jettisoning the output). However, when reviewing data science work, it is also incredibly important to see the output itself.
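
For what it’s worth, the .py conversion mentioned above is typically a one-liner with nbconvert (the notebook name here is a placeholder):

    # Convert a notebook to a .py script for friendlier diffs (placeholder filename)
    jupyter nbconvert --to script analysis.ipynb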

So far, I’ve treated notebooks more as presentation media and used tools like RStudio for tinkering.  This shifts my priors a bit.

Restoring All Databases

Kevin Hill builds a script to reload all databases at once:

We are doing a major upgrade this weekend, so like any good DBA, I have planned for a full backup the night before and needed the ability to quickly restore if it goes sideways and needs to roll back.

The catch is that there are close to 300 identical databases involved.

This is easy enough to do if you know where info is stored in MSDB related to the last backup.

Click through for the script.
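
As a rough sketch of the msdb piece (this is not Kevin’s actual script), the most recent full backup file for each database is available from the backup history tables:

    -- Most recent full backup file per database, from msdb's backup history.
    SELECT
        bs.database_name,
        bmf.physical_device_name,
        bs.backup_finish_date
    FROM msdb.dbo.backupset bs
    JOIN msdb.dbo.backupmediafamily bmf
        ON bs.media_set_id = bmf.media_set_id
    WHERE bs.type = 'D'  -- 'D' = full database backup
      AND bs.backup_finish_date =
          (SELECT MAX(b2.backup_finish_date)
           FROM msdb.dbo.backupset b2
           WHERE b2.database_name = bs.database_name
             AND b2.type = 'D');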

Getting On The Biml Horse

Bill Fellows gives three patterns for Biml adoption:

Metadata driven

This approach “puts it all together.” From bottom to top, we take the Biml files we developed in the Cyclical phase and make them into “patterns.” That doesn’t have to be a complex endeavor; it could be as simple as putting a variable in for the package name.
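
To make that last point concrete, a minimal BimlScript sketch (the variable and package name are hypothetical) might look like this:

    <#@ template language="C#" #>
    <# var packageName = "LoadCustomers"; // hypothetical; driven by metadata in practice #>
    <Biml xmlns="http://schemas.varigence.com/biml.xsd">
      <Packages>
        <Package Name="<#= packageName #>" ConstraintMode="Linear" />
      </Packages>
    </Biml>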

This is a nice stab at an organizational Biml maturity model.

Azure SQL Database Size Quotas

Dimitri Furman discusses the MAXSIZE property on an Azure SQL Database:

Customers can use this ability to allow scaling down to a lower service objective, when otherwise scaling down wouldn’t be possible because the database is too large.

While this capability is useful for some customers, the fact that the actual size quota for the database may be different from the maximum size quota for the selected service objective can be unexpected, particularly for customers who are used to working with the traditional SQL Server, where there is no explicit size quota at the database level. Exceeding the unexpectedly low database size quota will prevent new space allocations within the database, which can be a serious problem for many types of applications.
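
For reference, the quota in question is the MAXSIZE property, which you can set and check with T-SQL like the following (a sketch; MyDb is a placeholder name):

    -- Set an explicit size quota below the service objective's maximum.
    ALTER DATABASE [MyDb] MODIFY (MAXSIZE = 10 GB);

    -- Check the current quota (Azure SQL Database).
    SELECT DATABASEPROPERTYEX('MyDb', 'MaxSizeInBytes') AS MaxSizeInBytes;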

One more thing to think about, I suppose.

Trained Python Models

Koos van Strien wants to bring a trained Python model into Azure ML:

The path for bringing a trained model from the local Python/Anaconda environment to Azure ML is roughly as follows:

  1. Export the trained model

  2. Zip the exported files

  3. Upload to the Azure ML environment

  4. Embed in your Azure ML solution
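
As a sketch of step 1 for a scikit-learn model (one common approach, not necessarily the exact method Koos uses), pickling produces a file you can then zip and upload:

    # Serialize a trained scikit-learn model with pickle so the resulting
    # file can be zipped and uploaded to Azure ML.
    import pickle
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression

    X, y = load_iris(return_X_y=True)
    model = LogisticRegression().fit(X, y)

    with open("model.pkl", "wb") as f:
        pickle.dump(model, f)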

Click through to see the details.  Koos did a great job making it look easy.

Bubble Charts

Devin Knight continues his custom visuals series with the bubble chart:

  • Each category is defined as a bubble on the visual.

  • The size of the bubble is defined by a measure.

  • The bubble chart does allow for cross-filtering, meaning you can click on a bubble in the chart and it will filter other report items.

This is a nice eye-catching visual, like you might see in a news article.

Restoring An Azure SQL Database

Arun Sirpal discusses ways to restore a database within Azure SQL Database:

You won’t have the ability to use the same name for the restoring database and the database that you want to replace; if you try, you get the screenshot below. To get around this, I think you would need to drop the old one once the new one has restored, then do a rename.
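
That workaround translates to something like the following, once the copy has finished restoring (a sketch with placeholder names):

    -- Drop the original, then rename the restored copy into its place.
    DROP DATABASE [MyDb];
    ALTER DATABASE [MyDb_restored] MODIFY NAME = [MyDb];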

This is a big difference compared to the on-prem version, so be sure to practice this before you find yourself in a crisis.

Migrating To Azure SQL Database

Mike Fal discusses BACPACs, DACPACs, and migrating on-prem databases to Azure SQL Database:

SQL Server Data Tools (SSDT) have always had a process to extract your database. There are two types of extracts you can perform:

  • DACPAC – A binary file that contains the logical database schema and possibly the data. This file retains the platform version of the database (i.e. 2012, 2014, 2016).

  • BACPAC – A binary file that contains the logical database schema and the data as insert statements. This stores the platform version, but is not locked into it.

Mike also walks through SqlPackage.exe.
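
For example, exporting a BACPAC from the command line looks roughly like this (server, database, and file names are placeholders):

    :: Export a BACPAC from a local database
    SqlPackage.exe /Action:Export /SourceServerName:localhost /SourceDatabaseName:MyDb /TargetFile:MyDb.bacpac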
