
Category: Cloud

TempDB In Azure IaaS

Jim Donahoe shows how to use the temporary SSD on an Azure VM for SQL Server’s tempdb:

Remember, this disk is as the title of this section says…TEMPORARY! Do NOT put ANYTHING on this drive you cannot afford to lose. Don’t say nobody warned you either, because the drive itself contains a nice little txt file warning you; here is the EXACT text:

WARNING : THIS IS A TEMPORARY DISK.

Any data stored on this drive is SUBJECT TO LOSS and THERE IS NO WAY TO RECOVER IT.

Please do not use this disk for storing any personal or application data.

It’s good to see what you need to do to get this working. I’ve found it easier just to set up a permanent SSD, but if you’re on a tight budget, this can save you some cash.
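
Here’s a rough sketch of the idea in PowerShell (this isn’t Jim’s exact script; the logical file names are the SQL Server defaults and D:\TempDB is just an example folder):

    # Create a folder on the temporary D: drive and repoint the tempdb files at it.
    # The temp disk is wiped on deallocation or reimage, so in practice you also need
    # a startup task that recreates this folder before the SQL Server service starts.
    Import-Module SqlServer
    New-Item -Path 'D:\TempDB' -ItemType Directory -Force | Out-Null

    Invoke-Sqlcmd -ServerInstance 'localhost' -Query "ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = N'D:\TempDB\tempdb.mdf');"
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query "ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = N'D:\TempDB\templog.ldf');"

    # tempdb is recreated in the new location the next time the service starts.
    Restart-Service -Name 'MSSQLSERVER' -Force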


Assigning Tags In Azure

Melissa Coates shows how to assign tags to resources in Azure using PowerShell:

You can assign tags for resource groups, as well as individual resources which support Azure Resource Manager. The individual resources do not automatically inherit tags from the resource group parent. A maximum of 15 key/value pairs can be assigned (though you could store concatenated values or embedded JSON in a single tag value as a workaround). You may want to assign tags at just the resource group level, and use custom queries to “inherit” at the resource level. Alternatively, you may want to assign tags to the individual resources directly, particularly if you want to see them clearly on the standard “download usage” billing report.

Since the key/value pairs are just free-form text, watch out for uniformity issues. To improve consistency, you can utilize policies to require tags and/or apply defaults if you’d like (for example, you might want to enforce a “Created By” tag). Tags can be set in the ARM template when you initially deploy a resource (which is best so that no billing occurs without proper tagging), or afterwards to existing resources via the portal, PowerShell, or CLI.

Melissa also shows how to query those tags using PowerShell.
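
As a quick sketch of what that looks like (assuming a recent AzureRM module; the resource group name and tag values are made up), assigning and then querying tags is only a couple of cmdlets:

    # Assign a set of tags to a resource group.  Note that Set-AzureRmResourceGroup
    # replaces the existing tag collection, so merge first if you need to keep old tags.
    $tags = @{ 'Environment' = 'Dev'; 'CreatedBy' = 'jsmith'; 'CostCenter' = '1234' }
    Set-AzureRmResourceGroup -Name 'rg-analytics' -Tag $tags

    # Find resources carrying a given tag.
    Find-AzureRmResource -TagName 'Environment' -TagValue 'Dev' |
        Select-Object Name, ResourceType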


Understanding Azure Billing

Denny Cherry explains some of the oddities behind Azure billing:

Some things stand out to me when looking at this bill. First is the service that is being billed. It doesn’t say how many DWUs we’re being billed for (more on that later).

Something else odd is that the SQL DW shows being billed for 1220 hours. Now, in theory, the maximum number of hours any single service can be billed for in a single month is 744, as that’s 24 (hours) * 31 (days), but here we’re being billed for 1220 hours. Now we know for a fact that this SQL DW is powered off at times, so how are we using this resource for more than 31 days in a month?

The last oddity is the rate. $1.46 an hour. This points us to the reason for all of these strange numbers. That’s the rate to run a SQL DW at DW100 for one hour.

Click through for Denny’s explanation.
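
To make the arithmetic concrete, here is a hypothetical illustration (my numbers, not Denny’s): if the meter is normalized to DW100-hours, each clock hour at a higher DWU level counts as several billed hours.

    # Hypothetical usage: DW400 for 200 clock hours, then DW100 for 420 clock hours.
    $usage = @(
        @{ Dwu = 400; ClockHours = 200 },
        @{ Dwu = 100; ClockHours = 420 }
    )
    $dw100Hours = ($usage | ForEach-Object { ($_.Dwu / 100) * $_.ClockHours } |
        Measure-Object -Sum).Sum
    "Billed DW100-hours: $dw100Hours"   # 800 + 420 = 1220, from only 620 clock hours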


Data Lake Tools For VS Code Updated

Jenny Jiang announces Azure Data Lake Tools for Visual Studio Code’s July update:

Local Debug enables you to debug your C# code-behind, step through the code, and validate your script locally before submitting to ADLA.

  • Use the command ADL: Start Local Run Service to start the local run service and set a breakpoint in your code-behind, then click the command ADL: Local Debug to start the local debug service. You can debug through the debug console and view parameter, variable, and call stack information.

Click through to see the other improvements.


Azure SQL Database Security Basics

Arun Sirpal explains some of the security features exposed in Azure SQL Database:

You do not really have to use all of them; this is down to you and your requirements, but at least you have decisions to make. I will mention TDE (Transparent Data Encryption), as I know a lot of people will opt for TDE in Azure. The big advantage of TDE in Azure over the earthed flavour is that Microsoft does a lot of the work for you, especially around the key management side of things. Also, assuming your database is in a geo-replication partnership, it will be protected by a different key on each of the servers. Microsoft will also rotate your certificate at least every 90 days; doing this with a locally-based SQL Server can be quite manual and fiddly (well, I think it is).

Read the whole thing if you’re thinking of moving forward with Azure SQL Database, or if you already have a database up in Azure and haven’t checked the latest offerings yet.
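
If you want to poke at TDE yourself, a minimal sketch with the AzureRM.Sql cmdlets looks like this (the resource group, server, and database names are placeholders):

    # Check the current TDE state of an Azure SQL Database...
    Get-AzureRmSqlDatabaseTransparentDataEncryption -ResourceGroupName 'rg-sql' `
        -ServerName 'myazuresqlserver' -DatabaseName 'SalesDb'

    # ...and turn it on if it isn't already enabled.
    Set-AzureRmSqlDatabaseTransparentDataEncryption -ResourceGroupName 'rg-sql' `
        -ServerName 'myazuresqlserver' -DatabaseName 'SalesDb' -State Enabled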


Streaming ETL Using CDC And Event Hub

Rolf Tesmer combines Change Data Capture and Event Hubs to build a streaming ETL solution:

The solution picks up the SQL data changes from the CDC Change Tracking system tables, creates JSON messages from the change rows, and then posts the message to an Azure Event Hub.  Once landed in the Event Hub an Azure Stream Analytics (ASA) Job distributes the changes into the multiple outputs.

What I found pretty cool was that I could transmit SQL delta changes from source to target in as little as 5 seconds end to end!

There are a bunch of steps, but the end result is worth it.
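
The SQL Server side of that pipeline starts with Change Data Capture; as a minimal sketch (the database and table names here are made up), enabling it looks like this:

    # Enable CDC on the source database and on one table; the capture job then writes
    # change rows into the CDC change tables, which the solution reads and posts on to Event Hub.
    Import-Module SqlServer

    Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'SourceDb' `
        -Query 'EXEC sys.sp_cdc_enable_db;'

    Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'SourceDb' `
        -Query "EXEC sys.sp_cdc_enable_table @source_schema = N'dbo', @source_name = N'Orders', @role_name = NULL;"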


Loan Chargeoff Templates

Ajay Jagannathan announces a couple of new Cortana Intelligence Solutions Gallery templates:

For more information, read this blog: End to End Loan ChargeOff Prediction Built Using Azure HDInsight Spark Clusters and SQL Server 2016 R Service

We have published two solution templates, deployable using two technology stacks, for the above chargeoff scenario:

  1. Loan Chargeoff Prediction using SQL Server 2016 R Services – Using a DSVM with SQL Server 2016 and Microsoft ML, this solution template walks through how to create and clean up a set of simulated data, train five different models, select the best-performing model, perform scoring using that model, and save the prediction results back to SQL Server. A Power BI report connects to the prediction table and shows the user interactive reports on the chargeoff prediction.

  2. Loan Chargeoff Prediction using HDInsight Spark Clusters – This solution demonstrates how to develop machine learning models for predicting loan chargeoff (including data processing, feature engineering, training and evaluating models), deploy the models as a web service (on the edge node), and consume the web service remotely with Microsoft R Server on Azure HDInsight Spark clusters. The final predictions are saved to a Hive table, which can be visualized in Power BI.

These tend to be nice because they show you how the different pieces of the Azure stack tie together.


S3Guard

Mingliang Liu and Rajesh Balamohan explain why you shouldn’t use S3 as your primary Hadoop data store, as well as a tool which helps mitigate those problems:

Some of the real world use cases which can be impacted due to the S3 eventual consistency model are:

  1. Listing Files. Newly created files might not be visible for data processing. In Hive, Spark and MapReduce, this can lead to erroneous results from incomplete source data or failure to commit all intermediate results.

  2. ETL Workflow. Systems like Oozie rely on marker files to trigger the subsequent workflows. Any delay in the visibility of these files can lead to delays in the subsequent workflows.

  3. Existence-guarded path operations. Any action which fails if the destination path is present may see a deleted file in a listing, and so fail — even though the file has already been deleted.

Read on to see how S3Guard works and how to enable it in HDP 2.6.


Running H2O In R On Azure HDInsight

Daisy Deng shows how to configure HDInsight to be able to run the H2O package in R rather than Python or Scala:

We provide a few script actions for installing rsparkling on Azure HDInsight. When creating the HDInsight cluster, you can run the following script action for the head node:

https://bostoncaqs.blob.core.windows.net/scriptaction/scriptaction-head.sh

And run the following action for the worker node:

https://bostoncaqs.blob.core.windows.net/scriptaction/scriptaction-worker.sh

Please consult Customize Linux-based HDInsight clusters using Script Action for more details.

Click through for the full process.
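
If you prefer to script it, something like this should apply those same scripts to a running cluster (a sketch assuming the AzureRM.HDInsight module’s Submit-AzureRmHDInsightScriptAction cmdlet; the cluster and resource group names are placeholders, and note that Daisy’s walkthrough applies the scripts at cluster creation instead):

    # Run the head node and worker node scripts as persisted script actions.
    $rg      = 'rg-hdinsight'
    $cluster = 'my-spark-cluster'

    Submit-AzureRmHDInsightScriptAction -ResourceGroupName $rg -ClusterName $cluster `
        -Name 'rsparkling-head' `
        -Uri 'https://bostoncaqs.blob.core.windows.net/scriptaction/scriptaction-head.sh' `
        -NodeTypes HeadNode -PersistOnSuccess

    Submit-AzureRmHDInsightScriptAction -ResourceGroupName $rg -ClusterName $cluster `
        -Name 'rsparkling-worker' `
        -Uri 'https://bostoncaqs.blob.core.windows.net/scriptaction/scriptaction-worker.sh' `
        -NodeTypes WorkerNode -PersistOnSuccess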


Bacpacing In Azure

Derik Hammer shows how to use a bacpac file to deploy an existing database to Azure SQL Database:

The recommended method for working with Azure is always PowerShell. The Azure portal and SSMS are tools there for your convenience but they do not scale well. If you have multiple databases to migrate, potentially from multiple servers, using PowerShell will be much more efficient. Scripting your Azure work makes it repeatable and works towards the Infrastructure as Code concept.

In this demonstration, the below steps will be used.

  1. Export the bacpac file to a local directory with sqlpackage.exe.

  2. Copy the bacpac to Azure Blob Storage with AzCopy.exe.

  3. Use the PowerShell AzureRM module and cmdlets to create an Azure SQL Database from the bacpac file.

Derik shows the point-and-click way as well as the PowerShell way.
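
Here’s a rough sketch of those three steps (the server names, storage account, and key are all placeholders, and it assumes sqlpackage.exe, AzCopy.exe, and the AzureRM module are installed):

    $storageKey = '<storage account key>'

    # 1. Export the database to a local bacpac with sqlpackage.exe.
    & 'sqlpackage.exe' /Action:Export /SourceServerName:localhost `
        /SourceDatabaseName:SalesDb /TargetFile:C:\Temp\SalesDb.bacpac

    # 2. Copy the bacpac up to Azure Blob Storage with AzCopy.exe.
    & 'AzCopy.exe' /Source:C:\Temp /Dest:https://mystorage.blob.core.windows.net/bacpacs `
        /DestKey:$storageKey /Pattern:SalesDb.bacpac

    # 3. Import the bacpac into a new Azure SQL Database with the AzureRM cmdlets.
    $cred = Get-Credential   # SQL admin login for the target logical server
    New-AzureRmSqlDatabaseImport -ResourceGroupName 'rg-sql' -ServerName 'myazuresqlserver' `
        -DatabaseName 'SalesDb' -StorageKeyType 'StorageAccessKey' -StorageKey $storageKey `
        -StorageUri 'https://mystorage.blob.core.windows.net/bacpacs/SalesDb.bacpac' `
        -AdministratorLogin $cred.UserName -AdministratorLoginPassword $cred.Password `
        -Edition 'Standard' -ServiceObjectiveName 'S2' -DatabaseMaxSizeBytes 268435456000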
