The whole query is below. Right now, let’s just focus on the secret sauce: how the DTU percentage gets calculated. In a nutshell, the maximum of CPU, Data IO, and Log Write percent determines your DTU percentage. What does this mean to you? Your max consumer limits you. So you can be using 1% of your IO and still be slowed down, because CPU could be your max consumer resource.
That’s a rather interesting finding. I think the next step (which may be so context-dependent that it’s not possible to generalize) might be to figure out what various workloads do to the metrics, and whether there’s a way to predict the expected DTU load with reasonable accuracy given an anticipated change in workload, rather than seeing the value spike and reacting to it later.
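If you want to eyeball the max-of-three calculation yourself, here is a minimal sketch (not the linked post’s query) that reads sys.dm_db_resource_stats and derives the same percentage. It assumes the SqlServer module for Invoke-Sqlcmd, and the server, database, and credentials are placeholders.

# Minimal sketch: DTU percent is the largest of the CPU, Data IO, and Log Write percentages.
# Server, database, and credentials below are placeholders.
$query = @"
SELECT TOP (10)
       end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent,
       (SELECT MAX(x)
        FROM (VALUES (avg_cpu_percent),
                     (avg_data_io_percent),
                     (avg_log_write_percent)) AS t(x)) AS dtu_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;
"@

Invoke-Sqlcmd -ServerInstance "yourserver.database.windows.net" `
              -Database "YourDatabase" `
              -Username "youruser" -Password "yourpassword" `
              -Query $query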
Sometimes, when firing up VMs or moving VMs from the page or blob store, you’ll get an error that there is still a lease on the file. To solve this, you need to release the lease. Just waiting won’t do the trick, as the leases don’t have an expiration date.
Click through for the code.
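The linked post has the working version; purely as a hedged sketch of the shape it takes, something like the following with the Azure PowerShell storage cmdlets. The storage account, key, container, and blob names are placeholders, and cmdlet and method signatures can differ between module and client library versions.

# Sketch only: break the infinite lease on a VHD blob so it can be deleted or reattached.
# Account, key, container, and blob names are placeholders.
$ctx = New-AzureStorageContext -StorageAccountName "yourstorageaccount" `
                               -StorageAccountKey "yourstoragekey"

$blob = Get-AzureStorageBlob -Container "vhds" -Blob "your-vm-disk.vhd" -Context $ctx

# BreakLease ends the lease immediately; depending on the storage client version you
# may need to pass an explicit break period instead of relying on the defaults.
$blob.ICloudBlob.BreakLease()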
It turns out it’s pretty easy (even if it takes some time). So where to start? Well, the first thing we need is a place to put our database: an Azure SQL Database server. If you don’t already have one, creating a new one is fairly easy.
First, start at portal.azure.com. Log in and follow these steps:
This is the longer, manual process. It’s good to walk through it this way at least once before writing a PowerShell script, just to see what the script is doing.
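For comparison, the scripted equivalent looks roughly like the sketch below, assuming the AzureRM cmdlets. The resource group, server, database, location, and credentials are all placeholders, and the available service objectives depend on your subscription.

# Sketch: create a resource group, a logical SQL server, and a database.
# All names, locations, and credentials are placeholders.
New-AzureRmResourceGroup -Name "DemoRG" -Location "East US"

$cred = Get-Credential   # SQL admin login and password for the new server

New-AzureRmSqlServer -ResourceGroupName "DemoRG" `
                     -ServerName "demosqlserver123" `
                     -Location "East US" `
                     -SqlAdministratorCredentials $cred

New-AzureRmSqlDatabase -ResourceGroupName "DemoRG" `
                       -ServerName "demosqlserver123" `
                       -DatabaseName "DemoDB" `
                       -RequestedServiceObjectiveName "S0"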
So let’s get down to brass tacks and actually create an alert. To do this, we need some info first:
The Resource Group we will create the alert in.
An Azure location where the alert will live.
An Azure SQL Database server and database we are creating the alert for.
The metric we will monitor and the threshold we will check against.
(optional) An email to send an alert to.
Mike follows this up with code and shows it’s not scary at all to create these alerts from within PowerShell.
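His script is the one to follow; as a hedged sketch of the general shape only, the pieces listed above map onto something like this. The cmdlets come from the AzureRM.Insights module, parameter names vary between module versions, and every name, threshold, and email address below is a placeholder.

# Sketch: alert when DTU consumption on a database averages above 80% over a 5-minute window.
# Subscription ID, resource group, server, database, and email address are placeholders.
$resourceId = "/subscriptions/<subscription-id>/resourceGroups/DemoRG/providers/Microsoft.Sql/servers/demosqlserver123/databases/DemoDB"

# Parameter names such as -CustomEmails / -Actions differ across AzureRM.Insights versions;
# check Get-Help for the version you have installed.
$email = New-AzureRmAlertRuleEmail -CustomEmails "dba@example.com"

Add-AzureRmMetricAlertRule -Name "HighDTU" `
                           -Location "East US" `
                           -ResourceGroup "DemoRG" `
                           -TargetResourceId $resourceId `
                           -MetricName "dtu_consumption_percent" `
                           -Operator GreaterThan `
                           -Threshold 80 `
                           -WindowSize 00:05:00 `
                           -TimeAggregationOperator Average `
                           -Actions $email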
Before I started, I was already quite comfortable programming in Python and had done some R programming in the past. This turned out to be pretty handy, though not really needed to start off with, because in Azure ML the data flow can be created much as BI specialists are used to doing in SSIS.
A good place to start for me was the Tutorial competition (Iris Petal Competition). It provides you with a pre-filled workspace with everything in place to train and test your first ML model.
I’d like to see Azure ML get more traction; I’m not optimistic that it will.
This article focuses on migrating data to Azure SQL Data Warehouse with tips and techniques to help you achieve an efficient migration. Once you understand the steps involved in migration, you can practice them by following a running example of migrating a sample database to Azure SQL Data Warehouse.
Migrating your data to Azure SQL Data Warehouse involves a series of steps. These steps are executed in three logical stages: Preparation, Metadata migration, and Data migration.
It’s a lengthy read, but well worth it.
It gives you a map of how to manage your security as you move into the cloud. Note: one of the main points is that your on-premises security is equally important and has to be managed alongside, and as a part of, your cloud security.
Now, if you are like me and want more than just dry reading, they also provide a link to a Microsoft Virtual Academy training course called Security in a Cloud-Enabled World, which follows this roadmap and provides more detail and guidance.
Read the whole thing.
Before I go into detail, I want to give full kudos to Ola Hallengren (Website | @olahallengren). He has spent a lot of his time building a SQL Server Maintenance Solution that is completely free for everyone to use. And he did such an excellent job that a lot of companies (including huge companies) use his solution to run maintenance tasks on their databases.
None of the scripts below were written by me; only small changes have been made in order to make things clearer when the solution is deployed to an environment. The original scripts can be downloaded via the download page on Ola’s website.
Most of the to-dos are the same between on-premises and Azure SQL DB, but some of the implementation steps are a bit different. This is worth checking out if you have any Azure SQL Database instances.
The first thing to keep in mind is that ASDW was designed to be a cloud-based system. As such, it aims to be very flexible for resource allocation and very efficient to scale up or down. To meet those goals, the system allows you to (a PowerShell sketch of these operations follows the list):
Increase or decrease compute power, measured in Data Warehouse Units (DWUs).
Grow storage independently of compute; the two are charged separately.
Pause compute entirely, paying only for storage while it is paused.
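Here is that hedged PowerShell sketch of the three knobs, assuming the AzureRM.Sql cmdlets. Resource group, server, database, and DWU level are placeholders.

# Sketch: scale, pause, and resume an Azure SQL Data Warehouse database.
# Resource group, server, database, and service objective are placeholders.

# Scale compute up or down by changing the service objective (DWU level).
Set-AzureRmSqlDatabase -ResourceGroupName "DemoRG" `
                       -ServerName "demosqlserver123" `
                       -DatabaseName "DemoDW" `
                       -RequestedServiceObjectiveName "DW400"

# Pause compute entirely; you keep paying only for storage while it is paused.
Suspend-AzureRmSqlDatabase -ResourceGroupName "DemoRG" `
                           -ServerName "demosqlserver123" `
                           -DatabaseName "DemoDW"

# Resume compute when you need it again.
Resume-AzureRmSqlDatabase -ResourceGroupName "DemoRG" `
                          -ServerName "demosqlserver123" `
                          -DatabaseName "DemoDW"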
Warner also has a brand new Pluralsight course on the topic.
The Linked Service for ML is going to need some information from the Web Service: the URL and the API key. Chances are neither of these has been committed to memory, so open up Azure ML, go to Web Services, and copy them. For the URL, look under the API Help Page grid, where there are two options, Request/Response and Batch Execution. Clicking on Batch Execution loads a new page, Batch Execution API Document. The URL can be found under Request URI. When copying the URL, you do not need to include any text after the word “jobs”; leave off the rest of the URL, “?api-version=2.0”, as copying the entire URL will cause an error. Going back to the Web Services page, the API Key appears on the dashboard section of Azure ML, and there is a convenient button for copying it. Using these two pieces of information, it is now possible to create the Data Factory Linked Service to make the connection to the web service, which here I called AzureMLLinkedService.
Read the whole thing.
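For reference, a hedged sketch of what that linked service definition and its deployment might look like for Data Factory v1: the endpoint URL, API key, data factory, and resource group are placeholders, and the JSON property names should be verified against the linked walkthrough.

# Sketch: register the Azure ML batch endpoint as a Data Factory (v1) linked service.
# Endpoint URL, API key, data factory, and resource group below are placeholders.
$definition = @"
{
  "name": "AzureMLLinkedService",
  "properties": {
    "type": "AzureML",
    "typeProperties": {
      "mlEndpoint": "https://ussouthcentral.services.azureml.net/workspaces/<workspace-id>/services/<service-id>/jobs",
      "apiKey": "<your-api-key>"
    }
  }
}
"@

Set-Content -Path ".\AzureMLLinkedService.json" -Value $definition

New-AzureRmDataFactoryLinkedService -ResourceGroupName "DemoRG" `
                                    -DataFactoryName "DemoDataFactory" `
                                    -Name "AzureMLLinkedService" `
                                    -File ".\AzureMLLinkedService.json"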