Press "Enter" to skip to content

Category: Cloud

Migrating To Azure SQL Database

Tim Radney walks us through steps to migrate an on-prem database to Azure SQL Database:

When planning to migrate on-premises databases to V12, the size of the database is a huge factor in how long the migration will take. The export of the database, the transfer of the data, and the import will all increase in proportion to the size of the database.

Another big factor in the restore/import time when moving your databases to V12 is the performance tier you are restoring to. The restore/import process requires a lot of horsepower, so to help expedite your migration, you should consider restoring to a higher performance tier. When the database is online, you can easily and quickly drop down to a lesser tier that meets your daily performance needs. Being able to change performance tiers with a few mouse clicks is one of the big benefits of Azure SQL Database.

There are some design considerations for moving to Azure SQL Database, and once those are covered, Tim’s article helps with the actual migration process.
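
If you want to script that import-at-a-high-tier, scale-down-later pattern, here is a minimal sketch using the AzureRM PowerShell module that was current when these posts were written. Every name, key, and tier choice below is a placeholder, not something from Tim’s article:

# Import the bacpac at a high performance tier to speed up the import;
# all names and credentials here are placeholders.
$adminPassword = Read-Host -AsSecureString "SQL admin password"
$import = New-AzureRmSqlDatabaseImport `
    -ResourceGroupName "MyResourceGroup" `
    -ServerName "myazureserver" `
    -DatabaseName "MigratedDb" `
    -Edition Premium `
    -ServiceObjectiveName "P6" `
    -DatabaseMaxSizeBytes 250GB `
    -StorageKeyType StorageAccessKey `
    -StorageKey (Read-Host "Storage account key") `
    -StorageUri "https://mystorage.blob.core.windows.net/bacpacs/MigratedDb.bacpac" `
    -AdministratorLogin "sqladmin" `
    -AdministratorLoginPassword $adminPassword

# Wait for the import to finish.
while ((Get-AzureRmSqlDatabaseImportExportStatus `
        -OperationStatusLink $import.OperationStatusLink).Status -eq "InProgress") {
    Start-Sleep -Seconds 30
}

# Once the database is online, drop down to the tier that matches
# day-to-day needs.
Set-AzureRmSqlDatabase `
    -ResourceGroupName "MyResourceGroup" `
    -ServerName "myazureserver" `
    -DatabaseName "MigratedDb" `
    -Edition Standard `
    -RequestedServiceObjectiveName "S2"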

Moving To The Cloud

Denny Cherry explains some of the important indicators that you might benefit from moving to a cloud provider:

The reality is that not all workloads are a right fit for the cloud. If you are running highly sustained workloads, then the cloud probably isn’t the right solution for your environment. The systems which work best in the cloud are the ones which can be converted to use the various PaaS services, not just the IaaS services. The PaaS services will usually provide a more cost-effective hosting solution, especially for workloads which vary over time; for example, ramping up during the day and becoming almost idle overnight.

Even if running in a PaaS environment isn’t an option, running in an IaaS environment may still be cost-effective. It all depends on how bursty the workload is that you plan on moving to the cloud.

There are some good points here; check it out.

Restoring An Azure SQL Database

I have a post on restoring a database in Azure SQL Database:

You will need to select your restore point as well. In this case, I decided to restore back to midnight UTC on a particular date. Note that the dates are UTC rather than your local time zone!

After selecting your restore point, you pick the target server and can decide a couple of things.  First, you can put this database into an elastic database pool, which makes cross-database connections a lot easier.  Second, you can choose a different pricing tier.  Because I only needed this database for a few minutes, keeping it at P2 Premium was fine; the total restore time meant that we spent less than a dollar bringing the data back to its pristine condition.

Be aware of the time for restoration; it can be very slow.
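
If you would rather script the restore than click through the portal, here is a sketch using the AzureRM PowerShell module; the server, database, and resource group names and the restore point are placeholders. Note that -PointInTime is in UTC, just like the portal’s dates:

# Grab the source database so we can hand its ResourceId to the restore.
$db = Get-AzureRmSqlDatabase `
    -ResourceGroupName "MyResourceGroup" `
    -ServerName "myazureserver" `
    -DatabaseName "ImportantDb"

# Restore to midnight UTC on a particular date, into a new database.
# Use -ElasticPoolName instead of -Edition/-ServiceObjectiveName if the
# restored copy should land in an elastic pool.
Restore-AzureRmSqlDatabase `
    -FromPointInTimeBackup `
    -PointInTime ([DateTime]::Parse("2016-11-01T00:00:00Z").ToUniversalTime()) `
    -ResourceGroupName $db.ResourceGroupName `
    -ServerName $db.ServerName `
    -ResourceId $db.ResourceId `
    -TargetDatabaseName "ImportantDb_Restored" `
    -Edition Premium `
    -ServiceObjectiveName "P2"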

Regular Expressions In Lucene

Kendra Little looks at searching with Azure Search:

I wanted to be able to find all architect jobs using something like ‘%rchit%’ as well, because there aren’t a lot of great ways to do this in SQL Server.

In SQL Server, you can use a traditional B-Tree index to seek, but only based on the letters at the beginning of a character column.  If I want to know every business title that contains ‘%rchit%’, I’m going to have to scan an entire index.

SQL Server fulltext indexes don’t solve the double-wildcard problem, either. Fulltext indexes support word prefix searches, so a fulltext index would be great at finding all job titles that contain a word that starts with ‘Arch%’.

Sometimes that’s enough. But a lot of times, you do need to find a substring anywhere in a word. And sometimes you do want to offload that from your database.

This is the kind of problem Lucene (and the technologies built on top of it, like Elasticsearch) was designed to solve.  Read on for more details as Kendra solves the problem in Azure Search.
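
To make the contrast concrete, here is a sketch of the double-wildcard search as an Azure Search REST call from PowerShell. queryType=full turns on the Lucene query syntax, in which /.*rchit.*/ is a regular expression matched against indexed terms. The service name, index name, and BusinessTitle field are placeholders standing in for whatever your own index defines:

# Placeholder service and index; the query key comes from an
# environment variable rather than being hard-coded.
$service = "myservice"
$index   = "jobs"
$headers = @{ "api-key" = $env:AZURE_SEARCH_KEY }

# queryType=full enables Lucene syntax; /.*rchit.*/ is a term-level regex.
$uri = "https://$service.search.windows.net/indexes/$index/docs" +
       "?api-version=2016-09-01&queryType=full&search=/.*rchit.*/"

(Invoke-RestMethod -Uri $uri -Headers $headers -Method Get).value |
    Select-Object -ExpandProperty BusinessTitle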

Analysis Services In Azure

Chris Webb looks at SSAS in Azure:

Support for multidimensional models will be considered for a future release, based on customer demand.

I’m pretty sure there will be plenty of demand for Multidimensional support given the installed base that’s out there.

I hope so.  Lack of Multidimensional support isn’t a deal-killer, but it’s a deal-harmer.

Stretch Database Authentication Failures

Jack Li walks through a bug in Stretch Database:

The message provided enough direction: it says either you have a bad login, or the firewall setting on the Azure DB server side is not configured correctly. The very first thing was to ensure the firewall was configured correctly. We even tried 0.0.0.0 to 255.255.255.255, but it didn’t resolve the issue.

Next, we created a brand new database on the same server and tried on that one. It worked. But the customer just couldn’t get the old database to work, even after she made sure that she could use the login/password to log in to the Azure DB server using SSMS from the same server.

On the same server, the brand new database worked but the old database didn’t. That made me wonder what happens if I manually cause a failure and later retry.

Read on for the repro and solution.
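
As an aside, the wide-open firewall rule from the troubleshooting steps above looks like this in the AzureRM module (resource group and server names are placeholders); a range that broad only makes sense as a temporary diagnostic step:

# Allow every IPv4 address through the server firewall (remove this
# rule once troubleshooting is done).
New-AzureRmSqlServerFirewallRule `
    -ResourceGroupName "MyResourceGroup" `
    -ServerName "myazureserver" `
    -FirewallRuleName "TroubleshootingAllowAll" `
    -StartIpAddress "0.0.0.0" `
    -EndIpAddress "255.255.255.255"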

Azure Data Lake Updates

Michael Rys has the October updates for Azure Data Lake:

We seem to be just cranking out new stuff :). Here are the October 2016 Updates for Azure Data Lake U-SQL!

The main takeaway is that the October refresh has now removed the old, deprecated syntax for the items we have announced over the last couple of release notes!

Thanks to those who volunteered to test the new version of the more scalable file set. Please contact us if you want to try it and help us validate it.

Click through for the release notes.

Azure Data Lake Analytics Units

Yan Li explains the Azure Data Lake Analytics Unit:

An Azure Data Lake Analytics Unit, or AU, is a unit of computation resources made available to your U-SQL job. Each AU gives your job access to a set of underlying resources like CPU and memory. Currently, an AU is the equivalent of 2 CPU cores and 6 GB of RAM. As we see how people want to use the service, we may change the definition of an AU or add more options for controlling CPU and memory usage.

How AUs are used during U-SQL Query Execution

When you submit a U-SQL script for execution, the U-SQL compiler parallelizes the U-SQL script into hundreds or even thousands of tasks called vertices. Each vertex is allocated to one AU. The AU is dynamically allocated to the task and released once that particular task is completed.

I appreciate the ADL team’s transparency in how they define a unit.  It’s much nicer to be able to tell someone that an AU is 2 CPU cores + 6 GB of RAM, rather than saying it’s some fuzzy measure of CPU + memory + I/O which has no direct bearing on your operations.
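
To see where AUs show up in practice, here is a sketch of submitting a U-SQL job with an explicit AU count through the AzureRM module; the account name, job name, and script path are placeholders. -DegreeOfParallelism is the number of AUs assigned to the job, which caps how many vertices can run at once:

# Submit a U-SQL script with 10 AUs: under the definition above, that is
# 20 CPU cores and 60 GB of RAM available across the job's vertices.
Submit-AzureRmDataLakeAnalyticsJob `
    -Account "myadlaaccount" `
    -Name "Monthly aggregation" `
    -ScriptPath "C:\usql\aggregate.usql" `
    -DegreeOfParallelism 10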

Database Throughput Units

Randolph West looks at the Azure Database Throughput Unit Calculator:

The DTU Calculator, a third-party service created by Justin Henriksen (a Microsoft employee), will calculate the DTU requirements for our on-premises database that we want to migrate to Azure, by first capturing a few performance monitor counters and then performing a calculation on those results to recommend a service tier for our database.

Justin provides a command-line application or PowerShell script to capture these performance counters:

  • Processor – % Processor Time

  • Logical Disk – Disk Reads/sec

  • Logical Disk – Disk Writes/sec

  • Database – Log Bytes Flushed/sec

For more details on DTUs, John Sterrett looks at the math.
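
The calculator ships its own capture script, but if you want to collect the same four counters by hand, a Get-Counter sketch looks like this. The _Total instances, sample window, and output path are assumptions; note that a named SQL Server instance exposes its counters under MSSQL$InstanceName rather than SQLServer:

# The four counters the DTU Calculator expects, against the default
# SQL Server instance.
$counters = @(
    '\Processor(_Total)\% Processor Time',
    '\LogicalDisk(_Total)\Disk Reads/sec',
    '\LogicalDisk(_Total)\Disk Writes/sec',
    '\SQLServer:Databases(_Total)\Log Bytes Flushed/sec'
)

# Sample once per second for an hour, then save the results to CSV for
# analysis.
Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    Export-Counter -Path 'C:\temp\dtu-counters.csv' -FileFormat CSV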
