Press "Enter" to skip to content

Category: Cloud

BACPACs In Azure

Mike Fal uses Powershell to import and export Azure SQL Database data:

What makes this black voodoo magic work? Is this some proprietary technique Microsoft has snuck in on us? Surprisingly, this is a bit of technology that has existed for some time now as part of SQL Server Data Tools, called BACPACs. A BACPAC is essentially a logical backup of a database, storing the schema and data as SQL statements.

This differs from a typical SQL Server backup, which stores your database pages directly in a binary format. Because of this, native backups are smaller and can be made/restored faster. However, they are more rigid, as you can only restore a native backup in specific scenarios. A logical backup, since it is a series of SQL statements, can be more flexible.

Mike’s going to follow up with a way to take advantage of this to migrate normal SQL Server databases, so that should be interesting as well.
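
For reference, an export along these lines can be scripted with the AzureRM cmdlets. This is a minimal sketch, assuming the AzureRM.Sql module and an existing storage account; every name, key, and URI below is a placeholder:

```powershell
# A minimal sketch of exporting an Azure SQL Database to a BACPAC in blob storage.
# All names, keys, and URIs are placeholder values.
$creds = Get-Credential   # SQL admin login for the Azure SQL server

$export = New-AzureRmSqlDatabaseExport `
    -ResourceGroupName 'MyResourceGroup' `
    -ServerName 'myazuresqlserver' `
    -DatabaseName 'MyDatabase' `
    -StorageKeyType 'StorageAccessKey' `
    -StorageKey '<storage-account-key>' `
    -StorageUri 'https://mystorageacct.blob.core.windows.net/bacpacs/MyDatabase.bacpac' `
    -AdministratorLogin $creds.UserName `
    -AdministratorLoginPassword $creds.Password

# The export runs asynchronously; poll the status link until it finishes.
Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $export.OperationStatusLink
```

The import side works the same way, with New-AzureRmSqlDatabaseImport pointed at the uploaded .bacpac file.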

Building Blocks Of Cortana Intelligence Suite

Melissa Coates has put together a new presentation on the building blocks of the Cortana Intelligence Suite:

Each section will wrap up with an example of the ‘building blocks’ to formulate a solution. Although these ‘building blocks’ examples are greatly simplified, my hope is it will generate ideas for how the different Azure components can fit together for formulating hybrid solutions.

Check it out, as there are a lot of pieces.

Cortana Intelligence Suite

Buck Woody discusses various components of the Cortana Intelligence Suite:

It’s not a simple matter of “choose one from column B and two from column A” – you have to learn the processes, and then the tools, and then think about your situation. In other words, some things are complicated because they are…complicated. However:

There are some things you can consider out of the box. So I spoke with my friend Romit Girdhar while we were co-teaching in London last week, and he put together a great visualization. You can see them here, and download the PDF below. Thanks, Romit!

And of course they had to change the name—it wouldn’t be a Microsoft product if the name didn’t change every six months…

Threat Modeling

Michael Howard discusses threat modeling in Azure:

Many conversations I have with customers go like this:

Customer: “We cannot deploy on Azure until we know that appropriate defenses are in place.”

Me: “I agree 100%, so let’s build a threat model for the proposed design and see what you need to do, and what Microsoft provides.”

A couple of days pass as we build and iterate on the threat model.

Now here’s when the customer has an “a-ha” moment. At the end of the process we have a list of defenses for each part of the architecture and we all agree that the defenses are correct and appropriate.

It’s at that point the customer realizes that they can deploy a cloud-based solution securely.

My tongue-in-cheek response: of course a customer can deploy a cloud-based solution securely if they have Michael Howard walking them through it. Michael does include some links on Azure security configuration and threat modeling resources, so check those out.

Deploying To Azure SQL Database

Julie Smith shows how to deploy a database (AdventureWorksDW) out to Azure:

This is telling us that four of the tables in the sample do not have clustered indexes. Azure SQL Database insists on a clustered index for every table. So without warranty, here is a script that I used to refactor my on-prem AdventureWorksDW2014 database. After making these fixes, I was able to deploy to Azure SQL DB from SSMS with no errors.

Julie includes the script she used to fix up AdventureWorksDW.
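
If you want to spot those heaps in your own copy before deploying, a quick check like this works. A minimal sketch, assuming the SQLPS/SqlServer module is loaded; the server and database names are placeholders:

```powershell
# List tables without a clustered index (heaps), which Azure SQL Database will reject.
# Server and database names are placeholder values.
$query = @"
SELECT s.name AS SchemaName, t.name AS TableName
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
JOIN sys.indexes AS i ON i.object_id = t.object_id
WHERE i.type = 0; -- index type 0 = heap
"@

Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'AdventureWorksDW2014' -Query $query
```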

Dealing With DocumentDB

Rolf Tesmer has an example of using DocumentDB:

How do you work out what DocumentDB will cost?  

Well, 1 RU corresponds to a GET of a 1KB document.  You then need to know the specifics of your workload such as number of documents, avg document size, how the documents are accessed (query patterns), and how often documents are updated.

I suggest loading your data and running your standard queries in the Azure Portal (see below) to see what RUs your workload generates, and use that in your calculations.

Rolf also has some comparisons to SQL Server queries and some hints with integrating SQL Server with DocumentDB.
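
To make that concrete, here is a back-of-the-envelope version of the calculation. Every number is a made-up workload assumption (including the write-cost multiplier), not DocumentDB's actual pricing formula:

```powershell
# Rough RU/s estimate, starting from the rule of thumb that 1 RU ~ a GET of a 1 KB document.
# All figures below are placeholder assumptions about the workload, not real measurements.
$avgDocSizeKB    = 4      # average document size
$readsPerSecond  = 100    # expected read rate
$writesPerSecond = 20     # expected write rate
$writeCostFactor = 5      # assumption: a write costs several times a same-size read

$readRU  = $readsPerSecond  * $avgDocSizeKB
$writeRU = $writesPerSecond * $avgDocSizeKB * $writeCostFactor

"Provision roughly {0} RU/s for this workload" -f ($readRU + $writeRU)
```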

Restoring Azure SQL Databases With Powershell

Mike Fal shows us how to restore an Azure SQL Database database using Powershell:

The most fundamental form of disaster recovery is database backups and restores. Typically, setting up backups is a lot of work. DBAs need to make sure there’s enough storage available for backups, create schedules that accommodate business operations and support RTOs and RPOs, and implement jobs that execute backups according to those schedules. On top of that, there is all the work that has to be done when backups fail, along with making sure disk capacity is always large enough. It is a huge investment that must be made, but a necessary one, as losing a database can spell death for a company.

This is one of the HUGE strengths of Azure SQL Database. Since it is a service offering, Microsoft has already built out the backup infrastructure for you. All that stuff we talked about in the previous paragraph? If you use Azure SQL Database, you do not have to do any of it. At all.

What DBAs still need to manage is being able to restore databases if something happens. This is where Powershell comes into play. While we can definitely perform these actions using the portal, it involves a lot of clicking and navigation. I would much rather run a single command to perform my restore.

The Powershell cmdlets are easy to use, so spin up an instance and give it a try.
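
That single command looks roughly like this. A minimal sketch with the AzureRM.Sql cmdlets; the resource group, server, and database names are placeholders:

```powershell
# Point-in-time restore of an Azure SQL Database to a new database.
# All names below are placeholder values.
$db = Get-AzureRmSqlDatabase -ResourceGroupName 'MyResourceGroup' `
    -ServerName 'myazuresqlserver' -DatabaseName 'MyDatabase'

Restore-AzureRmSqlDatabase -FromPointInTimeBackup `
    -PointInTime (Get-Date).AddHours(-2) `
    -ResourceGroupName $db.ResourceGroupName `
    -ServerName $db.ServerName `
    -ResourceId $db.ResourceId `
    -TargetDatabaseName 'MyDatabase_Restored' `
    -Edition $db.Edition `
    -ServiceObjectiveName $db.CurrentServiceObjectiveName
```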

Connecting To SQL Data Warehouse

Robert Sheldon looks at ways to connect to Azure SQL Data Warehouse:

Unlike SSMS, Microsoft does support connecting to SQL Data Warehouse from Visual Studio, via the database engine features in SSDT. When you get into the Visual Studio/SSDT environment, open SQL Server Object Explorer, which is similar to Object Explorer in SSMS. From there, click the Add SQL Server button.

When the Connect dialog box appears, provide the server name, select SQL Server Authentication, and then specify the login name and password, as shown in the following figure.

It is a bit surprising that you can’t easily connect via SSMS 2014. Maybe that’s changed with SSMS 2016?
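
If SSMS won’t cooperate, scripting the connection is another workaround. A minimal sketch using Invoke-Sqlcmd with the SQL authentication login from the walkthrough; the server, database, and credentials are placeholders:

```powershell
# Query SQL Data Warehouse from Powershell instead of SSDT.
# Server, database, and credentials are placeholder values.
$creds = Get-Credential   # the SQL authentication login from the walkthrough

Invoke-Sqlcmd -ServerInstance 'mydwserver.database.windows.net' `
    -Database 'MyDataWarehouse' `
    -Username $creds.UserName `
    -Password $creds.GetNetworkCredential().Password `
    -Query 'SELECT name FROM sys.tables;'
```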

Azure SQL Database Q&A

Julie Koesmarno has a Q&A on Azure SQL Database:

Q: Is there going to be down time when I scale up/down? What’s going to happen to my existing connections?

Extracted from Change the service tier and performance level (pricing tier) of a SQL database:

Note that changing the service tier and/or performance level of a database creates a replica of the original database at the new performance level, and then switches connections over to the replica. No data is lost during this process, but during the brief moment when we switch over to the replica, connections to the database are disabled, so some transactions in flight may be rolled back. This window varies, but is on average under 4 seconds, and in more than 99% of cases is less than 30 seconds. Very infrequently, especially if there are large numbers of transactions in flight at the moment connections are disabled, this window may be longer.

The duration of the entire scale-up process depends on both the size and service tier of the database before and after the change. For example, a 250 GB database that is changing to, from, or within a Standard service tier, should complete within 6 hours. For a database of the same size that is changing performance levels within the Premium service tier, it should complete within 3 hours.

A video by Joe Idziorek on service tiers and how to scale up and down using the Azure Portal is available here.

Read the whole thing. There are some great questions and answers in this set.
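
For what it’s worth, the scale-up/scale-down operation the Q&A describes is a single cmdlet call from Powershell. A minimal sketch; the names and target tier are placeholders:

```powershell
# Scale an Azure SQL Database to a new service tier and performance level.
# Resource group, server, and database names are placeholder values.
Set-AzureRmSqlDatabase -ResourceGroupName 'MyResourceGroup' `
    -ServerName 'myazuresqlserver' `
    -DatabaseName 'MyDatabase' `
    -Edition 'Standard' `
    -RequestedServiceObjectiveName 'S2'
```

Expect the brief reconnect window the quoted documentation describes once the switch to the replica happens.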

Microsoft Atop Hadoop Cloud Solutions

Forrester has named Microsoft a leader in the Hadoop cloud solutions space:

This week, we’re excited that Forrester recognized Microsoft Azure as a leader in their Big Data Hadoop Cloud Solutions. Apache Hadoop as a technology has become popular amongst organizations to unlock insights from data of all size, shape, and speed. Hadoop powers solutions that help businesses improve their performance, educators better connect with the needs of their students, medical professionals improve the quality of their care, and researchers accelerate new advancements in science.

As an example, Ultra Tendency uses Hadoop to achieve something not possible before: visualizing more than 27 million distinct sensor readings to give Japanese citizens accurate, up-to-date information about the radiation contamination from the Fukushima nuclear plant meltdown. More and more organizations are also deploying Hadoop in the cloud, with 47% of Forrester’s respondents to a 2015 survey increasing their cloud deployments either by 5-10% (37%) or more than 10% (10%). This makes sense because the cloud allows you to scale elastically on demand to handle the processing of any amount of data.

AWS and IBM also have very good solutions, and Google is trying to get a stronger foothold in the cloud game.
