Press "Enter" to skip to content

Category: Cloud

Cool Storage

James Serra talks about “cool storage” in Azure Blob Storage:

The access tiers available for blob storage accounts are “hot” and “cool”. In general, hot data is classified as data that is accessed very frequently and needs to be highly durable and available. On the other hand, cool data is data that is infrequently accessed and long-lived. Cool data can tolerate a slightly lower availability, but still requires high durability and similar time-to-access and throughput characteristics as hot data. For cool data, a slightly lower availability SLA and higher access costs are acceptable tradeoffs for much lower storage costs. Azure Blob storage now addresses this need for differentiated storage tiers for data with different access patterns and pricing models. So you can now choose between Cool and Hot access tiers to store your less frequently accessed cool data at a lower storage cost, and store more frequently accessed hot data at a lower access cost. The Access Tier attribute of Hot or Cool is set at the account level and applies to all objects in that account. So if you want to have both a hot access tier and a cool access tier, you will need two accounts. If there is a change in the usage pattern of your data, you can also switch between these access tiers at any time.

It looks like there shouldn’t be a performance difference between the two; it’s more of a cost difference, where you might be able to save money by choosing your tier wisely.
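If you want to try this out, here’s a minimal PowerShell sketch along those lines. The account and resource group names are placeholders, and I’m assuming the AzureRM storage cmdlets:

```powershell
# Assumption: AzureRM module installed and you're already logged in
# via Login-AzureRmAccount. Blob storage accounts take an -AccessTier
# parameter at creation time.
New-AzureRmStorageAccount -ResourceGroupName "MyResourceGroup" `
    -Name "mycoolblobaccount" `
    -Location "East US" `
    -SkuName Standard_LRS `
    -Kind BlobStorage `
    -AccessTier Cool

# If the data heats up later, switch the whole account to the Hot tier.
Set-AzureRmStorageAccount -ResourceGroupName "MyResourceGroup" `
    -Name "mycoolblobaccount" `
    -AccessTier Hot
```

Keep in mind that the tier applies account-wide, so segregating hot and cool data really does mean two accounts.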


Azure SQL Database Management With PowerShell

Mike Fal shows a few administration steps with Azure SQL Database, including resetting an admin password:

Walking through this, we just need to create a secure string for our password and then use the Set-AzureRmSqlServer cmdlet, passing the secure string to the -SqlAdministratorPassword argument. Easy as that, and we don’t even need to know what the previous password was. With this in mind, I also want to call out that you can only change the password and not the admin login name. While this is not such a big deal, be aware that once you have an admin login name, you are stuck with it.
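Here’s a minimal sketch of what that looks like, with hypothetical server and resource group names:

```powershell
# Build a secure string for the new password; note there's no parameter
# for the old password -- you don't need to know it.
$newPassword = ConvertTo-SecureString "Sup3rS3cret!" -AsPlainText -Force

# Reset the admin password on the logical server.
Set-AzureRmSqlServer -ResourceGroupName "MyResourceGroup" `
    -ServerName "myazuresqlserver" `
    -SqlAdministratorPassword $newPassword
```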

Mike promises that his next blog post won’t take a month to publish.  Here’s hoping he’s right.


Using Flow + Power BI For Reporting

John White combines Microsoft Flow with Power BI to generate real-time reports:

I may blog about that solution in the future, but with the Future of SharePoint event rapidly approaching, my BI Focal fellow collaborator, Jason Himmelstein, convinced me that there was something more interesting we could do with this. How about near-real-time monitoring of Twitter conversations for the event? All of the pieces were in place.

We rolled up our sleeves, and in relatively short order, had a solution. Jason has written about the experience on his SharePoint Longhorn blog, and he has included the videos that we put together, so I can be a little less detailed in this post.

Today I learned about Microsoft Flow.  Looks a lot like Zapier.


Recalculating Days

Brian Mitchell shows how to re-calculate prior days in Azure Data Lake using partitioning:

The question is: what is the right time period to use? The answer: it depends on the size of your partitions. Generally, for managed tables in U-SQL, you want to target about 1 GB per partition. So if you are bringing in, say, 800 MB per day, then daily partitions are about right. If instead you are bringing in 20 GB per day, you should look at hourly partitions of the data.

In this post, I’d like to take a look at two common scenarios that people run into. The first is a full re-compute of a partition’s data and the second is a partial re-compute of a partition. The examples I will be using are based off of the U-SQL Ambulance Demos on GitHub and will be added to the solution for ease of consumption.

The ability to reprocess data is vital in any ETL or ELT process.
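To make the single-partition re-compute concrete, here’s a hedged sketch of re-processing one day: truncate the affected daily partition, then reload it from the raw file. The table, account, and paths are all hypothetical (Brian’s real examples live in the Ambulance demo solution), and I’m submitting the U-SQL from PowerShell:

```powershell
# Hypothetical U-SQL script; assumes dbo.TripData is partitioned on its
# date column (entry) and that the partition for this day already exists.
$usql = @'
DECLARE @day DateTime = DateTime.Parse("2016-05-01");

// Clear only the affected daily partition; the rest of the table is untouched.
ALTER TABLE dbo.TripData TRUNCATE PARTITION (@day);

// Reload that single day from the raw extract.
@rows =
    EXTRACT vehicle int,
            entry   DateTime,
            note    string
    FROM "/raw/tripdata/2016-05-01.csv"
    USING Extractors.Csv();

INSERT INTO dbo.TripData
SELECT vehicle, entry, note
FROM @rows;
'@

# Submit the job to a (placeholder) Data Lake Analytics account.
Submit-AzureRmDataLakeAnalyticsJob -Account "myadlaaccount" `
    -Name "Recompute 2016-05-01" `
    -Script $usql `
    -DegreeOfParallelism 2
```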


DocumentDB Updates

Jen Stirrup brings news on DocumentDB updates:

Let’s have a look at the biggest news: DocumentDB now has protocol support for MongoDB. From the industry perspective, this is great news. MongoDB is one of the most easily-recognised NoSQL databases. Now that DocumentDB can be a great supporting act for MongoDB, it means that architects have a broader range of tools to support business needs in the enterprise data architecture, whilst increasing business capability using the Azure cloud. How does it work? Well, DocumentDB supports MongoDB at the wire protocol level, which means that existing MongoDB drivers will function against DocumentDB. For IT departments, it means that enterprises can persist data in DocumentDB behind the scenes. With it, it brings the reliability, stability and resilience of the Azure cloud. It also means that these technologies are accessible for small to medium enterprises in a way that they can afford, backed up with the stability and support from Microsoft Azure that they may find difficult to service on their own.

DocumentDB has a long way to go before it catches up to MongoDB, but these are improvements that close the gap a little bit.


Azure Data Catalog Test Environment

Melissa Coates shows how to create a test environment for Azure Data Catalog:

Azure Data Catalog is a Software as a Service (SaaS) offering in Azure, part of the Cortana Intelligence Suite, for registering metadata about data sources. Check this post for an overview of Azure Data Catalog key features. (I’m a big fan of what Azure Data Catalog is trying to accomplish.)

There are a couple of particulars about Azure Data Catalog which make it a bit more difficult to set up a Demo/Test/Learning type of environment, including:

  • You are required to sign into Azure Data Catalog with an organizational account. Signing in with a Microsoft account (formerly known as a Live account) won’t work for Azure Data Catalog authentication, even if that’s what you normally use for Azure.

  • One Azure Data Catalog may be created per organization. Note this is *not* per Azure subscription – if your account has access to multiple subscriptions, it’s still one catalog per organization.

This method seems, shall we say, overly complicated. Here’s hoping that the Azure Data Catalog team finds a way to improve this experience.


Azure SQL Database Pricing

James Serra explains Azure SQL Database pricing:

DTUs are explained here. To help, there is an Azure SQL Database DTU Calculator. This calculator will help you determine the number of DTUs being used by your existing on-prem SQL Server database(s), as well as give a recommendation of the minimum performance level and service tier you need before you migrate to Azure SQL Database. It does this by using performance monitor counters.

After you use a SQL Database for a while, you can use a pricing tier recommendation tool to determine the best service tier to switch to.  It does this by assessing historical resource usage for a SQL database.
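For the curious, the calculator’s inputs are standard perfmon counters. The calculator ships its own collection script, but a rough Get-Counter sketch of that kind of capture looks like this (counter paths assume a default SQL Server instance):

```powershell
# CPU, disk, and log-flush counters of the sort the DTU Calculator analyzes.
$counters = @(
    "\Processor(_Total)\% Processor Time",
    "\LogicalDisk(_Total)\Disk Reads/sec",
    "\LogicalDisk(_Total)\Disk Writes/sec",
    "\SQLServer:Databases(_Total)\Log Bytes Flushed/sec"
)

# Sample once per second for an hour, then save to CSV for analysis.
Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    Export-Counter -Path "C:\temp\sql-perf.csv" -FileFormat CSV
```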

For further information, check out this interesting article from a few months ago on V12 performance by Chris Bailiss.


IT Pro Cloud Essentials Program

Grant Fritchey talks about a new Microsoft program:

You need to start working on adding Azure knowledge to your skill set. If you have access to an MSDN license, getting into Azure is easy because of the credits available. However, not everyone works for a company that provides MSDN or has purchased a license. In that case, getting into Azure, just for testing and learning could be expensive (I frequently “spend” $150/month with my MSDN credits). However, Microsoft is very serious about getting everyone moved into this space. They’ve launched a new free program called IT Pro Cloud Essentials. Not only does it give you Azure credit, but you also get access to O365, another set of skills and understanding you need to acquire.

Also check out their Visual Studio Dev Essentials program. Its Azure credit is only $25 a month, but it offers SQL Server 2014 Developer Edition (and will offer 2016).


Azure SQL Data Warehouse

Robert Sheldon has a getting started guide for Azure SQL Data Warehouse:

In my Simple-Talk article Azure SQL Data Warehouse, I introduced you to SQL Data Warehouse and gave you an overview of the architecture and technologies that drive the service and make it all work. In this article, I go a step further and provide details about getting started with SQL Data Warehouse, demonstrating how to add a sample database and then access the server and database settings.

If you want to follow along with my examples and try out SQL Data Warehouse for yourself, you must have an active Azure subscription, even if it’s just the free trial. For those who have already used up their free trial, be aware that SQL Data Warehouse is a pay-as-you-go service, even though it’s still in preview, so unless you’re on an unlimited company budget or happen to have accrued MSDN credits, you’ll want to be judicious in how you try out the service. Fortunately, as you’ll see in this article, you can pause the compute resources when not in use, helping to minimize the costs associated with learning about the service.

This article is all about initial installation and configuration.
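The pause capability Robert mentions is worth highlighting, since it’s the main cost lever while you learn. A sketch with placeholder names:

```powershell
# Pause compute; storage is still billed while paused, but the DWU
# (compute) charge stops.
Suspend-AzureRmSqlDatabase -ResourceGroupName "MyResourceGroup" `
    -ServerName "mydwserver" `
    -DatabaseName "MySampleDW"

# Resume compute when you want to run queries again.
Resume-AzureRmSqlDatabase -ResourceGroupName "MyResourceGroup" `
    -ServerName "mydwserver" `
    -DatabaseName "MySampleDW"
```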


WhoIsActive For Azure

Adam Machanic has a new version of sp_whoisactive specifically for Azure SQL Database:

So I set about looking for a workaround. This week I think I’ve finally managed to get something working that approximates the number I need from that view, ms_ticks.

Attached is sp_whoisactive v11.112 — Azure Special Edition v2. Please give it a shot, and I am especially interested in feedback if you use the @get_task_info = 2 option when running sp_whoisactive. That is the main use case that’s impacted by the lack of ms_ticks information and my attempt at a workaround.

If you’re using on-prem SQL Server, this doesn’t add anything new, but if you’re on Azure SQL Database, give it a try.
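If you want to kick the tires from PowerShell, something like this should do it (server, database, and credentials are placeholders, and sp_WhoIsActive must already be installed in that database):

```powershell
# Run the new build against an Azure SQL Database, using the option
# Adam specifically wants feedback on.
Invoke-Sqlcmd -ServerInstance "myserver.database.windows.net" `
    -Database "MyAppDb" `
    -Username "admin_login" -Password "Sup3rS3cret!" `
    -Query "EXEC dbo.sp_WhoIsActive @get_task_info = 2;" |
    Format-Table
```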
