Press "Enter" to skip to content

Category: Cloud

Using AZCopy for SQL Backups

John McCormack shows how you can use AZCopy to move SQL Server backups into Azure Storage:

AZCopy is a useful command-line utility for automating the copying of files and folders to Azure Storage Account containers. Specifically, I use AZCopy for SQL backups, but you can use it to copy most types of files to and from Azure.

In this blog post example (which mirrors a real-world requirement I had), the situation is that whilst I need to write my SQL backups to a network share, I also want to push them up to Azure Storage (in a different region) to allow developers quicker downloads/restores. This is why I need to use AZCopy. If I only needed my backups to be written to Azure, I could have used BACKUP TO URL instead.

Read on to see how John did it.
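
To give a flavour of what this looks like in practice, here is a minimal sketch of pushing a folder of backups to a storage container with AZCopy, wrapped in Python; the share path, storage account, container, and SAS token are all placeholders rather than anything from John's setup.

```python
import subprocess

# Placeholder backup share and destination container -- substitute your own,
# and generate a SAS token with write permission on the container.
source = r"\\backupserver\sqlbackups\PROD"
destination = "https://mystorageacct.blob.core.windows.net/sqlbackups?<SAS-token>"

# azcopy copy walks the folder with --recursive; this assumes azcopy (v10) is on the PATH.
subprocess.run(
    ["azcopy", "copy", source, destination, "--recursive"],
    check=True,
)
```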

Changes to Azure SQL Database SLA

Arun Sirpal notes a change to the Azure SQL Database Service Level Agreement:

I am sure many missed the updates to the Azure SQL Database SLA (Service Level Agreement). It used to be 99.99% across all tiers, but split between two different high-availability architectural models. The Basic, Standard and General Purpose tiers had their own model, and the Premium / Business Critical tiers had a different one.

Read on to see the change.

Oracle Data Guard on Azure

Kellyn Pot’vin-Gorman’s worlds continue to collide:

So, as most people know, I’m not a big fan of Oracle RAC (Real Application Clusters). My opinion was that it was often sold for use cases it doesn’t serve (such as HA), and that the resource demands between the nodes, as well as what happens to the remaining nodes when one is evicted, are not in the best interest of most use cases. On the other hand, I LOVE Oracle Data Guard, active or standard, doesn’t matter; the product is great and it’s an awesome option for those migrating their Oracle databases to Azure VMs.

Read on to see what Oracle Data Guard is and where you might use it.

Notebooks in Azure Databricks

Brad Llewellyn takes us through Azure Databricks notebooks:

Azure Databricks Notebooks support four programming languages: Python, Scala, SQL and R. However, selecting a language in this drop-down doesn’t limit us to only using that language. Instead, it sets the default language of the notebook. Every code block in the notebook is run independently, and we can manually specify the language for each code block.

Before we get to the actual coding, we need to attach our new notebook to an existing cluster. As we said, notebooks are nothing more than an interface for interactive code. The processing is all done on the underlying cluster.

Read on to see how heavily Databricks leans on the notebook metaphor in the way you interact with it.
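
As a quick illustration of the per-cell language switching Brad describes, here is a minimal sketch from a notebook whose default language is Python; the temp view name is made up for the example.

```python
# Cell 1 -- runs in the notebook's default language (Python here).
df = spark.range(10).withColumnRenamed("id", "n")
df.createOrReplaceTempView("numbers")  # hypothetical temp view name

# Cell 2 -- a magic command on the first line overrides the language for that cell only:
# %sql
# SELECT n * n AS n_squared FROM numbers

# Cell 3 -- likewise, %scala, %r, or %md switch a single cell to Scala, R, or Markdown.
```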

Logging in Azure

Rolf Tesmer has a detailed post covering how and what to log when using Azure for a modern data warehouse:

In my view, what often doesn’t get enough attention up front are the critical aspects of monitoring, auditing and availability. Thankfully, these are generally not too difficult to plug in at any point in the delivery cycle, but, as with most things in the cloud, there are just so many different options to consider!

So the purpose of this blog is to focus on the key areas of Azure Services Monitoring and Auditing for the Azure Modern Data Platform architecture.

Click through for examples from a number of different Azure services.

Azure SQL Database Hyperscale

Jeroen ter Heerdt explains the basics behind Azure SQL Database Hyperscale:

Connecting to your Hyperscale database is exactly the same as any other Azure SQL or SQL Server database – for example, you can use SQL Server Management Studio or Azure Data Studio. That is exactly the point. Hyperscale provides capabilities not found in other cloud databases, such as scale and query performance, but it has no impact on your applications. If you do an in-place migration of a SQL database that is already on another service tier to Hyperscale, the connection string would not even change. Your application does not have to be changed, and it will benefit from bigger scale and improved query performance. And who does not like extra scale and query performance?

This post covers the basics of database creation but does not go much further into it than that. Jeroen does promise a multi-part series, however.
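
To illustrate the "nothing changes for the application" point, here is a minimal connection sketch using pyodbc; the server, database, and credentials are placeholders, and the same connection string works whether the database sits on Hyperscale or any other service tier.

```python
import pyodbc

# Placeholder server, database, and credentials -- nothing here is Hyperscale-specific.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;"
    "Uid=myuser;Pwd=mypassword;"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)

with conn.cursor() as cur:
    cur.execute("SELECT @@VERSION;")
    print(cur.fetchone()[0])
```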

Creating Azure SQL Elastic Jobs

Arun Sirpal takes us through Elastic Jobs against Azure SQL Databases:

The purpose of an Elastic Job is to execute a T-SQL script that is scheduled or executed ad hoc against a group of Azure SQL databases. Targets can be in different SQL Database servers, subscriptions, and/or regions. This blog post is quite long and heavy (code-wise), so grab a coffee and follow through.

The architecture you could follow is shown below.

All of the code is in PowerShell, and Arun talks us through it.
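
Arun's walkthrough is in PowerShell, but for a rough feel of the objects involved, the job agent exposes stored procedures in a jobs schema inside the job database (jobs.sp_add_target_group, jobs.sp_add_job, and so on). The sketch below calls them from Python against a hypothetical job database; credential and identity options are deliberately left out, so treat it as an outline rather than a working deployment.

```python
import pyodbc

# Hypothetical connection to the Elastic Job agent's own job database.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:myjobserver.database.windows.net,1433;"
    "Database=jobdatabase;Uid=jobadmin;Pwd=mypassword;Encrypt=yes;"
)
conn.autocommit = True
cur = conn.cursor()

# 1. A target group defines which servers/databases the job runs against.
cur.execute("EXEC jobs.sp_add_target_group @target_group_name = N'DemoGroup';")
cur.execute("""
    EXEC jobs.sp_add_target_group_member
         @target_group_name = N'DemoGroup',
         @target_type       = N'SqlDatabase',
         @server_name       = N'mytargetserver.database.windows.net',
         @database_name     = N'mytargetdb';
""")

# 2. The job itself, plus a T-SQL step to run on every target
#    (credential/identity parameters omitted -- check the docs for your agent version).
cur.execute("EXEC jobs.sp_add_job @job_name = N'NightlyIndexMaintenance';")
cur.execute("""
    EXEC jobs.sp_add_jobstep
         @job_name          = N'NightlyIndexMaintenance',
         @command           = N'EXEC dbo.IndexOptimize;',  -- hypothetical maintenance proc
         @target_group_name = N'DemoGroup';
""")
```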

Amazon Elasticsearch Alerts

Jon Handler shows how to create alerts for Amazon Elasticsearch Service:

On April 8, Amazon ES launched support for event monitoring and alerting. To use this feature, you work with monitors—scheduled jobs—that have triggers, which are specific conditions that you set, telling the monitor when it should send an alert. An alert is a notification that the triggering condition occurred. When a trigger fires, the monitor takes action, sending a message to your destination.

This post uses a simulated IoT device farm to generate and send data to Amazon ES.

Click through for a demo.
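
As a rough sketch of the monitor/trigger model, the snippet below creates a monitor through the alerting plugin's REST API using Python's requests library; the domain endpoint, index pattern, threshold, and credentials are all placeholders, and no action/destination is attached.

```python
import requests

# Hypothetical Amazon ES domain endpoint -- substitute your own.
endpoint = "https://my-es-domain.us-east-1.es.amazonaws.com"

monitor = {
    "type": "monitor",
    "name": "high-temperature-monitor",
    "enabled": True,
    # Run the search once a minute.
    "schedule": {"period": {"interval": 1, "unit": "MINUTES"}},
    "inputs": [{
        "search": {
            "indices": ["iot-telemetry-*"],  # hypothetical index pattern
            "query": {"size": 0,
                      "query": {"range": {"temperature": {"gte": 40}}}}
        }
    }],
    "triggers": [{
        "name": "too-hot",
        "severity": "1",
        "condition": {"script": {
            "source": "ctx.results[0].hits.total.value > 0",
            "lang": "painless"
        }},
        # A real monitor would attach actions here that point at a destination (SNS, Slack, ...).
        "actions": []
    }]
}

resp = requests.post(
    f"{endpoint}/_opendistro/_alerting/monitors",
    json=monitor,
    auth=("master-user", "master-password"),  # or SigV4 signing, depending on your domain setup
)
resp.raise_for_status()
print(resp.json())
```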

Feeding IoT Data into Delta Lake

Saeed Barghi shows how you can stream sensor data from Azure IoT Hub into Databricks Delta Lake:

IoT devices produce a lot of data very fast. Capturing data from all those devices, which could number in the millions, and managing them is the very first step in building a successful and effective IoT platform.

Like any other data solution, an IoT data platform could be built on-premises or in the cloud. I’m a huge fan of cloud-based solutions, especially PaaS offerings. After doing a little bit of research, I decided to go with Azure since it has the most comprehensive and easy-to-use set of service offerings when it comes to IoT, and they are reasonably priced. In this post, I am going to show how to build the architecture displayed in the diagram below: connect your devices to Azure IoT Hub and then ingest records into Databricks Delta Lake as they stream in, using Spark Streaming.

Click through for the instructions.
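
For a feel of the streaming leg of that architecture, here is a minimal Structured Streaming sketch; it assumes the azure-eventhubs-spark connector is installed on the Databricks cluster, and the connection string and Delta/checkpoint paths are placeholders.

```python
from pyspark.sql.functions import col

# Placeholder: the IoT Hub's Event Hub-compatible connection string.
conn_str = "<Event Hub-compatible connection string>"

# The connector expects the connection string to be encrypted with its helper.
eh_conf = {
    "eventhubs.connectionString":
        spark.sparkContext._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(conn_str)
}

raw = (spark.readStream
            .format("eventhubs")
            .options(**eh_conf)
            .load())

# Device payloads arrive in the binary 'body' column; cast to string here
# (parsing the JSON into real columns would be the next step).
messages = raw.select(col("body").cast("string").alias("body"),
                      col("enqueuedTime"))

# Stream into a Delta table; both paths below are made-up examples.
(messages.writeStream
         .format("delta")
         .outputMode("append")
         .option("checkpointLocation", "/delta/iot/_checkpoints/raw")
         .start("/delta/iot/raw"))
```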
