Press "Enter" to skip to content

Category: Cloud

Polygon-Based Spatial Searches with Cosmos DB

Hasan Savran continues a series on spatial data in Cosmos DB:

I want to continue to develop our new map application for Azure Cosmos DB. So far, we can run a custom spatial query in Cosmos DB and display the results on a map. I want my users to create a polygon on the map and search for data under this polygon. If you are familiar with Zillow, that is how their application lets you look for houses to buy or rent. You select an area, and the Zillow application displays all available houses or rentals in the area you draw. It is an extremely useful and user-friendly search.

Click through to see how Hasan does it, as well as getting around a coordinate ordering problem.
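
As a rough sketch of the idea (this is not Hasan's code), here is how such a polygon search might look from Python with the azure-cosmos package; the account, database, container, and coordinates are all placeholders. Note the coordinate ordering wrinkle mentioned above: GeoJSON expects [longitude, latitude] pairs, the reverse of the lat/long ordering most map controls hand you.

```python
# Hypothetical names throughout; assumes the azure-cosmos package.
from azure.cosmos import CosmosClient

client = CosmosClient("https://myaccount.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("GeoDb").get_container_client("Listings")

# A user-drawn polygon as GeoJSON: a single ring whose first and last
# vertices must be identical, with each vertex as [longitude, latitude].
polygon = {
    "type": "Polygon",
    "coordinates": [[
        [-86.80, 36.10], [-86.70, 36.10], [-86.70, 36.20],
        [-86.80, 36.20], [-86.80, 36.10],
    ]],
}

# ST_WITHIN returns documents whose location falls inside the polygon.
query = "SELECT c.id, c.location FROM c WHERE ST_WITHIN(c.location, @polygon)"
for item in container.query_items(
    query=query,
    parameters=[{"name": "@polygon", "value": polygon}],
    enable_cross_partition_query=True,
):
    print(item["id"], item["location"])
```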

Accessing Azure Queue Storage from R

Hong Ooi announces a new package for R:

This post is to announce that the AzureQstor package is now on GitHub. AzureQstor provides an R interface to Azure queue storage, building on the facilities provided by AzureStor.

Queue Storage is a service for storing large numbers of messages, for example from automated sensors, that can be accessed remotely via authenticated calls using HTTP or HTTPS. A single queue message can be up to 64 KB in size, and a queue can contain millions of messages, up to the total capacity limit of a storage account. Queue storage is often used to create a backlog of work to process asynchronously.

Hong includes a couple of demos as well, so check it out.
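
The package itself is R, but the enqueue/dequeue pattern is the same in any SDK. Here is a minimal sketch of that backlog-of-work pattern in Python with the azure-storage-queue package (the connection string and queue name are placeholders), just to make the concept concrete:

```python
# A minimal producer/consumer sketch; assumes the azure-storage-queue package.
from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string("<connection-string>", "work-items")
queue.create_queue()  # raises ResourceExistsError if the queue already exists

# Producer side: push a message (up to 64 KB) onto the backlog.
queue.send_message("process-sensor-batch-42")

# Consumer side: pull messages and delete each one once it has been handled.
for msg in queue.receive_messages():
    print("working on:", msg.content)
    queue.delete_message(msg)
```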

Displaying Cosmos DB Spatial Data with .NET Core

Hasan Savran builds up a quick .NET Core app to retrieve spatial data from Cosmos DB and display it:

Cosmos DB stores geospatial data in GeoJSON format. You cannot tell what raw GeoJSON represents because usually all it has is a type and a bunch of coordinates. Azure Cosmos DB does not have any UI to show you what GeoJSON data looks like on a map either. The only options you have are a third-party tool which might display the data on a map, or Azure Cosmos DB Jupyter Notebooks.

I want to run a query in Azure Cosmos DB and see the results on a map. I decided to create a simple UI which displays spatial data on a map. I will show you how to do this step by step. I will use LeafLetJs as the map. It is open source and free! Also, I need to create a .NET Core 3.1 web application and use the Azure Cosmos DB Emulator for data.

Hasan walks us through the demo and promises to put the code in GitHub later.
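
Hasan's app is .NET Core, but the shape of the work is the same in any language: query Cosmos DB, wrap the stored GeoJSON geometries in a FeatureCollection, and hand that to the map control. Here is a hedged Python sketch of that middle step (database, container, and property names are illustrative, and the emulator key is elided):

```python
# Illustrative names; assumes the azure-cosmos package and a local emulator.
import json
from azure.cosmos import CosmosClient

# The Cosmos DB Emulator listens on localhost:8081 with a well-known key.
client = CosmosClient("https://localhost:8081/", credential="<emulator-key>")
container = client.get_database_client("GeoDb").get_container_client("Places")

# Wrap each stored GeoJSON geometry in a Feature so a map can render it.
features = [
    {"type": "Feature", "geometry": doc["location"], "properties": {"id": doc["id"]}}
    for doc in container.query_items(
        query="SELECT c.id, c.location FROM c",
        enable_cross_partition_query=True,
    )
]

# Serve this JSON to the page; in the browser: L.geoJSON(data).addTo(map)
print(json.dumps({"type": "FeatureCollection", "features": features}))
```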

Getting Started with MySQL in Azure

Chris Hyde tries out Azure’s MySQL Platform-as-a-Service offering:

I started out by setting up a dedicated resource group to use for my instance, and then used the Azure Portal GUI to create a new instance named mysql-20200505. I made sure to downgrade from the default General Purpose configuration to Basic, so it will only cost me about $67 a month if I leave it running instead of around $350. After the instance was created successfully I then added some connection security rules to ensure that only my IP was able to connect to it.

I then opened up MySQL Workbench to connect to the server as pictured below. Of course, it took me two tries to connect, as I made my usual error of not including the machine name in the username field the first time around.

Click through for Chris’s early tests.
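
That username gotcha deserves a sketch: on Azure Database for MySQL (single server), the login must be given as user@servername. Here is a hedged example in Python with the PyMySQL package, where the server name, credentials, and certificate path are placeholders:

```python
# Placeholders throughout; assumes the PyMySQL package.
import pymysql

conn = pymysql.connect(
    host="mysql-20200505.mysql.database.azure.com",
    user="admin_user@mysql-20200505",  # user@servername, not just the user
    password="<password>",
    ssl={"ca": "<path-to-ca-certificate>"},  # Azure enforces SSL by default
)
with conn.cursor() as cur:
    cur.execute("SELECT VERSION()")
    print(cur.fetchone())
conn.close()
```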

Query Acceleration for Blob Storage and Data Lake Gen2

James Serra takes us through Query Acceleration for Azure Blob Storage and Azure Data Lake Storage Gen2:

Just announced is Query Acceleration for Azure Data Lake Storage Gen2 (ADLS) as well as Blob Storage. This is a new capability for ADLS that enables applications and analytics frameworks to dramatically optimize data processing by retrieving only the data that they require to perform a given operation from storage. This reduces the time and processing power that is required to query stored data.

For example, if an application executes a SELECT statement that filters columns and rows from a CSV file, instead of pulling the entire CSV file over the network into the application and then filtering the data, the filtering is done at the time the data is read from the disk, so that only the filtered data is transferred over the network to the application. So if you have a CSV file with 50 columns and 1 million rows, but the filters limit the data to 5 columns and 1,000 rows, then only those 5 columns and 1,000 rows will be retrieved from the disk and sent over the network to the application.

Click through to learn more, including current libraries which support this and information on the additional cost. I’d really like to see PolyBase support this, as it would alleviate one of the problems with using Blob Storage + PolyBase: the need to pull all of that data down to your SQL Server instance before doing any filtering.
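
For a concrete picture, here is a hedged sketch of query acceleration from Python with the azure-storage-blob SDK; the connection string, container, blob, and column names are placeholders. The SELECT is evaluated by the storage service itself, so only the matching columns and rows ever cross the network:

```python
# Placeholders throughout; assumes a recent azure-storage-blob package.
from azure.storage.blob import BlobClient, DelimitedTextDialect

blob = BlobClient.from_connection_string(
    "<connection-string>", container_name="data", blob_name="measurements.csv"
)

# Describe the CSV layout on both the input and output sides of the query.
dialect = DelimitedTextDialect(delimiter=",", quotechar='"', has_header=True)
reader = blob.query_blob(
    "SELECT device_id, reading FROM BlobStorage WHERE reading > 100",
    blob_format=dialect,
    output_format=dialect,
)
print(reader.readall().decode("utf-8"))  # only the filtered data arrives
```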

Workload Classification with Resource Governor in Azure Synapse Analytics

Niko Neugebauer keys in on an interesting addition to Azure Synapse Analytics:

Given that we can specify 5 different parameters (USER MEMBERNAME, ROLE MEMBERNAME, WLM_LABEL, WLM_CONTEXT, START_TIME/END_TIME) – there must be a prioritisation mechanism in order to decide which condition gets selected. This mechanism is called Parameter Weighting in Azure Synapse and it assigns the following weight to each of those parameters:
USER = 64
ROLE = 32
WLM_LABEL = 16
WLM_CONTEXT = 8
START_TIME/END_TIME = 4
meaning that if a Workload Classifier matches on START_TIME/END_TIME, WLM_LABEL, and ROLE, it will receive 52 points (4 + 16 + 32),
while a different Workload Classifier that matches on WLM_CONTEXT and USER will get 72 points (8 + 64), and thus will prevail and be selected over the first Workload Classifier.

Azure Synapse Analytics (including when it was known as SQL Data Warehouse) has had some resource governor-related things I’ve wanted in the box product for a while, including labels (which are better than using application name).
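
The arithmetic is simple enough to sanity-check. Here is a quick worked example of the parameter weighting in plain Python, mirroring Niko's two classifiers:

```python
# Fixed weights per classifier parameter, as listed in the quote above.
WEIGHTS = {
    "USER": 64,
    "ROLE": 32,
    "WLM_LABEL": 16,
    "WLM_CONTEXT": 8,
    "START_TIME/END_TIME": 4,
}

def classifier_score(params):
    """Sum the weights of the parameters a classifier matches on."""
    return sum(WEIGHTS[p] for p in params)

a = classifier_score(["START_TIME/END_TIME", "WLM_LABEL", "ROLE"])  # 4+16+32 = 52
b = classifier_score(["WLM_CONTEXT", "USER"])                       # 8+64 = 72
print(a, b)  # 52 72, so the second classifier prevails
```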

Replicating SQL’s IN Operator with Azure Data Factory

Rayis Imayev shows how we can find values in a group using Azure Data Factory:

However, only this use case for the OR function with two conditions is possible: or(equals(variables('var1'), 'A'), equals(variables('var1'), 'B')) – a limit of two conditions.

But what if we had the ability to check whether a particular variable/parameter/other ADF object value belongs to a range of values (an array of values), similar to what we can do with the IN operator in SQL? That would definitely solve our problem and remove the limitation on the number of logical conditions to check.

Click through for the answer.
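
Without claiming this is Rayis's exact solution, one common way to get IN-style behavior in ADF expressions is contains() over an array built with createArray(). The Python snippet below just assembles the two expression strings, the nested-or workaround and the array-membership alternative, for comparison:

```python
# Build both ADF expression strings for comparison; the values are examples.
values = ["A", "B", "C", "D"]

# Workaround: or() takes exactly two arguments, so longer lists must nest.
expr = f"equals(variables('var1'), '{values[0]}')"
for v in values[1:]:
    expr = f"or({expr}, equals(variables('var1'), '{v}'))"
print(expr)

# IN-style alternative: test membership in an array built with createArray().
array_args = ", ".join(f"'{v}'" for v in values)
print(f"contains(createArray({array_args}), variables('var1'))")
```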

Using Cognitive Services in Power BI without a Premium Subscription

Marc Lelijveld and Kathrin Borchert show how we can take advantage of Cognitive Services and Power BI without having to pay for Power BI Premium:

Recently, I was presenting my session about AI Capabilities for Power BI to make AI Accessible for Everyone at the Virtual Power BI Days Hamburg, a great event organized by Kathrin Borchert. Part of my session was about the Artificial Intelligence capabilities offered as part of Power BI Premium. A day later, Kathrin came up with a great idea of how you can leverage these AI capabilities without the need for Power BI Premium.

I was immediately enthusiastic about that idea, since I had thought about this in the past as well. Back then, there were some blockers, which have since been sorted. I asked Kathrin if she was open to co-authoring this blog post and she immediately agreed.

Click through for the technique. Basically, it’s a trade-off between simplicity and cost.
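
The gist, as I understand it, is that you call a Cognitive Services endpoint directly with your own pay-as-you-go key instead of relying on the Premium-only AI features. In Power BI that call would live in a Power Query function; here is a hedged plain-Python sketch of the same REST request, where the resource name, key, and text are placeholders:

```python
# Placeholders throughout; calls the Text Analytics v3.0 sentiment endpoint.
import json
import urllib.request

endpoint = "https://<your-resource>.cognitiveservices.azure.com"
payload = {"documents": [{"id": "1", "language": "en", "text": "Power BI is great"}]}

req = urllib.request.Request(
    endpoint + "/text/analytics/v3.0/sentiment",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Ocp-Apim-Subscription-Key": "<your-key>",
        "Content-Type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)
print(result["documents"][0]["sentiment"])
```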

Accessing Managed Instances from SSMS

James Serra shows us what we need to do in order to reach an Azure SQL Managed Instance from SQL Server Management Studio:

It used to be that the only way to use SQL Server Management Studio (SSMS) against Azure SQL Database Managed Instance (SQLMI) was to create a VM on the same VNET as SQLMI and use SSMS on that VM. That VM was usually called a jumpbox (see instructions here).

But about a year ago Microsoft added a way to use SSMS without using a VNET (announcement) by allowing you to enable a public endpoint for your SQLMI. This made it easy for me to access a SQLMI database on my laptop.

That change enables what James shows us.
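
One detail worth knowing if you go the public-endpoint route: it listens on port 3342 with a ".public." host name, rather than the VNET-local default of 1433. A hedged connection sketch in Python with pyodbc, where the server name and credentials are placeholders:

```python
# Placeholders throughout; assumes pyodbc and ODBC Driver 17 for SQL Server.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mymi.public.abc123def456.database.windows.net,3342;"  # note port 3342
    "DATABASE=master;UID=admin_user;PWD=<password>;Encrypt=yes;"
)
print(conn.execute("SELECT @@VERSION").fetchone()[0])
```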

Azure SQL Database Tiers

Tim Radney enumerates the tiers available to us with Azure SQL Database:

When Azure SQL Database first launched, there was a single pricing option known as “DTUs” or Database Transaction Units. (Andy Mallon, @AMtwo, explains DTUs in “What the heck is a DTU?“) The DTU model provides three tiers of service: basic, standard, and premium. The basic tier provides up to 5 DTUs with standard storage. The standard tier supports from 10 up to 3000 DTUs with standard storage, and the premium tier supports from 125 up to 4000 DTUs with premium storage, which is orders of magnitude faster than standard storage.

But there have been several additions since then and Tim lays it out for us.
