
Category: Cloud

Time Series Insights in Azure

Aveek Das explains the notion of Azure Time Series Insights:

In this article, we are going to learn in detail about Azure Time Series Insights. Microsoft Azure is one of the leading cloud providers, and with so many companies adopting or migrating to the cloud, it has become a usual trend to convert existing technologies into cloud-based services and consume them. This not only helps companies reduce their costs but also allows them to focus on business problems rather than concentrating on infrastructure.

Azure Time Series Insights is a cloud service for integrating with data that constantly changes over time, such as data from sensors or machines, satellites, airlines, etc. Any data that is generated at high scale and needs to be analysed can be worked with through Azure Time Series Insights. In this article, we will focus on a high-level introduction of this service along with some use cases in detail.

Read on for the article.
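
If you want an environment to experiment with while you read, you can create one from PowerShell. This is a minimal sketch, assuming the Az.TimeSeriesInsights module; the resource group and environment names are hypothetical placeholders:

```powershell
# Minimal sketch: stand up a Gen1 Time Series Insights environment.
# Assumes the Az and Az.TimeSeriesInsights modules and an existing
# Connect-AzAccount session; all names are hypothetical placeholders.
Install-Module Az.TimeSeriesInsights -Scope CurrentUser

New-AzResourceGroup -Name 'rg-tsi-demo' -Location 'eastus'

New-AzTimeSeriesInsightsEnvironment `
    -ResourceGroupName 'rg-tsi-demo' `
    -Name 'tsi-demo-env' `
    -Kind Gen1 `
    -Sku S1 `
    -Location 'eastus' `
    -DataRetentionTime (New-TimeSpan -Days 30)
```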

Defect Detection with AWS Lookout and SageMaker

Matthew Rhodes, et al., take us through an interesting case study:

According to a recent study, defective products cost industries over $2 billion from 2012–2017. Defect detection within manufacturing is an important business use case, especially in high-value product industries like the automotive industry. It allows for early diagnosis of anomalies, which improves production line efficacy and product quality and saves capital costs. Although advanced anomaly detection systems employ sensors as well as Internet of Things (IoT) devices to collect multimodal data to improve performance, computer vision continues to be a common approach. Detecting anomalies in automotive parts and components with computer vision can be done using normal images, and even X-ray-based images for structural damage. Recent advances in deep learning and computer vision have allowed scientists and manufacturers to develop enhanced anomaly detection systems, including surface defect detection on automotive body panels and dent detection in vehicles.

Read on for case notes.

Automating Semantic Versioning with Azure DevOps

Dave Ruijter shows how you can use Azure DevOps to perform automatic semantic versioning:

I am a fan of using semantic versioning (a.k.a. SemVer) for data solutions, following the v1.0.0 pattern. It helps in the communication between team members and stakeholders by limiting ambiguity and misunderstandings related to the version of your solution’s releases. With semantic versioning, the trick is to increment the version according to the changes you have made since the latest release. Manually keeping track of that is not an easy task, especially for small teams without the capacity to have somebody dedicated to this administration task. I found a way to make this a lot easier, leaning on the Pull Request description! And as a bonus, we will create some nice release notes automatically.

Click through to see what you need to have set up on your Azure DevOps subscription and a detailed walkthrough of how to set it up.
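
The core trick is parsing a bump keyword out of the Pull Request description and incrementing the right part of the version. As a rough illustration of that logic (not Dave's exact pipeline; the '+semver:' keyword convention here is my own placeholder), the increment step might look like this:

```powershell
# Illustrative sketch: derive the next semantic version from a bump
# keyword embedded in a PR description. The '+semver: major/minor/patch'
# convention is hypothetical, not Dave's actual implementation.
function Get-NextSemVer {
    param(
        [string]$CurrentVersion,          # e.g. 'v1.4.2'
        [string]$PullRequestDescription
    )

    $v = [version]$CurrentVersion.TrimStart('v')

    $next = switch -Regex ($PullRequestDescription) {
        '\+semver:\s*major' { [version]::new($v.Major + 1, 0, 0); break }
        '\+semver:\s*minor' { [version]::new($v.Major, $v.Minor + 1, 0); break }
        default             { [version]::new($v.Major, $v.Minor, $v.Build + 1) }
    }

    "v$next"
}

Get-NextSemVer -CurrentVersion 'v1.4.2' `
    -PullRequestDescription 'Adds dataset refresh logic. +semver: minor'
# -> v1.5.0
```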

Switching Connections from AAS to Power BI

Marc Lelijveld wants to swap a connection from using Azure Analysis Services to Power BI Premium:

In the context of an Azure Analysis Services dataset that is migrated to Power BI Premium, you might have to rebind many reports, especially if this dataset is positioned as a managed dataset that is also used for self-service purposes and has many related reports.

In this blog I will elaborate on how you can easily rebind all these reports to the new Power BI dataset, without downloading all of the reports and manually rebinding them.

It’s not a trivial operation, but it is a lot easier than updating each entry individually.
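
If you prefer to script the rebinding, the Power BI REST API exposes a Rebind endpoint for reports. Here is a hedged sketch using the MicrosoftPowerBIMgmt module; the workspace name and dataset ID are placeholders, and this is one way to do it rather than necessarily Marc's exact method:

```powershell
# Sketch: rebind every report in a workspace to a new dataset through
# the Power BI REST API. Assumes the MicrosoftPowerBIMgmt module; the
# workspace name and dataset ID are hypothetical placeholders.
Connect-PowerBIServiceAccount

$workspace    = Get-PowerBIWorkspace -Name 'Sales Analytics'
$newDatasetId = '00000000-0000-0000-0000-000000000000'

foreach ($report in Get-PowerBIReport -WorkspaceId $workspace.Id) {
    Invoke-PowerBIRestMethod -Method Post `
        -Url "groups/$($workspace.Id)/reports/$($report.Id)/Rebind" `
        -Body (@{ datasetId = $newDatasetId } | ConvertTo-Json)
}
```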

Using PowerShell in Azure Cloud Shell

Hope Foley shows how you can set up PowerShell to be your Azure Cloud Shell language of choice:

Part of my job is doing POCs with customers to help with Azure Data Services.  Anything that helps me move quicker is helpful, so I’m a tad bit obsessed with automating things.  I have used PowerShell for more years than I’m willing to admit to help me automate what I can.  There are a lot of ways to automate things like ARM templates and DevOps, but PoSH has been my preferred hammer.  As much as I love it, I’ve run into issues sometimes with installing modules locally on folks’ machines, not to mention if they have a Mac.  I wondered recently if Azure Cloud Shell would help make things easier, and it very much did, and I’m super pumped to share!  This post will help run through how to get set up to run PowerShell scripts in Azure.

For people who prefer PowerShell to bash, check it out.
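
The draw is that Cloud Shell ships with the Az modules preinstalled and the session already authenticated, so there is nothing to install on anybody's machine. A small taste of what a session looks like (resource and subscription names are placeholders):

```powershell
# Inside Azure Cloud Shell (PowerShell), the Az modules come preinstalled
# and the session is already authenticated -- no Connect-AzAccount needed.
Get-AzContext                                  # confirm the active subscription

# Switch subscriptions if you have several (name is a placeholder)
Set-AzContext -Subscription 'My Dev Subscription'

# Spin up a resource group for a POC
New-AzResourceGroup -Name 'rg-poc-demo' -Location 'eastus'
```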

Using Azure Queue Storage as a Trigger for Function Apps

Aveek Das shows how you can use Azure Queue Storage as a way to trigger an Azure Function App:

In this article, we are going to learn how to trigger Function Apps from Queue Storage in Azure. Function Apps have been among the most popular cloud services in Microsoft Azure. Function Apps allow users to write code in any of several languages and then execute that code in the cloud. There is no infrastructure to manage, which makes them very flexible for writing and building applications on the go. Every Function App can be triggered in multiple ways, for example, by calling the function URL using an HTTP endpoint or from other functions in Azure. In this article, we are going to trigger the Function App from Queue Storage in Azure and see how to pass a message from the queue to the Function App.

Queue Storage is another Azure service, one which allows users to store large numbers of messages. Users can use a queue to create a list of items that need to be processed one by one. Messages can be added to Queue Storage using HTTP or HTTPS endpoints. A single queue message can be up to 64 KB in size, and a queue can hold millions of messages, up to the total capacity limit of the storage account.

Click through to see how. Though now I wonder why I might use Queue Storage instead of an Event Hub or an Event Grid. But I suppose that’s a question for a different article.
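
As an aside, Function Apps speak PowerShell as well, and in that model a queue trigger is just a binding plus a script. A minimal sketch (the queue and binding names are placeholders):

```powershell
# run.ps1 -- minimal sketch of a queue-triggered PowerShell function.
# The matching function.json binding (queue name is a placeholder) is:
#   { "bindings": [ { "name": "QueueItem", "type": "queueTrigger",
#       "direction": "in", "queueName": "myqueue",
#       "connection": "AzureWebJobsStorage" } ] }
param($QueueItem, $TriggerMetadata)

# $QueueItem holds the dequeued message body; $TriggerMetadata carries
# details such as InsertionTime and DequeueCount.
Write-Host "Processing queue message: $QueueItem"
Write-Host "Dequeue count: $($TriggerMetadata.DequeueCount)"
```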

From Azure Data Factory to Synapse Pipelines

Kevin Chant copies and pastes:

In this post I want to share an alternative way to copy an Azure Data Factory pipeline to Synapse Studio, because I think it can be useful.

For those who are not aware, Synapse Studio is the frontend that comes with Azure Synapse Analytics. You can find out more about it in another post I did, which was a five-minute crash course about Synapse Studio.

By the end of this post, you will know one way to copy objects used for an Azure Data Factory pipeline to Synapse Studio, which works as long as both are configured to use Git.

Click through to see how.
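
The underlying idea is that both services serialize their artifacts as JSON files in their Git repositories, so moving a pipeline is largely a matter of copying files between repos. A rough sketch of that step (the local clone paths are placeholders, and this is not necessarily Kevin's exact process):

```powershell
# Rough sketch: copy pipeline JSON definitions from a Data Factory Git
# repo into a Synapse workspace Git repo. The local clone paths are
# hypothetical placeholders; commit and push the Synapse repo afterwards.
$adfRepo     = 'C:\repos\my-data-factory'
$synapseRepo = 'C:\repos\my-synapse-workspace'

# Both services keep pipeline definitions as JSON under a 'pipeline' folder
Copy-Item -Path (Join-Path $adfRepo 'pipeline\*.json') `
          -Destination (Join-Path $synapseRepo 'pipeline') -Force

# Remember to bring across any dependent artifacts the pipeline references,
# such as linked services and datasets, before committing.
```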

Automating Notebook Execution with PowerShell

Julie Koesmarno shows off an automation process for notebooks:

When I first think about automation, I generally think in the following way: in order to automate a script, we want to ensure that the script itself can be run via a command line interface (CLI) and with almost no user interaction (except for input and output parameters). Now, how do we apply this to Jupyter Notebooks so that we can automate SQL notebooks or PowerShell Notebooks?

The good news is that these SQL notebooks and PowerShell notebooks that we’ve created using Azure Data Studio can be run from the PowerShell CLI. If these notebooks can be run from the PowerShell CLI, that means any automation system or serverless architecture (Azure Automation combined with Azure Logic Apps, as an example) should be able to run these notebooks also.

In this blog post, I’ll cover examples of using Invoke-SqlNotebook and Invoke-ExecuteNotebook, and putting it all together with Azure Automation.

Click through to see the whole thing.
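
Both cmdlets are easy to try locally before wiring them into Azure Automation: Invoke-SqlNotebook ships with the SqlServer module and Invoke-ExecuteNotebook comes from the PowerShellNotebook module. A quick sketch (server, database, and file names are placeholders):

```powershell
# Sketch: execute notebooks from the PowerShell CLI. Invoke-SqlNotebook is
# in the SqlServer module; Invoke-ExecuteNotebook is in the
# PowerShellNotebook module. All names below are placeholders.
Install-Module SqlServer, PowerShellNotebook -Scope CurrentUser

# Run a SQL notebook against an instance and save the executed copy
Invoke-SqlNotebook -ServerInstance 'localhost' -Database 'AdventureWorks' `
    -InputFile '.\checks.ipynb' -OutputFile '.\checks-out.ipynb'

# Run a PowerShell notebook and capture the results as a new notebook
Invoke-ExecuteNotebook -InputNotebook '.\maintenance.ipynb' `
    -OutputNotebook '.\maintenance-out.ipynb'
```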
