Press "Enter" to skip to content

Category: Cloud

Translation in Power BI via Cognitive Services

Leila Etaati gets lost in translation:

It is possible to call Cognitive Services for translation inside Power Query.

I will use this to translate 3,000 rows of data about people arrested in Iran for protesting. This information contains city name, full name, and other statements.

In this article, I will show how to call Cognitive Services for translation, create a proper JSON call, and finally use it inside Power Query.

Read on for the translation in Power Query, specifically from Farsi to English.
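The article builds the call in Power Query's M language, but the underlying REST request is easy to see in any language. Here is a minimal Python sketch of the same Translator v3 call (the key and region values are placeholders):

    import requests

    TRANSLATOR_KEY = "<your-cognitive-services-key>"    # placeholder
    TRANSLATOR_REGION = "<your-resource-region>"        # placeholder, e.g. "westeurope"
    ENDPOINT = "https://api.cognitive.microsofttranslator.com/translate"

    def translate_farsi_to_english(text):
        # The same JSON body the article constructs in Power Query: a list of
        # {"Text": ...} records, translated from Farsi (fa) to English (en).
        params = {"api-version": "3.0", "from": "fa", "to": "en"}
        headers = {
            "Ocp-Apim-Subscription-Key": TRANSLATOR_KEY,
            "Ocp-Apim-Subscription-Region": TRANSLATOR_REGION,
            "Content-Type": "application/json",
        }
        response = requests.post(ENDPOINT, params=params, headers=headers,
                                 json=[{"Text": text}])
        response.raise_for_status()
        return response.json()[0]["translations"][0]["text"]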


Using Managed Identities with Azure Functions

Dennes Torres takes us through the proper use of managed identities:

Let’s talk about authentication between Azure Functions and resources used by Azure Functions and conclude with many poorly documented secrets about how to use User Assigned Managed Identity. When we build Azure functions, they usually need to authenticate against other Azure resources: Azure SQL Database, Storage Accounts, Service Bus, and many more.

Each of these services has an authentication method that we can call “Meh!”: Azure SQL has SQL Standard Logins, storage accounts have SAS tokens, Service Bus has shared access keys, and so on. These are not the safest methods possible. If the key leaks, you will have a security problem, because anyone with the key will be able to access the resource.

There are multiple solutions for this problem, some of which pass through Key Vault, which is used to store secrets, keys, and passwords. But let’s go directly to the best one: remove the usage of keys altogether.

Read on to learn how.
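To make the keyless pattern concrete, here is a minimal Python sketch (not Dennes's code; the client ID and account URL are placeholders) of a Function reaching a storage account with a user-assigned managed identity via the azure-identity and azure-storage-blob packages:

    from azure.identity import ManagedIdentityCredential
    from azure.storage.blob import BlobServiceClient

    # For a user-assigned identity, pass its client ID explicitly; leaving it
    # out falls back to the system-assigned identity of the Function App.
    credential = ManagedIdentityCredential(client_id="<user-assigned-client-id>")

    # No SAS token or account key anywhere in the code or configuration.
    blob_service = BlobServiceClient(
        account_url="https://<account>.blob.core.windows.net",
        credential=credential,
    )

    for container in blob_service.list_containers():
        print(container.name)

The identity still needs an RBAC role assignment on the storage account (for example, Storage Blob Data Reader); the point is that there is no key to leak.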


Cosmos DB to Data Explorer Synapse Link

Vincent-Philippe Lauzon makes an announcement:

We recently made our new Kusto data connection available in public preview: Cosmos DB to Azure Data Explorer Synapse Link.

This does look like a marketing-heavy announcement, but the short version is that you can ingest data from Cosmos DB into Data Explorer pools via Synapse Link rather than creating your own ETL process. The previous Cosmos DB connector for Synapse Link was tied to a dedicated SQL pool.


Alert Setup with Azure Monitor

Sunil Verma sounds the alarm:

For this instance, we will set up an alert and an action that determine when a virtual machine has been stopped, send out a notification, and could even restart it whenever that condition is met.

1. First, go to the search pane in the Azure portal and search for Monitor. Click on Alerts inside Monitor and create an alert rule. Next, specify a scope for what you want to set the alert up on. On this occasion, I am setting it up for a virtual machine.

Read on to learn more about what Azure Monitor does, as well as the steps to set up an alert and an action.
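If you would rather script it than click through the portal, an activity log alert along these lines should do the same job from the Azure CLI. This is a sketch, assuming an existing action group; every name and ID is a placeholder:

    az monitor activity-log alert create \
        --name vm-stopped-alert \
        --resource-group my-rg \
        --scope "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Compute/virtualMachines/my-vm" \
        --condition "category=Administrative and operationName=Microsoft.Compute/virtualMachines/deallocate/action" \
        --action-group "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/microsoft.insights/actionGroups/my-action-group" \
        --description "Notify when the VM is stopped (deallocated)"

Note that the deallocate operation covers stopping from the portal or CLI; a power-off from inside the guest OS is a different operation.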


Moving Stack Overflow to Azure

Aaron Bertrand gets into the whats and wherefores:

Like many companies, Stack Overflow is trying to get out of the business of running our architecture in our own data centers; instead, we want to offload some of the more mundane parts of system administration to a cloud service offering like Azure.

I’m going to cut to the chase for the purpose of this article and concede we’ve already decided on Azure for the majority of our infrastructure and, most importantly to me, our databases.

Click through to learn what their plan is and why Aaron & co went that particular route.


Script Activity Outputs to ForEach Inputs with ADF

Meagan Longoria links in a script:

In early 2022, Microsoft released a new activity in Azure Data Factory (ADF) called the Script activity. The Script activity allows you to execute one or more SQL statements and receive zero, one, or multiple result sets as the output. This is an advantage over the stored procedure activity that was already available in ADF, as the stored procedure activity doesn’t support using the result set returned from a query in a downstream activity.

However, when I went to find examples of how to reference those result sets, the documentation was lacking. 

Click through as Meagan corrects a gap in documentation.
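For a flavor of the pattern Meagan documents: the Script activity surfaces its query results under output.resultSets, so with a hypothetical Script activity named 'Get Tables' feeding a ForEach, the expressions look roughly like this:

    Items (on the ForEach activity):
        @activity('Get Tables').output.resultSets[0].rows

    Inside the ForEach, referencing a column of the current row:
        @item().TableName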


Structuring Azure ML Projects and using the Terminal

Tomaz Kastrun nears the end of the Azure ML advent. Day 20 covers package requirements and other niceties:

When creating notebooks, it is always a good idea to have the dependencies included, whether it is a particular version of a package, a separate script file, or an installation requirement.

Selecting an environment or kernel can be an issue if it is not correctly initiated with the code. You can also check the kernels with some simple Python code:
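Tomaz's exact snippet is in the post; as a rough stand-in, a kernel check can be as simple as this sketch using the jupyter_client package:

    from jupyter_client.kernelspec import KernelSpecManager

    # List every kernelspec registered on the compute instance,
    # with the directory each one resolves to.
    for name, path in KernelSpecManager().find_kernel_specs().items():
        print(name, "->", path)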

Day 21 looks at the Azure CLI and running code from within a compute instance terminal:

Using the Azure CLI can help you progress faster, automate repetitive tasks, and even use the Git integration for faster and better collaboration.

So we have created a YAML file on Day 20, and we can also use it with the Azure CLI to create an environment.
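As a sketch of that step (the file and resource names are placeholders), the v2 machine-learning extension for the Azure CLI builds the environment from the YAML file:

    az extension add --name ml    # one-time: install the v2 ML extension
    az ml environment create --file environment.yml \
        --workspace-name <workspace-name> --resource-group <resource-group>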
