Press "Enter" to skip to content

Category: Cloud

Using Logic Apps to Send Multiple Attachments

Rayis Imayev has a project:

In my real project, I need to build a Logic App to send email messages with a set of files attached from my Azure Storage Account. I was able to find similar examples from other Power Platform developers; however, they lacked a critical part that I needed: my set of files had to be dynamic, whether 2 files or 102 files, and the Logic App should be able to support this.

So, here, I would like to share my brief journey in creating such an Azure Logic App:

Read on to see how Rayis solved this.
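The heart of the problem is enumerating the files at run time instead of wiring up a fixed number of attachment steps. As a rough sketch of that same idea outside of Logic Apps (not Rayis's solution), here is what it might look like in Python with the azure-storage-blob SDK and smtplib; the connection string, container name, and SMTP details are all made-up placeholders:

```python
# Sketch: build a dynamic set of e-mail attachments from whatever is in
# an Azure Storage container. Connection string, container name, and
# SMTP settings are placeholders, not values from the post.
import smtplib
from email.message import EmailMessage

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("reports")

msg = EmailMessage()
msg["Subject"] = "Daily reports"
msg["From"] = "sender@example.com"
msg["To"] = "recipient@example.com"
msg.set_content("Attached: every file currently in the container.")

# The attachment set is driven by the container's contents: 2 files or 102.
for blob in container.list_blobs():
    data = container.download_blob(blob.name).readall()
    msg.add_attachment(data, maintype="application",
                       subtype="octet-stream", filename=blob.name)

with smtplib.SMTP("smtp.example.com", 587) as smtp:
    smtp.starttls()
    smtp.send_message(msg)
```

In a Logic App, the equivalent of that loop is typically a For each over the List blobs output, appending each Get blob content result to an array variable that feeds the email connector's Attachments field.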


Cost Management Updates in Azure

Michael Flanakin gives us a few updates on Azure billing:

Understanding your cost patterns over time and investigating specific charges often requires drilling into and selecting specific dates. You’ve always been able to select from one day up to one year in cost analysis, but you’ve told us that selecting those dates isn’t as easy as it could be. As we started building out a new platform for analytics and insights, we took this feedback to heart and completely redesigned the date selection. What you see today is an early peek at that.

This month, you’ll find a new option to select a custom date range in the cost analysis preview. You can pick a single month, a range of months, or start and end dates for a range of days, making it easier than ever to fine-tune your reporting to the dates you need. 

The virtue and downfall of cloud systems like AWS and Azure is that they’re very clear about how much things cost, but only if you know exactly which resources something uses. It’s not as simple as “I want to use a database”; there are all of those other charges around data egress, networking, log management, etc., which can add up. Many of those costs are negligible (fortunately), but try walking through a pricing scenario for Azure Synapse Analytics sometime with someone new to the product and figure out at what point that person gives up trying to calculate the cost. My money says right around the time you get to the integration runtime costs.


Managing Azure DevOps via Azure Logic Apps

Stuart Ainsworth has a process:

A big part of my job these days is looking for opportunities to improve workflow. Automation of software is great, but identifying areas to speed up human processes can be incredibly beneficial to value delivery to customers. Here’s a situation I recently figured out how to handle:

1. My SRE team uses a different Azure DevOps project than our development team. This protects the “separation of duties” concept that auditors love, while still letting us transfer items back and forth.
2. The two projects are in the same organization.
3. The two projects use different templates, with different required fields.
4. Our workflow process requires two phases of triage for bugs in the wild: a technical phase (provided by my team), and a business prioritization (provided by our Business Analyst).
5. Moving a card between projects is simple, but there were several manual changes that had to be made:
– Assigning to a Business Analyst (BA)
– Changing the status to Proposed from Active
– Changing the Iteration and Area
– Moving the card.

To automate this, I decided to use Azure Logic Apps.

Read on to see how Stuart did this.
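For a sense of the moving parts, each of those manual changes maps to a JSON Patch operation against the Azure DevOps work item REST API, which is what a Logic App (or any HTTP client) ends up calling. A minimal Python sketch, with the organization, project, work item ID, PAT, and field values as placeholders:

```python
# Sketch: update an Azure DevOps work item's fields through the REST API.
# Org, project, work item id, PAT, and field values are placeholders.
import requests

org, project, work_item_id = "myorg", "SREProject", 1234
url = (f"https://dev.azure.com/{org}/{project}/_apis/wit/"
       f"workitems/{work_item_id}?api-version=6.0")

# One JSON Patch operation per manual change from the list above.
patch = [
    {"op": "add", "path": "/fields/System.AssignedTo", "value": "ba@example.com"},
    {"op": "add", "path": "/fields/System.State", "value": "Proposed"},
    {"op": "add", "path": "/fields/System.IterationPath", "value": f"{project}\\Triage"},
    {"op": "add", "path": "/fields/System.AreaPath", "value": f"{project}\\Intake"},
]

resp = requests.patch(
    url,
    json=patch,
    headers={"Content-Type": "application/json-patch+json"},
    auth=("", "<personal-access-token>"),  # PAT as the basic-auth password
)
resp.raise_for_status()
```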


Storing dbatools as a Package in Azure DevOps

Kevin Chant has a process for us:

In this post I want to cover how you can store the dbatools PowerShell module as a package in Azure DevOps, using the Azure Artifacts service.

I want to share some knowledge about this because I did a demo of it at Malta Data Saturday. By the end of this post you will have a better understanding of Azure Artifacts and a workaround if you encounter a problem publishing a package.

Read on for the process.


Live Extended Events Data with Azure SQL Database

Grant Fritchey is doing it live:

Once you’ve created an Extended Events Session that is output to Azure Storage, you’ve done most of the work. The trick is really simple. Get the Azure Storage account set up with a Container. Create a Shared Access Signature (SAS) with the right permissions (rwl, read, write, list). Get the token from the SAS (it’s a long string). Use it, along with the path to the container, to create a Database Scoped Credential. Create the session using the same path and container that you defined in the Credential. Done. You’ve got an Azure Extended Events session gathering data for you and outputting to a file in Azure Storage.

Now, what I’d like to tell you is that you can open up the Live Data window from SSMS. You can’t.

Grant does give us a workaround that kind of does the trick, but this is an obvious place where some additional developer care would be valuable.
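To make those steps concrete, here is a sketch of the T-SQL Grant describes, driven from Python via pyodbc; the server, login, storage URL, and SAS token are placeholders, and the database needs a master key before the credential can be created:

```python
# Sketch: create an Extended Events session on Azure SQL Database that
# writes to Azure Blob Storage, following the steps quoted above.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=xe_admin;PWD=<password>",
    autocommit=True,  # run the DDL outside an explicit transaction
)

container_url = "https://mystorage.blob.core.windows.net/xe-files"

# 1. A Database Scoped Credential named after the container URL. The
#    secret is the SAS token (rwl permissions) without the leading '?'.
conn.execute(f"""
CREATE DATABASE SCOPED CREDENTIAL [{container_url}]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<sas-token-without-leading-question-mark>';
""")

# 2. The session itself, targeting a file in that same container.
conn.execute(f"""
CREATE EVENT SESSION [TrackBatches] ON DATABASE
ADD EVENT sqlserver.sql_batch_completed
ADD TARGET package0.event_file
    (SET filename = '{container_url}/TrackBatches.xel');
""")

conn.execute("ALTER EVENT SESSION [TrackBatches] ON DATABASE STATE = START;")
```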


Shrinking an Azure SQL Database

Joey D’Antoni wants to take it down a notch:

You will note that I didn’t mention that “your log file grew because of a large index rebuild”. That’s because that is probably roughly (this is a really rough rule of thumb) how big your transaction log needs to be. But, anyway, we’re talking about Azure SQL Database, so you don’t need to worry about your transaction log file. Microsoft takes care of that for you: ‘Unlike data files, Azure SQL Database automatically shrinks log files since that operation does not impact database performance.’

Read on for the twist at the end.
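Because the platform manages the log, any shrinking you do in Azure SQL Database is against data files. A hedged sketch of what that looks like; the server, database, file ID, and target size are placeholders:

```python
# Sketch: inspect file sizes and shrink a data file in Azure SQL Database.
# Server, database, file id, and target size are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=dba_user;PWD=<password>",
    autocommit=True,
)

# Size and free space per file; the log file is Microsoft's problem.
for row in conn.execute("""
    SELECT file_id, name, type_desc,
           size / 128 AS size_mb,
           size / 128 - FILEPROPERTY(name, 'SpaceUsed') / 128 AS free_mb
    FROM sys.database_files;
"""):
    print(row)

# Shrink data file 1 to roughly 10 GB, if there is free space to give back.
conn.execute("DBCC SHRINKFILE (1, 10240);")
```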


Installing Kubernetes on EC2

Praveen Sripati eschews EKS:

There are tons of ways of setting up K8S on AWS. Today we will see one of the easiest ways to get started with K8S on AWS. The good thing is that we will be using the t2.micro instance type, which falls under the AWS free tier. This configuration is good enough to get started with K8S, but not for a production setup. This assumes the reader is familiar with the basic concepts of AWS.

Click through for the process.
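As an illustration of the starting point rather than Praveen's exact steps, spinning up the free-tier instances with boto3 might look like this; the AMI ID, key pair, and security group are placeholders:

```python
# Sketch: launch free-tier t2.micro instances to host a small K8S cluster.
# AMI id, key pair, and security group are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

resp = ec2.run_instances(
    ImageId="ami-xxxxxxxxxxxxxxxxx",  # an Ubuntu AMI for your region
    InstanceType="t2.micro",          # stays inside the AWS free tier
    KeyName="k8s-keypair",
    SecurityGroupIds=["sg-xxxxxxxx"],
    MinCount=2,                       # one control-plane node, one worker
    MaxCount=2,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "k8s-node"}],
    }],
)
print([i["InstanceId"] for i in resp["Instances"]])
```

From there, the K8S-specific installation on each node is what the post walks through.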


EMR Studio Now Generally Available

Shuang Li announces that Amazon EMR Studio is now in GA:

EMR Studio provides fully managed Jupyter notebooks, and tools like Spark UI and YARN Timeline Service to simplify debugging. EMR Studio uses AWS Single Sign-On and allows you to log in directly with your corporate credentials without signing in to the AWS Management Console. You can install custom kernels and libraries, collaborate with peers using code repositories such as GitHub and Bitbucket, and run parameterized notebooks as part of scheduled workflows using orchestration services like Apache Airflow and Amazon Managed Workflows for Apache Airflow (Amazon MWAA).

With EMR Studio, you can run notebook code on Amazon EMR on Amazon Elastic Compute Cloud (Amazon EC2) or Amazon EMR on Amazon Elastic Kubernetes Service (Amazon EKS), and take advantage of the performance-optimized EMR runtime for Apache Spark. You can set up EMR Studio to run applications on existing EMR clusters or create new clusters using CloudFormation templates for Amazon EMR.

Click through for more information.
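If you would rather use the API than a CloudFormation template, creating a cluster that an EMR Studio Workspace can attach to can be sketched with boto3; the release label, roles, subnet, and instance sizes are placeholders:

```python
# Sketch: create an EMR-on-EC2 cluster for an EMR Studio Workspace to
# attach to. Release label, roles, subnet, and sizes are placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

resp = emr.run_job_flow(
    Name="studio-cluster",
    ReleaseLabel="emr-6.3.0",
    # Spark for the notebooks; Livy and Jupyter Enterprise Gateway are
    # what EMR Studio uses to talk to the cluster.
    Applications=[{"Name": "Spark"}, {"Name": "Livy"},
                  {"Name": "JupyterEnterpriseGateway"}],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "Ec2SubnetId": "subnet-xxxxxxxx",
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print(resp["JobFlowId"])
```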


JSON Patching in Cosmos DB

Hasan Savran does some JSON modification:

JSON is quite a common format for data these days. NoSQL databases save data in JSON format. Even relational databases like SQL Server give you the option to save or retrieve data in JSON format. When we need to update a property in a JSON document, we need to update the whole JSON document. This can be a problem, especially if the document is large. This is where the JSON Patch feature comes in. JSON Patch is a format that lets you update a JSON document partially.

You can do more than update a property. You can add, remove, replace, move, or test a value by using this format. Let’s look at its details by using the following JSON document as the source.

Read on to see how patching works with Cosmos DB in particular.
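In the Python SDK (azure-cosmos 4.3 or later), a partial update looks something like the sketch below; the account, key, database, container, item ID, and paths are all placeholders:

```python
# Sketch: partially update a Cosmos DB document with patch operations,
# leaving the rest of the document untouched. All names are placeholders.
from azure.cosmos import CosmosClient

client = CosmosClient("https://myaccount.documents.azure.com:443/", "<account-key>")
container = client.get_database_client("mydb").get_container_client("cars")

operations = [
    {"op": "replace", "path": "/color", "value": "silver"},
    {"op": "add", "path": "/options/-", "value": "sunroof"},  # append to array
    {"op": "remove", "path": "/previousOwner"},
]

updated = container.patch_item(
    item="car-001",
    partition_key="car-001",
    patch_operations=operations,
)
print(updated["color"])
```

Only the listed paths change on the server, which is the whole point when documents get large.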


Extended Events in Azure SQL Database

Grant Fritchey compares and contrasts extended events on-premises to extended events in Azure SQL Database:

I have long advocated for the use of Extended Events. I’ve been posting all sorts of blog posts on how to implement them, how they present unique opportunities for new and interesting data, and how they do so much more than the old trace events yet put less of a load on the system. All of that is true, until we hit Azure SQL Database.

Now, don’t get me wrong, Extended Events are still awesome, amazing and wonderful. It’s just that, Azure SQL Database is going to force us to hop through a few hoops. I want to be up front with these… I’m trying to find the right word here, challenges? Maybe. Frustrations? Yeah, kind of. Limitations? Again, sort of, but not quite. We’ll settle on as neutral a term as possible: differences. For the moment.

Read on for the first part in an ongoing series.
