Press "Enter" to skip to content

Category: Cloud

Optimizing Kafka Infrastructure Costs

Addison Huddy saves some money:

In this first blog, we’re going to run through the infrastructure costs of running Kafka—i.e., compute, storage, networking, and the additional tooling you need to keep Kafka up and running smoothly. We won’t bury the lede—if you’re running Kafka in the cloud across multiple AZs (as most do for high availability), networking likely represents over 50% of your Kafka infrastructure costs. Let’s see how this ends up being the case.

Click through for some thoughts on how to reduce network costs, using AWS as an example.
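
As a rough back-of-envelope illustration (my numbers, not Addison’s): take a three-AZ cluster ingesting 100 MB/s with replication factor 3. Producers reach a leader in another AZ roughly two-thirds of the time, replication ships two full copies across AZ boundaries, and consumers without fetch-from-follower also read cross-AZ roughly two-thirds of the time, so about 3.3x your ingress crosses an AZ line. At the usual $0.01/GB billed on each side of an inter-AZ transfer, 100 MB/s (roughly 260 TB per month) comes to something like $17,000 per month in data transfer alone, before you pay for a single broker or disk.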

Loading WhoIsActive Data on Azure SQL DB

Andrea Allred wants to know who’s doing what on this system:

I needed to collect sp_WhoIsActive output into a table, but the twist was that it’s on my Azure Managed Database, so I had to get creative with how I did it. We needed an Azure Pipeline to run it, but we wanted to record it every minute, and firing a pipeline every minute adds up fast. So we decided that we would kick it off once an hour and have the process wait a minute between runs until the hour ended. Then it would fire again at the top of the next hour and the whole process would repeat.

That’s an interesting way to do it. Another alternative might have been an Azure function app, which you could schedule to run every minute. I think that’d be a lot less expensive than running an Azure Pipeline, and this goes to show you that there are many ways to solve the same problem in Azure.
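
Here’s a minimal T-SQL sketch of the wait-and-fire loop Andrea describes, assuming sp_WhoIsActive is installed; the destination table name is hypothetical, and the table must exist before the loop runs (sp_WhoIsActive’s @return_schema parameter will generate the DDL for you).

```sql
-- A minimal sketch of the hourly wait-and-fire loop. dbo.WhoIsActiveLog is
-- a hypothetical table; generate its DDL first with @return_schema = 1.
DECLARE @StopAt datetime2(0) = DATEADD(HOUR, 1, SYSUTCDATETIME());

WHILE SYSUTCDATETIME() < @StopAt
BEGIN
    EXEC dbo.sp_WhoIsActive
        @destination_table = 'dbo.WhoIsActiveLog';

    WAITFOR DELAY '00:01:00';  -- sample once per minute
END;
```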

Unrolling Multiple Arrays in Azure Data Factory

Mark Kromer puts us in disarray:

ADF and Synapse data flows have a Flatten transformation to make it easy to unroll an array as part of your data transformation pipelines. We’ve updated the Flatten transformation to now allow for multiple arrays that can be unrolled in a single transformation step. This will make your ETL jobs much simpler with fewer transformation steps.

Click through for screenshots showing how to use this feature.
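
The Flatten transformation is configured in the data flow UI rather than in code, but for intuition, here is a rough T-SQL analogue (not ADF’s implementation) of unrolling two arrays from a single JSON document. The cross-product shape shown here is just one possible output, and the document and names are made up.

```sql
-- Not ADF's engine, just a T-SQL analogue of unrolling two arrays from one
-- document, to show the kind of output a multi-array flatten can produce.
DECLARE @doc nvarchar(max) = N'{
  "orderId": 1,
  "items": ["apple", "pear"],
  "tags":  ["rush", "gift"]
}';

SELECT JSON_VALUE(@doc, '$.orderId') AS OrderId,
       i.[value] AS Item,
       t.[value] AS Tag
FROM OPENJSON(@doc, '$.items') AS i
CROSS JOIN OPENJSON(@doc, '$.tags') AS t;
```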

Snowflake Data Governance

Enrique Lopez de Lara shares a few ways that Snowflake allows us to protect data in its system:

The role hierarchy in the previous section defines what can be done on different objects and by whom. However, it doesn’t restrict which records within a table a user can see or which values should be masked within a column. That’s where the data governance policies in this section come into play.

All data governance policies and tags are stored in the PROD_DB_GOV database under three schemas: MASKING, ROWACCESS, and TAGS. Putting all the policies and tags in a single database allows us to centralize them and better restrict access to them. Please note that only the GOV_ADMIN role has read/write permissions on it.

These are, for the most part, very similar to what we’re used to in relational databases: application and system roles, row-level security, and data classification.
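
To give a flavor of what these policies look like in Snowflake SQL, here is a minimal sketch of a masking policy stored in the centralized governance database the post describes; the policy, table, and column names are hypothetical examples of mine.

```sql
-- A minimal sketch following the post's PROD_DB_GOV.MASKING convention.
-- EMAIL_MASK, SALES_DB.PUBLIC.CUSTOMERS, and EMAIL are hypothetical names.
USE ROLE GOV_ADMIN;

CREATE MASKING POLICY PROD_DB_GOV.MASKING.EMAIL_MASK AS
  (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('GOV_ADMIN') THEN val
    ELSE '*** MASKED ***'
  END;

-- Attach the policy to a column.
ALTER TABLE SALES_DB.PUBLIC.CUSTOMERS
  MODIFY COLUMN EMAIL
  SET MASKING POLICY PROD_DB_GOV.MASKING.EMAIL_MASK;
```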

Retrieving Redshift Query History

Koen Verbeeck wants to see what you did last summer:

Because my Windows machine apparently decides to install updates overnight (and thus reboot my machine), it has happened that I’ve lost the query I was writing for Redshift in the tool DBeaver. When you work with SQL Server Management Studio (SSMS), you typically don’t have this issue, as a temporary copy is always saved. Close down SSMS, restart it, and the queries are still there.

Click through to see what you can do.
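
If you end up in the same spot, one starting point (my sketch, not necessarily what Koen does) is Redshift’s STL_QUERY system table, which keeps a few days of executed query text:

```sql
-- A minimal sketch: recent statements from the last two days. STL_QUERY
-- retains only a few days of history and truncates long query text, so
-- treat this as a partial recovery tool, not a full one.
SELECT q.userid,
       q.starttime,
       TRIM(q.querytxt) AS query_text
FROM stl_query AS q
WHERE q.starttime >= DATEADD(day, -2, GETDATE())
ORDER BY q.starttime DESC;
```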

Building an Azure Stream Analytics Query

Alex Lin takes us through the process:

As a developer, your journey with Azure Stream Analytics (ASA) can be divided into several stages, each with its own set of challenges and requirements. In this blog post, we’ll walk you through the typical developer journey in ASA, from the initial setup to production deployment. Along the way, we’ll explore the various development tools and best practices that will help you build a Stream Analytics job. 

Click through for the demonstration.
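
For a sense of what an ASA query looks like, here is a minimal sketch of a five-minute tumbling-window aggregation; the input and output aliases and the column names are hypothetical.

```sql
-- A minimal sketch: count readings per device over five-minute tumbling
-- windows. 'telemetry' and 'alerts' are hypothetical input/output aliases
-- defined on the Stream Analytics job.
SELECT
    deviceId,
    COUNT(*) AS readingCount,
    System.Timestamp() AS windowEnd
INTO alerts
FROM telemetry TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(minute, 5)
```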

Cost Optimization with Azure SQL Managed Instance

Niko Neugebauer makes a few recommendations:

Optimizing costs in SQL Managed Instance isn’t a one-size-fits-all situation—there are several cost-saving benefits and capabilities to take advantage of depending on your unique business needs. By taking advantage of these benefits, or better yet, using two or more simultaneously, you can achieve significant cost savings while optimizing your business operations.

Click through for the list. None of it is earth-shattering, but these are all things to keep in mind along the way.

Migration Tool for Cosmos DB

Hasan Savran reviews a tool:

Migrating databases to a new database server is a big job. You need to have the right tools to make this process easy for everyone. Cosmos DB had a Data Migration Tool to move data from a bunch of sources. It was a great, free tool with an easy user interface. You didn’t have to be a developer to use this tool. Here is a screenshot of what it used to look like.

Unfortunately, the tool got old and was not updated to keep up with SDK changes. It also worked only in Windows environments.

Click through for Hasan’s thoughts. I had a behind-the-scenes look at everything getting put together, in that I was supposed to help, got busy, and slinked away as really sharp people like Carey Payette and John Bowen drove the project to completion.

Checklist for a Snowflake Migration

Sandeep Arora has a checklist for us:

We have broken our Snowflake Migration Checklist into nine phases to help plan and execute an end-to-end migration of the existing traditional data platform to Snowflake. These phases will help align migration resources and efforts; however, this doesn’t necessarily mean that all steps should be executed sequentially. Some phases, like “Train Users,” can be executed in parallel with other phases.

At a high level, the process isn’t Snowflake-specific—really, 6 of the 9 phases are generic supporting steps that would apply to any major project. This makes the checklist a good starting point not only for a Snowflake migration but for any major migration project.
