
Category: Cloud

Always Encrypted and Key Vault

Rod Edwards starts a series on Always Encrypted:

For the purposes of this post, I'm going to show how to set up Always Encrypted for column data held in an on-prem SQL Server (which could equally be MI or Azure SQL DB) using keys stored in Azure Key Vault, and I'll show you some of the many potential issues that you need to be aware of along the way.

I’m not going to talk about the Secure Enclaves flavour here, just the ‘common or garden’ Always Encrypted.

Read on to learn more.
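To give a feel for what the finished database-side setup looks like, here is a minimal T-SQL sketch. The vault URL, key names, and encrypted key value are placeholders of my own, not Rod's: in practice, SSMS or the SqlServer PowerShell module provisions the keys and generates the real ENCRYPTED_VALUE for you.

    -- Column master key metadata pointing at a key in Azure Key Vault.
    -- The KEY_PATH below is a placeholder vault URL.
    CREATE COLUMN MASTER KEY [CMK_AKV]
    WITH (
        KEY_STORE_PROVIDER_NAME = N'AZURE_KEY_VAULT',
        KEY_PATH = N'https://myvault.vault.azure.net/keys/AE-CMK/0123456789abcdef'
    );

    -- Column encryption key, wrapped (encrypted) by the master key above.
    CREATE COLUMN ENCRYPTION KEY [CEK_1]
    WITH VALUES (
        COLUMN_MASTER_KEY = [CMK_AKV],
        ALGORITHM = 'RSA_OAEP',
        ENCRYPTED_VALUE = 0x016E000001  -- placeholder; tooling generates the real value
    );

    -- Deterministic encryption on character columns requires a BIN2 collation.
    CREATE TABLE dbo.Customer
    (
        CustomerID int IDENTITY(1,1) PRIMARY KEY,
        NationalID char(11) COLLATE Latin1_General_BIN2
            ENCRYPTED WITH (
                COLUMN_ENCRYPTION_KEY = [CEK_1],
                ENCRYPTION_TYPE = DETERMINISTIC,
                ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
            ) NOT NULL
    );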


Speech to Text with Streamlit and Azure AI

I have a new video:

In this video, I show how we can integrate with the Azure AI Services Speech service, using two different methods to capture speech from the microphone via our Streamlit application and submit that to Azure OpenAI.

Check out the video and final set of code. There’s an intermediate set of code for detecting a single utterance. But I think the final product works out pretty well.


Performance Tuning via Query History in Snowflake

Kevin Wilkie gets down to tuning:

In our last post, we talked about some of my favorite queries I use in Snowflake to see various items of interest – such as finding the worst-performing queries. For today’s post, though, I want to talk about performance tuning.

Yes, you read that right. We’re going to use query history to do some fun performance tuning.

Click through for two queries that can help you find what you may need to tune.
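As a starting point before you click through, the SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view is the usual raw material for this kind of work. Here is a generic sketch (not necessarily either of Kevin's queries) that surfaces long-running queries which scan most of their micro-partitions or spill to storage; the seven-day window and 80% threshold are arbitrary choices:

    SELECT query_id,
           query_text,
           warehouse_name,
           total_elapsed_time / 1000 AS elapsed_seconds,
           partitions_scanned,
           partitions_total,
           bytes_spilled_to_local_storage,
           bytes_spilled_to_remote_storage
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
      AND partitions_total > 0
      AND partitions_scanned / partitions_total > 0.8  -- poor pruning
    ORDER BY total_elapsed_time DESC
    LIMIT 20;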


PowerShell Script to Move Azure SQL DB from General Purpose to Business Critical

Sakshi Gupta shares a script:

Recently, we faced a requirement to upgrade a large number of Azure SQL databases from General Purpose to Business Critical.

As you're aware, this scaling-up operation can be executed via PowerShell, the CLI, or the Azure portal, following the guidance mentioned here: Failover groups overview & best practices – Azure SQL Managed Instance | Microsoft Learn

Given the need to perform this task across a large number of databases, individually running commands for each server is not practical. Hence, I have created a PowerShell script to facilitate such extensive migrations.

Click through for the scenarios Sakshi tested, some important considerations, and the script itself.
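For context, the script orchestrates what is, per database, a single operation. A rough T-SQL equivalent (run against the master database; the service objective name is just an example) looks like the sketch below, which also shows why a loop-and-poll script is handy: the change runs asynchronously.

    -- Example only: pick the Business Critical SKU matching your vCore count.
    ALTER DATABASE [MyAppDb]
    MODIFY (EDITION = 'BusinessCritical', SERVICE_OBJECTIVE = 'BC_Gen5_8');

    -- The operation is asynchronous; poll its progress from master.
    SELECT major_resource_id, operation, state_desc, percent_complete, start_time
    FROM sys.dm_operation_status
    ORDER BY start_time DESC;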


Building a SQL Server Failover Cluster in Google Cloud Platform

Dave Bermingham talks about availability:

Building a SQL Server Failover Cluster in Google Cloud Platform (GCP) is a powerful way to ensure your databases remain highly available, even in the face of unexpected failures. High Availability (HA) is crucial for any business-critical application. Downtime can mean lost revenue, decreased productivity, and even damage to your company’s reputation. However, creating HA clusters in the cloud, especially in GCP, presents unique challenges—most notably, the lack of shared storage, which has traditionally been a key component of SQL Server Failover Clustering.

Click through for an overview and a whitepaper on the topic.


Choosing Compute Options for Databricks

Matt Collins shares some recommendations:

You can use Databricks for a vast range of applications these days. From handling streaming datasets and running deep learning models to populating data model fact tables with complex transformations, choosing the correct compute option can seem a lot like a stab in the dark followed by (potentially expensive) trial and error.

You can choose from an incredible range of configurations in the Databricks Compute User Interface.

This variety comprises the Virtual Machine (VM) category, size, availability, and access mode (also referred to as family).

Determining the right compute choice could be met with the classic answer of "it depends", but this guide aims to inform the decision-making process, both in terms of cost and performance.

Read on for more information.


Azure Data Explorer Web UI Updates for July 2024

Michal Bar has an update for us:

Welcome to the July 2024 update. We are excited to announce new features and improvements in ADX web UI.

Continue reading to learn more about:

  1. Copy query with syntax coloring and KQL IntelliSense improvements
  2. Ad-hoc visual data exploration
  3. Dashboards real-time refresh rate
  4. Enhanced data interaction for dashboard tiles
  5. Resize and move dashboard tiles using keyboard only

Click through to continue reading.


Transferring CLR Binaries from SQL Server to Azure SQL Managed Instance

Mihailo Joksimovic shows a use case for Azure SQL Managed Instance link:

Azure SQL Managed Instance link is a new feature enabling you to create a distributed availability group between your SQL Server and Azure SQL Managed Instance. It makes it super simple to connect on-prem and cloud, providing near real-time replication speeds. The benefits are many, and if you aren't familiar, I'd strongly suggest you read our official blog post about it.

One great benefit for our use case is that MI link takes care of transferring the CLR assemblies for you! You can import your assemblies on your SQL Server instance using the all-too-familiar syntax (i.e., CREATE ASSEMBLY FROM 'C:\path\to\assembly.dll'), and MI link will ensure that those same assemblies get transferred to the cloud. Easy peasy, and no need to deal with hex literals anymore.

Read on to learn more about this capability.
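For a before-and-after feel, here is a short sketch (the assembly name and path are hypothetical). With MI link, the first form on the SQL Server side is all you need; without it, you would have had to embed the whole binary as a hex literal to get it into the managed instance:

    -- On the on-premises SQL Server: load the assembly from disk.
    -- MI link replicates it to the managed instance automatically.
    CREATE ASSEMBLY [UtilityFunctions]
    FROM 'C:\path\to\assembly.dll'
    WITH PERMISSION_SET = SAFE;

    -- The old workaround: no file system access in the cloud, so the
    -- entire binary goes in as a hex literal (truncated here).
    -- CREATE ASSEMBLY [UtilityFunctions] FROM 0x4D5A90000300...;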


A Reference Architecture for Microsoft Fabric

James Serra draws boxes:

Microsoft Fabric uses a data lakehouse architecture, which means it does not use a relational data warehouse (with its relational engine and relational storage) and instead uses only a data lake to store data. Data is stored in Delta Lake format so that the data lake acquires relational data warehouse-like features (check out my book that goes into much detail on this, or my video). Here is what a typical architecture looks like when using Fabric (click here for the .vsd):

Click through for the image as well as James’s explanation of the components.


Backing SQL Server up to S3 Buckets

David Fowler backs up a database:

Way back in the mists of time, I wrote a post on how to back up SQL Server to an S3 bucket using TNTDrive, https://sqlundercover.com/2018/06/18/backup-your-on-premise-sql-server-directly-to-an-aws-s3-bucket/.

Back then, if we wanted to back up SQL Server to S3, we needed to use a third-party tool. Since SQL Server 2022, things have changed and we've now got the option to back up directly to S3, in a similar way to how we can back up to Azure Blob Storage.

And, going one step further, you can also use PolyBase to read data from S3 buckets in SQL Server 2022.
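For a taste of the SQL Server 2022 syntax, the pattern looks roughly like this. The bucket, region, and access keys are placeholders, and note that the credential name must match the S3 URL you back up to:

    -- Placeholder bucket, region, and keys; the credential name
    -- must match the S3 URL used in the BACKUP statement.
    CREATE CREDENTIAL [s3://mybucket.s3.us-east-1.amazonaws.com/backups]
    WITH IDENTITY = 'S3 Access Key',
         SECRET = '<AccessKeyID>:<SecretAccessKey>';

    BACKUP DATABASE [MyDatabase]
    TO URL = 's3://mybucket.s3.us-east-1.amazonaws.com/backups/MyDatabase.bak'
    WITH COMPRESSION, CHECKSUM, STATS = 10;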
