Press "Enter" to skip to content

Author: Kevin Feasel

Querying Private Blob Storage Containers with Azure Synapse Analytics

Dennes Torres looks at some private information:

The queries from the previous article were made against the public container in the blob storage. However, if the container is private, you will need to authenticate with the container. In this article, you’ll learn how to query private blob storage with SQL.

NOTE: Be sure that the Azure Synapse Workspace and the storage account with the sample files are set up before following along with this article. You will also need to substitute your own storage account URL each time one appears in the article.

There are three possible authentication methods, and these methods may have some variation according to the type of storage account and the access configuration. I will not dig into details about storage here; that is a subject for a future article.

Read on for the three authentication methods and a lot of detail on using SAS tokens (the preferred method) to access this data.
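
A SAS token works as a credential in any of the storage SDKs, not just in Synapse. As a rough companion to the article, here is a minimal Python sketch using the azure-storage-blob package; the account URL, container name, blob path, and token are all placeholders you would swap for your own:

```python
# A minimal sketch of SAS-based access to a private container, using the
# azure-storage-blob package (pip install azure-storage-blob). All names
# and the token below are placeholders.
from azure.storage.blob import BlobServiceClient

account_url = "https://yourstorageaccount.blob.core.windows.net"
sas_token = "sv=...&ss=b&srt=co&sp=rl&sig=..."  # generated in the Azure portal

# The SAS token acts as the credential; no account key is needed.
service = BlobServiceClient(account_url=account_url, credential=sas_token)
container = service.get_container_client("private-container")

# Pull down a sample file from the private container.
data = container.download_blob("sample/data.csv").readall()
print(data[:200])
```

On the Synapse side, the article covers the SQL equivalent: storing the same kind of token in a credential so that the serverless pool can authenticate.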


Combining Change Data Capture with Azure Data Factory

Reitse Eskens continues a series on learning Azure Data Factory:

In my last blog, I pulled all the data from my table to my datalake storage. But when data changes, I don’t want to perform a full load every time: it’s a lot of data, it takes time, and somewhere down the line I’ll have to separate the changed rows from the identical ones. Instead of doing full loads every night or day or hour, I want to use a delta load. My pipeline should transfer only the new and changed rows. Very recently, Azure SQL DB finally added the option to enable Change Data Capture. This means that after a full load, I can get the changed records only. And by changed records, I mean the new ones, the updated ones, and the deleted ones.

Let’s find out how that works.

Read on for the article and demonstration.
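
For background on what gets read once CDC is enabled: SQL Server exposes changes through generated table-valued functions over a log sequence number (LSN) range. Here is a hedged Python sketch via pyodbc, assuming CDC is already enabled and the capture instance is named dbo_SalesTable; the server, database, and credential values are placeholders:

```python
# A sketch of reading CDC changes directly, assuming CDC is already enabled
# (EXEC sys.sp_cdc_enable_db; EXEC sys.sp_cdc_enable_table ...). The server,
# database, credentials, and capture instance name are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=yourserver.database.windows.net;DATABASE=yourdb;"
    "UID=youruser;PWD=yourpassword"
)
cursor = conn.cursor()

# New, updated, and deleted rows since the start of the capture window,
# via the function CDC generates for the capture instance.
cursor.execute("""
    DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_SalesTable');
    DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
    SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_SalesTable(@from_lsn, @to_lsn, 'all');
""")
for row in cursor.fetchall():
    print(row)
```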


Content Sharing with Power BI

Marc Lelijveld continues a series on going from small-scale to enterprise with Power BI:

Let’s start with the most important feature of the Power BI Service: sharing content! At the same time, this can be one of the most challenging ones, especially since there are many ways to share content in Power BI. In my experience in enterprise organizations, I have seen various ways of sharing content. Below, I explain the different options there are, leading to a conclusion of my personal best practice.

But why is the way we share content so important in relation to large enterprise solutions? Well, I believe that all centrally managed solutions should match (organizational) best practices. The way the content is made available to the users is one of these best practices. It will help end users find the content they are looking for, always in the same consistent location.

Read on for several techniques.


Data Platform Deployments via Azure Test Plan

Kevin Chant shows off the power of Azure Test Plans:

In this post I want to cover using Azure Test Plans for Data Platform deployments, because using it to manage test plans can be very useful.

By the end of this post, you will know what Azure Test Plans are and how they can be useful for Data Platform deployments.

Click through to see how this feature in Azure DevOps works and how you can use it to test your deployments.


Differences in Logging between Azure Analysis Services and Power BI PPU

Gilbert Quevauvilliers continues a series on migrating from Azure Analysis Services to Power BI Premium Per User:

Another important aspect of working with datasets is being able to log and monitor performance. In this blog post I am going to compare the logging between Azure Analysis Services (AAS) and Power BI Premium Per User (PPU).

With the recent release of Log Analytics integration for PPU, it is a lot easier to compare the logging options between AAS and PPU.

This is an area where there’s still a bit of a gap. Click through to see what the differences look like today.
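
If you want to experiment with the PPU side yourself once the Log Analytics integration is configured, dataset events land in the PowerBIDatasetsWorkspace table. Below is a rough Python sketch using the azure-monitor-query and azure-identity packages; the workspace ID is a placeholder, and the column names in the KQL are assumptions to verify against your own workspace:

```python
# A sketch of querying PPU dataset logs from Log Analytics, using
# azure-monitor-query and azure-identity. The workspace ID is a placeholder;
# verify the table's column names against your own workspace.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

kql = """
PowerBIDatasetsWorkspace
| where OperationName == 'QueryEnd'
| summarize avg(DurationMs) by bin(TimeGenerated, 1h)
"""

response = client.query_workspace(
    workspace_id="00000000-0000-0000-0000-000000000000",  # placeholder
    query=kql,
    timespan=timedelta(days=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```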


Two Ways to Access Kafka Topics from R

Patrick Neff shows us a couple of ways to build a Kafka-to-R pipeline:

In Data Science projects, we distinguish between descriptive analytics and statistical models running in production. Overall, these can be seen as one process. You start with analyzing historical data to gain insights, find correlations, and finally develop and optimize your model. Then you transfer it and use it in your running system. A key point for every data scientist is not just the mathematical skills themselves, but also knowing how to get the data into your analytics program.

In this blog post, we focus exactly on this crucial step: retrieving the data. In a second article, we’ll talk about running your model on real-time data.

Click through for the techniques.
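
Patrick’s examples are in R, but the consumer loop at the heart of “retrieving the data” looks much the same in any language. As a language-neutral illustration (not the article’s R code), here is a minimal sketch using Python’s confluent-kafka client; the broker address, group id, and topic name are placeholders:

```python
# A minimal consumer-loop sketch using confluent-kafka
# (pip install confluent-kafka). Broker, group, and topic are placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker
    "group.id": "analytics-consumer",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["sensor-readings"])  # placeholder topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # Message values arrive as bytes; decode before handing to analysis code.
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()
```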


Reinforcement Learning and Python 3

I have a new post up:

I finally got around to trying out a reinforcement learning exercise this weekend in an attempt to learn about the technique. One of the most interesting blog posts I’ve read is Andrej Karpathy’s post on using reinforcement learning to play Pong on the Atari 2600. In it, Andrej uses the Gym package in Python to play the game.

This won’t be a post diving into the details of how reinforcement learning works; Andrej does that far better than I possibly could, so read his post. Instead, the purpose of this post is to provide a minor update to Andrej’s code to switch it from Python 2 to Python 3. In doing this, I went with the most convenient answer over a potentially better solution (e.g., switching xrange() to range() rather than re-working the code), but it does work. I also bumped up the learning rate a little to pick up the pace.

Click through for the (slightly) updated code.
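
For a sense of what the port involves, here is an illustrative sketch of the typical Python 2-to-3 substitutions in code like Andrej’s; the variables are stand-ins, not his actual diff:

```python
# Representative Python 2 -> 3 substitutions (illustrative stand-ins only).
import pickle  # Python 2 used "import cPickle as pickle"

batch_size = 10
model = {"W1": [0.1, 0.2], "W2": [0.3]}  # stand-in for the network weights
reward_sum = -21.0

# xrange() is gone in Python 3; range() is lazy there, so it is the drop-in fix.
for i in range(batch_size):
    pass

# dict.iteritems() became dict.items().
for k, v in model.items():
    print(k, len(v))

# print is a function in Python 3, not a statement.
print('resetting env. episode reward total was %f' % reward_sum)
```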


String Formatting in Powershell

Robert Cain continues a series on fun with Powershell and strings:

Specifically, this can control the output when we embed a numeric value inside a string. Passing in special formatting instructions will make it easy to display values with commas, as currency, or even as hexadecimal.

For all of the examples, we’ll display the code, then under it the result of our code. In this article I’ll be using PowerShell Core 7.1.3 and VSCode. The examples should work in PowerShell 5.1 in the PowerShell ISE, although they’ve not been tested there.

Robert has quite a few examples, so check them out.
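
Robert’s operator of choice is PowerShell’s -f, which draws on .NET’s composite formatting. The concepts carry over to other languages; as a rough analogue (not the article’s PowerShell syntax), here is the same comma/currency/hexadecimal idea in Python’s format-spec mini-language:

```python
# The comma, currency, and hex ideas from the article, expressed in Python's
# format-spec mini-language rather than PowerShell's -f operator.
value = 1234567.891

print(f"{value:,.2f}")   # comma grouping, two decimals: 1,234,567.89
print(f"${value:,.2f}")  # a simple currency-style rendering: $1,234,567.89
print(f"{255:x}")        # lowercase hexadecimal: ff
print(f"{255:#06X}")     # uppercase hex with 0X prefix, zero-padded: 0X00FF
```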


The Benefits of Kubernetes for App Hosting

Joy George Kunjikkur enumerates reasons why you might want to use Kubernetes to host applications:

I started writing this post 2-3 years back, mainly when Apache Spark 2.3 started supporting Kubernetes (K8s) in 2018. It was obvious that Kubernetes was taking over the app hosting space the same way virtual machines took over physical machines. Everyone is expected to understand where the industry is moving and adopt accordingly. Hence I paused this post, as there was nothing I needed to endorse. But it’s time to resume this post and publish it.

Click through for a slew of thoughts on the topic.


A Primer on Azure Kubernetes Service

Arun Sirpal gives us a brief introduction of Azure Kubernetes Service:

You have the ability to run Kubernetes on-premises (complex) or in a cloud service, like AWS or Azure. Hence AKS – Azure Kubernetes Service – which helps reduce the complexity and operational overhead of managing Kubernetes by offloading much of that responsibility to Microsoft. You may be wondering how containers relate to this; it was something on my mind when I first entered into this technology. Remember that containers are the next step beyond traditional virtualisation; you can run SQL Server on Linux in containers, as an example. I then look at AKS as the “management” layer of the container solution, carrying out tasks such as scheduling, scaling, health monitoring, load balancing, and host management.

Click through for more information.
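
To make the “management layer” idea concrete: the scheduling and scaling tasks Arun mentions are all driven through the Kubernetes API, whether via kubectl or programmatically. Here is a hedged Python sketch with the official kubernetes client, assuming your kubeconfig already points at an AKS cluster; the deployment name and namespace are placeholders:

```python
# A sketch of driving the Kubernetes management layer programmatically, using
# the official kubernetes client (pip install kubernetes). Assumes kubeconfig
# already points at a cluster (e.g. after "az aks get-credentials ...").
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

# List deployments, then scale one out -- the kind of task AKS manages for you.
for dep in apps.list_namespaced_deployment(namespace="default").items:
    print(dep.metadata.name, dep.spec.replicas)

apps.patch_namespaced_deployment_scale(
    name="sql-server-linux",           # placeholder deployment
    namespace="default",
    body={"spec": {"replicas": 3}},    # request three replicas
)
```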
