Category: Misc Languages

Retrieving Spark Session Config Variables from Microsoft Fabric

Koen Verbeeck gets some settings:

I was trying some stuff out in a notebook on top of a Microsoft Fabric Lakehouse. I was wondering what the default values of some of the configuration variables are, and if there’s an easy way to retrieve them all. Luckily there is. In the code, I’m using Scala because it has a nice getAll() function.

Click through for an example of how to use this. And bonus points for using Scala instead of Python here.
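
If you want a rough .NET-flavored equivalent in the meantime, here is a hedged sketch against the Microsoft.Spark (.NET for Apache Spark) bindings rather than a Fabric notebook, so that context is an assumption on my part. Spark SQL's SET -v surfaces the same configuration information as a DataFrame:

```csharp
// A sketch only: SET -v lists every SQL configuration property with
// its current value and a short description.
using Microsoft.Spark.Sql;

class ShowSparkConfig
{
    static void Main()
    {
        SparkSession spark = SparkSession.Builder().GetOrCreate();

        DataFrame config = spark.Sql("SET -v");
        config.Show(numRows: 500, truncate: 100);
    }
}
```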

Building Functions with Spark Connect and .NET

Ed Elliott continues a series on Spark Connect:

I’m pretty much going to leave the code as-is from the previous post but will move things about a bit and add a SparkSession and a DataFrame class. Also, instead of passing the session ID and client around, I’m going to wrap them in the SparkSession so that we can just pass a single object, and also use it to construct the DataFrame so we don’t even have to worry about passing it around.

The first thing is to take all of that gRPC connection stuff and shove it into SparkSession so it is hidden from the callers:

Read on for the end state that Ed is headed toward and how to get closer to that state.
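
For a rough flavor of the shape Ed describes, here is a minimal sketch; the class and member names are mine rather than Ed's, and the proto types (Relation, SQL, SparkConnectService) are assumed to be generated from the Spark Connect .proto files:

```csharp
// A sketch only: SparkSession hides the gRPC plumbing and session id,
// and DataFrame carries the session plus the Relation it represents.
using System;
using Grpc.Net.Client;
using Spark.Connect; // assumed namespace for the generated proto types

public sealed class SparkSession
{
    internal string SessionId { get; } = Guid.NewGuid().ToString();
    internal SparkConnectService.SparkConnectServiceClient Client { get; }

    public SparkSession(string url) =>
        Client = new SparkConnectService.SparkConnectServiceClient(
            GrpcChannel.ForAddress(url));

    // Callers get a DataFrame back and never touch the client directly.
    public DataFrame Sql(string query) =>
        new(this, new Relation { Sql = new SQL { Query = query } });
}

public sealed class DataFrame
{
    private readonly SparkSession _session;
    private readonly Relation _relation;

    internal DataFrame(SparkSession session, Relation relation) =>
        (_session, _relation) = (session, relation);
}
```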

Exploring the gRPC API in Spark Connect with .NET

Ed Elliott continues a series on Spark Connect. First, Ed builds out something DataFrame API-ish:

So there are two goals of this post: the first is to take a look at Apache Arrow and how we can do things like show the output from DataFrame.Show; the second is to start to create objects that look more familiar to us, i.e., the DataFrame API.

If that’s not enough for you, Ed then shows how you can analyze a plan:

In this post we will continue looking at the gRPC API and the AnalyzePlan method, which takes a plan and analyzes it. To be honest, I expected this post to be longer, but I decided just to do the AnalyzePlan method.

This has been a really fun series so far from Ed, so check these out. The only downside is that the people demand more F#. And by “the people,” I mostly mean that I would love to see F# examples.
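
For a taste of the Arrow piece, here is a hedged sketch of decoding an Arrow payload with the Apache.Arrow NuGet package; the byte array is assumed to be the ArrowBatch data pulled off an ExecutePlanResponse (generated proto types not shown):

```csharp
// A sketch only: turn the Arrow IPC bytes from the server into rows we
// can print, which is essentially what a DataFrame.Show needs to do.
using System;
using System.IO;
using Apache.Arrow;
using Apache.Arrow.Ipc;

static class ArrowPrinter
{
    public static void Print(byte[] arrowBytes)
    {
        using var stream = new MemoryStream(arrowBytes);
        using var reader = new ArrowStreamReader(stream);

        RecordBatch batch;
        while ((batch = reader.ReadNextRecordBatch()) != null)
        {
            for (var row = 0; row < batch.Length; row++)
            {
                for (var col = 0; col < batch.ColumnCount; col++)
                {
                    // Arrow arrays are strongly typed; Int64Array covers
                    // the output of something like spark.range().
                    if (batch.Column(col) is Int64Array longs)
                        Console.Write($"{longs.GetValue(row)}\t");
                }
                Console.WriteLine();
            }
        }
    }
}
```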

Using the Spark Connect gRPC API

Ed Elliott digs into API details:

In the first two posts, we looked at how to run some Spark code, firstly against a local Spark Connect server and then against a Databricks cluster. In this post, we will look more at the actual gRPC API itself, namely ExecutePlan, Config, and AddArtifacts/ArtifactsStatus.

Click through to see how it all works, with plenty of C# code to guide you along the way.
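
As a taste of the Config call, here is a hedged sketch against client classes generated from the Spark Connect protos; exact member names can drift between Spark versions, so treat this as an assumption rather than gospel:

```csharp
// A sketch only: fetch every config value the server holds for a session.
using System;
using Grpc.Net.Client;
using Spark.Connect; // assumed namespace for the generated proto types

class ConfigSketch
{
    static void Main()
    {
        var channel = GrpcChannel.ForAddress("http://localhost:15002");
        var client = new SparkConnectService.SparkConnectServiceClient(channel);

        var request = new ConfigRequest
        {
            SessionId = Guid.NewGuid().ToString(),
            Operation = new ConfigRequest.Types.Operation
            {
                // Ask the server for everything it has, rather than one key.
                GetAll = new ConfigRequest.Types.GetAll()
            }
        };

        var response = client.Config(request);
        foreach (var pair in response.Pairs)
            Console.WriteLine($"{pair.Key} = {pair.Value}");
    }
}
```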

Running Spark Jobs on Databricks with Spark Connect and .NET

Ed Elliott runs a Databricks job:

This post aims to show how we can create a .NET application, deploy it to Databricks, and then run a Databricks job that calls our .NET code, which uses Spark Connect to run a Spark job on the Databricks job cluster to write some data out to Azure storage.

In the previous post, I showed how to use the Range command to create a Spark DataFrame and then save it locally as a parquet file. In this post, we will use the Sql command, which will return a DataFrame or, in our world, a Relation. We will then pass that relation to a WriteOperation command, which will write the results of the Sql out to Azure storage.

The code is available HERE

Read on for the description of how everything works.
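
For a rough idea of that Sql-into-WriteOperation flow, here is a hedged sketch, again assuming classes generated from the Spark Connect protos; the query, storage path, and session handling are placeholders rather than Ed's actual values:

```csharp
// A sketch only: run a Sql command to get a Relation, then feed that
// Relation into a WriteOperation that saves parquet to Azure storage.
using System;
using System.Threading;
using System.Threading.Tasks;
using Grpc.Net.Client;
using Spark.Connect; // assumed namespace for the generated proto types

class SqlWriteSketch
{
    static async Task Main()
    {
        var channel = GrpcChannel.ForAddress("http://localhost:15002");
        var client = new SparkConnectService.SparkConnectServiceClient(channel);

        // The Sql command gives back a Relation rather than rows.
        var sql = new Relation { Sql = new SQL { Query = "SELECT * FROM range(100)" } };

        var write = new WriteOperation
        {
            Input = sql,          // the Relation produced by the Sql command
            Source = "parquet",   // output format
            Path = "abfss://container@account.dfs.core.windows.net/demo", // placeholder
            Mode = WriteOperation.Types.SaveMode.Overwrite
        };

        var request = new ExecutePlanRequest
        {
            SessionId = Guid.NewGuid().ToString(),
            Plan = new Plan { Command = new Command { WriteOperation = write } }
        };

        // Drain the response stream; the write has finished when it ends.
        using var call = client.ExecutePlan(request);
        while (await call.ResponseStream.MoveNext(CancellationToken.None)) { }
    }
}
```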

Using Spark Connect from .NET

Ed Elliott keeps the hope alive:

Over the past couple of decades working in IT, I have found a particular interest in protocols. When I was learning how MSSQL worked, I spent a while figuring out how to read data from disk via backups rather than via the database server (MS Tape Format, if anyone cared). I spent more time than anyone should learning how to parse TDS (before the [MS-TDS] documentation was a thing); having my head buried in a set of network traces with a pencil and paper has given me more pleasure than I can tell you.

This intersection of protocols and Spark piqued my interest in using Spark Connect to connect to Spark and run jobs from .NET rather than Python or Scala.

There’s a whole lot more ceremony involved than in the Microsoft .NET for Apache Spark project, but read on to see how it all works. Also, I hereby officially chastise Ed for having examples in C# and VB.NET but not the greatest .NET language of them all: F#. Chastisement aside, I appreciate the work Ed put into this to bring Spark Connect to the .NET masses.
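
For the curious, the smallest possible “hello, Spark Connect” from C# looks something like this sketch, assuming client types generated from the Spark Connect protos with Grpc.Tools and a local Spark Connect server on its default port (15002):

```csharp
// A sketch only: the equivalent of spark.range(5) is a Plan whose root
// is a Range relation, sent via ExecutePlan.
using System;
using System.Threading;
using System.Threading.Tasks;
using Grpc.Net.Client;
using Spark.Connect; // assumed namespace for the generated proto types

class HelloSparkConnect
{
    static async Task Main()
    {
        var channel = GrpcChannel.ForAddress("http://localhost:15002");
        var client = new SparkConnectService.SparkConnectServiceClient(channel);

        var request = new ExecutePlanRequest
        {
            SessionId = Guid.NewGuid().ToString(),
            Plan = new Plan
            {
                Root = new Relation { Range = new Range { Start = 0, End = 5, Step = 1 } }
            }
        };

        // Responses stream back; dump the raw messages to see what arrives.
        using var call = client.ExecutePlan(request);
        while (await call.ResponseStream.MoveNext(CancellationToken.None))
            Console.WriteLine(call.ResponseStream.Current);
    }
}
```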

What’s New in .NET 8

Thao Nguyen keeps us up to date on .NET:

Microsoft recently announced .NET 8. The company emphasized the cloud, performance, full-stack Blazor, and .NET MAUI as major highlights of the latest edition of its free, cross-platform, open-source developer platform.

Of special importance to enterprises, .NET 8 is a long-term support (LTS) release, which means it will be supported and patched for three years, as opposed to 18 months for a standard-term support (STS) release.

Click through for the list, as it’s good to keep at least one eye on the developers lest they run around unopposed.

Exploratory Data Analysis with F# and Plotly

Matt Eland is speaking my language (F#):

One of the most common tasks in data roles is the need to perform exploratory data analysis (EDA).

With EDA, a data scientist, data analyst, or other data-oriented programmer can:

  • Understand the value distributions of their data
  • Identify outliers and data anomalies
  • Visualize correlations, trends, and relationships between multiple variables

Exploratory data analysis usually involves:

  1. Loading the data into a DataFrame
  2. Performing descriptive statistics to identify the raw shape of the data
  3. Visualizing variables of interest on their own or with other variables

In this article I’ll walk you through the process of loading data from a sample dataset into a Microsoft.Data.Analysis DataFrame (the kind featured in ML.NET). Next, we’ll look at the descriptive statistics the DataFrame class provides and then explore the process of creating some simple visualizations with Plotly.NET.

Read on for the scenario and analysis.
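
As a quick taste of that workflow in C# (the roundup’s usual .NET dialect; Matt’s article itself uses F#, and the file name below is invented for illustration):

```csharp
// A sketch of steps 1 and 2 with Microsoft.Data.Analysis: load a CSV
// into a DataFrame and pull descriptive statistics from it.
using System;
using Microsoft.Data.Analysis;

class EdaSketch
{
    static void Main()
    {
        // 1. Load the data into a DataFrame.
        DataFrame df = DataFrame.LoadCsv("sample.csv");

        // Peek at the raw shape: column types, counts, first few rows.
        Console.WriteLine(df.Info());
        Console.WriteLine(df.Head(5));

        // 2. Descriptive statistics (count, max, min, mean) per numeric column.
        Console.WriteLine(df.Description());
    }
}
```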

Load Balancing across Azure SQL DBs

Jose Manuel Jurado Diaz scales out:

In today’s data-driven landscape, we are presented with numerous alternatives like Elastic Queries, Data Sync, Geo-Replication, Read Scale-Out, etc., for distributing data across multiple databases. However, in this post, I’d like to explore a slightly different path: creating two separate databases containing data from the years 2021 and 2022, respectively, and querying them simultaneously to fetch results. This method introduces a unique perspective on data distribution: partitioning by database, which could potentially lead to more efficient resource utilization and enhanced performance for each database. While partitioning within a single database is a common practice, this idea ventures into partitioning across databases.

Click through to see what the code looks like for this.
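
The core idea is easy to sketch in C# with Microsoft.Data.SqlClient: run the same query against both databases in parallel and stitch the results together client-side. Connection strings, table, and column names below are placeholders, not Jose’s actual schema:

```csharp
// A sketch only: fan out one query to two year-partitioned databases
// and combine the rows after both complete.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

class ShardedQuery
{
    static async Task<List<(int Year, decimal Total)>> QueryShardAsync(string connString)
    {
        var rows = new List<(int, decimal)>();
        await using var conn = new SqlConnection(connString);
        await conn.OpenAsync();

        await using var cmd = new SqlCommand(
            "SELECT YEAR(OrderDate), SUM(Amount) FROM dbo.Orders GROUP BY YEAR(OrderDate)",
            conn);
        await using var reader = await cmd.ExecuteReaderAsync();
        while (await reader.ReadAsync())
            rows.Add((reader.GetInt32(0), reader.GetDecimal(1)));
        return rows;
    }

    static async Task Main()
    {
        // One database per year; both queries run simultaneously.
        var t2021 = QueryShardAsync("<connection string for the 2021 database>");
        var t2022 = QueryShardAsync("<connection string for the 2022 database>");
        await Task.WhenAll(t2021, t2022);

        var combined = new List<(int Year, decimal Total)>();
        combined.AddRange(await t2021);
        combined.AddRange(await t2022);
        combined.ForEach(r => Console.WriteLine($"{r.Year}: {r.Total}"));
    }
}
```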

Getting Started with Semantic Kernel in C#

Matt Eland tries out Semantic Kernel:

Generative AI systems use large language models (LLMs) like OpenAI’s GPT-3.5 Turbo (ChatGPT) or GPT-4 to respond to text prompts from the user. But these systems have serious limitations in that they only include information baked into the model at the time of training. Technologies like retrieval-augmented generation (RAG) help overcome this by pulling in additional information.

AI orchestration frameworks make this possible by tying together LLMs and additional sources of information via RAG. Additionally, AI orchestration systems can provide capabilities to generative AI systems, such as inserting records in a database, sending emails, or calling out to external systems.

In this article, we’ll look at the high-level capabilities of building AI orchestration systems in C# with Semantic Kernel, a rapidly maturing open-source AI orchestration framework.

Click through to see how things work.
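
To give a sense of how little ceremony is involved, here is a sketch of the minimal setup; it assumes the 1.x Microsoft.SemanticKernel package shape (which has evolved since Matt’s article), with a placeholder model id and an environment variable for the API key:

```csharp
// A sketch only: build a kernel backed by an OpenAI chat model and
// send it a single prompt.
using System;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

class SkSketch
{
    static async Task Main()
    {
        var builder = Kernel.CreateBuilder();
        builder.AddOpenAIChatCompletion(
            modelId: "gpt-4",                                              // placeholder
            apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY")!); // placeholder
        Kernel kernel = builder.Build();

        // The kernel routes the prompt to the configured chat model.
        var result = await kernel.InvokePromptAsync(
            "Summarize what an AI orchestration framework does in one sentence.");
        Console.WriteLine(result);
    }
}
```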
