Press "Enter" to skip to content


pl/dotnet Version 0.99

Brick Abode announces F# and C# support within Postgres:

pl/dotnet adds full support for C# and F# to PostgreSQL. 0.99 is our public beta release; we wish to share its amazingness with the world.

  • We support all PL operations: functions, procedures, DO, SPI, triggers, records, SRF, OUT/INOUT, table functions, etc
  • We natively support 40 out of 46 standard user types, the most of any external PL
  • Fully NPGSQL-compatible, and SPI is exposed through the NPGSQL API for maximum compatibility
  • In our benchmarks, C# and F# are the fastest procedural languages in PostgreSQL
  • All features are fully tested for both C# and F#, with 1013 unit tests
  • 100% free software under the PostgreSQL license

This is a beta release; we invite usage and welcome feedback.

Look at me, side-eyeing SQL Server and how SQLCLR still doesn’t have F# support. I still maintain that the single biggest mistake Microsoft made around SQLCLR was adopting the “safe” and “unsafe” mode terminology. C# developers understood that “unsafe” meant you could get access to pointers and other internals that .NET languages typically hide from us. But try explaining that to a DBA who doesn’t know the language or the concepts.

On the bright side, .NET languages are (at least by these benchmarks) the fastest procedural languages for Postgres, so that’s pretty neat. H/T Sergey Tihon.
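To get a feel for it, here is a minimal sketch of what a pl/dotnet function can look like. The language name (plcsharp) and the inline C# body convention are my assumptions based on the project's announcement, so double-check against the docs before copying:

```sql
-- A minimal sketch, assuming pl/dotnet registers a language named plcsharp;
-- the function body is plain C# and the return value maps to the SQL type.
CREATE FUNCTION add_ints(a integer, b integer)
RETURNS integer AS $$
    return a + b;
$$ LANGUAGE plcsharp;

SELECT add_ints(2, 3);  -- expected: 5
```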


Converting Cursors to PL/pgSQL

Deepak Mahto explains a difference in cursors from Oracle:

In this post, we will cover scenarios in which cursors behave differently from how Oracle handles them. During conversion, our initial approach is to keep the converted code as close to the original as possible. However, in some cases, although the code appears identical, the functionality might vary. Let’s explore one such case here.

Click through for the scenario.


Querying a Database from Rust

Ukeje Goodness writes a query:

You’ll need to download and install Rust and your preferred SQL database management system to interact with SQL databases through Rust with Diesel. In this tutorial, we will use SQLite as the DBMS, but you can use your preferred relational database.

After you’ve installed Rust and a preferred SQL DBMS that Diesel supports, you can proceed to create a new Rust project with Cargo’s init command:

Doing a separate search, it does look like you can execute stored procedures as well, using either the sql_query() function (when there is a result set you expect back) or execute() (when there isn’t).


Infrastructure as Code in GitHub

I have a new video:

In this video, we look at how to perform Infrastructure as Code in GitHub. We take a Bicep script and generate new Azure resources using it and GitHub Actions.

The video includes a very brief primer on Azure Resource Manager (ARM) and Bicep, and then gets into how you can use GitHub Actions to keep your Azure resources configured the way you expect.


Analyzing Aircraft Routes

Mark Litwintschik performs a deep dive into aircraft telemetry:

In November, I wrote a post on analysing aircraft position telemetry with adsb.lol. At the time, I didn’t have a clear way to turn a series of potentially thousands of position points for any one aircraft into a list of flight path trajectories and airport stop-offs.

Since then, I’ve been downloading adsb.lol’s daily feed and in this post, I’ll examine the flight routes taken by AirBaltic’s YL-AAX Airbus A220-300 aircraft throughout February. I flew on this aircraft on February 24th between Dubai and Riga. I used my memory and notes of the five-hour take-off delay to help validate the enriched dataset in this post.

Click through for Mark’s analysis using DuckDB and Python.


Retrieving Spark Session Config Variables from Microsoft Fabric

Koen Verbeeck gets some settings:

I was trying some stuff out in a notebook on top of a Microsoft Fabric Lakehouse. I was wondering what some of the default values are of the configuration variables, and if there’s an easy way to retrieve them all. Luckily there is. In the code, I’m using Scala because it has a nice GetAll() function.

Click through for an example of how to use this. And bonus points for using Scala instead of Python here.


Building Functions with Spark Connect and .NET

Ed Elliott continues a series on Spark Connect:

I’m pretty much going to leave the code as-is from the previous post but will move things about a bit and add a SparkSession and a DataFrame class. Also, instead of passing the session id and client around, I’m going to wrap them in the SparkSession so that we can just pass a single object, and also use it to construct the DataFrame so we don’t even have to worry about passing it around.

The first thing is to take all of that gRPC connection stuff and shove it into SparkSession so it is hidden from the callers:

Read on for the end state that Ed is headed toward and how to get closer to that state.
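If you want the gist without clicking through, here's a rough sketch of that shape. The SparkSession and DataFrame names come from Ed's post; the gRPC channel setup and the generated client type are my assumptions about a typical Spark Connect client, not Ed's actual code:

```csharp
using System;
using Grpc.Net.Client;

// Sketch: hide the channel, generated client, and session id inside one
// object so callers never touch the gRPC plumbing directly.
public sealed class SparkSession
{
    private readonly GrpcChannel _channel;

    public string SessionId { get; }

    // SparkConnectService.SparkConnectServiceClient is the client type that
    // Grpc.Tools generates from the Spark Connect .proto files.
    public SparkConnectService.SparkConnectServiceClient Client { get; }

    public SparkSession(string url)
    {
        _channel = GrpcChannel.ForAddress(url);
        Client = new SparkConnectService.SparkConnectServiceClient(_channel);
        SessionId = Guid.NewGuid().ToString();
    }

    // Constructing a DataFrame through the session means the DataFrame
    // always carries everything it needs to execute later.
    public DataFrame Sql(string query) => new DataFrame(this, query);
}

// Minimal placeholder: a DataFrame is just a session plus an unexecuted query.
public sealed class DataFrame
{
    private readonly SparkSession _session;
    private readonly string _query;

    public DataFrame(SparkSession session, string query) =>
        (_session, _query) = (session, query);
}
```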


Exploring the gRPC API in Spark Connect with .NET

Ed Elliott continues a series on Spark Connect. First, Ed builds out something DataFrame API-ish:

So there are two goals of this post, the first is to take a look at Apache Arrow and how we can do things like show the output from DataFrame.Show, the second is to start to create objects that look more familiar to us, i.e. the DataFrame API.

If that’s not enough for you, Ed then shows how you can analyze a plan:

In this post we will continue looking at the gRPC API and the AnalyzePlan method which takes a plan and analyzes it. To be honest I expected this to be longer but decided just to do the AnalyzePlan method.

This has been a really fun series so far from Ed, so check these out. The only downside is that the people demand more F#. And by “the people,” I mostly mean that I would love to see F# examples.
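For a flavor of the Arrow side of this, here is a hedged sketch of decoding a batch from an ExecutePlan response. The ArrowBatch and Data names are from my reading of the Spark Connect protos, and I'm assuming the Apache.Arrow NuGet package for the decoding; Ed's actual code may differ:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Apache.Arrow;
using Apache.Arrow.Ipc;

// Sketch: an ExecutePlanResponse in the result stream can carry an
// ArrowBatch whose Data field holds an Arrow IPC stream of record batches.
static async Task PrintBatchesAsync(ExecutePlanResponse response)
{
    // In the generated C#, ArrowBatch.Data is a protobuf ByteString.
    var bytes = response.ArrowBatch.Data.ToByteArray();
    using var reader = new ArrowStreamReader(new MemoryStream(bytes));

    RecordBatch batch;
    while ((batch = await reader.ReadNextRecordBatchAsync()) != null)
    {
        // Enough to prove we decoded something; a real Show would walk
        // the arrays column by column and format the values.
        Console.WriteLine($"rows: {batch.Length}, columns: {batch.ColumnCount}");
    }
}
```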


Using the Spark Connect GRPC API

Ed Elliott digs into API details:

In the first two posts, we looked at how to run some Spark code, firstly against a local Spark Connect server and then against a Databricks cluster. In this post, we will look more at the actual gRPC API itself, namely ExecutePlan, Config, and AddArtifacts/ArtifactsStatus.

Click through to see how it all works, with plenty of C# code to guide you along the way.
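As an illustration of the request shapes involved, a Config lookup looks roughly like this. This is a sketch from my reading of the Spark Connect protos, not Ed's code; it assumes a generated SparkConnectServiceClient named client and a session id string, and the generated C# names may differ slightly:

```csharp
// Sketch: ask the server for the value of a single config key.
var request = new ConfigRequest
{
    SessionId = sessionId,
    Operation = new ConfigRequest.Types.Operation
    {
        Get = new ConfigRequest.Types.Get
        {
            Keys = { "spark.sql.shuffle.partitions" }
        }
    }
};

// Config is a plain unary call, unlike the streaming ExecutePlan.
var response = await client.ConfigAsync(request);
foreach (var pair in response.Pairs)
    Console.WriteLine($"{pair.Key} = {pair.Value}");
```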


Running Spark Jobs on Databricks with Spark Connect and .NET

Ed Elliott runs a Databricks job:

This post aims to show how we can create a .NET application, deploy it to Databricks, and then run a Databricks job that calls our .NET code, which uses Spark Connect to run a Spark job on the Databricks job cluster to write some data out to Azure storage.

In the previous post, I showed how to use the Range command to create a Spark DataFrame and then save it locally as a parquet file. In this post, we will use the Sql command, which will return a DataFrame or, in our world, a Relation. We will then pass that relation to a WriteOperation command, which will write the results of the Sql out to Azure storage.

The code is available HERE

Read on for the description of how everything works.
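To make the Relation-into-WriteOperation flow concrete, here is a hedged sketch of the plan construction. Message and field names are from my reading of the Spark Connect protos (the generated C# may differ), it assumes a client and sessionId as in the earlier posts, and the output path is purely hypothetical:

```csharp
using System.Threading;

// Sketch: wrap a SQL query in a Relation, feed it to a WriteOperation,
// and execute the whole thing as a command plan.
var sqlRelation = new Relation
{
    Sql = new SQL { Query = "SELECT id, id * 2 AS doubled FROM range(10)" }
};

var write = new WriteOperation
{
    Input = sqlRelation,
    Source = "parquet",
    // Hypothetical destination; substitute your own storage path.
    Path = "abfss://container@account.dfs.core.windows.net/output",
    Mode = WriteOperation.Types.SaveMode.Overwrite
};

var plan = new Plan { Command = new Command { WriteOperation = write } };

// ExecutePlan returns a server stream; draining it runs the job to completion.
using var call = client.ExecutePlan(new ExecutePlanRequest
{
    SessionId = sessionId,
    Plan = plan
});
while (await call.ResponseStream.MoveNext(CancellationToken.None)) { }
```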
