
Category: Containers

Using containerd as a Kubernetes Container Runtime

Anthony Nocentino does a switcheroo:

In this post, I’m going to show you how to install containerd as the container runtime in a Kubernetes cluster. I will also cover setting the cgroup driver for containerd to systemd, which is the preferred cgroup driver for Kubernetes. In Kubernetes version 1.20, Docker was deprecated and will be removed after 1.22. containerd is a CRI-compatible container runtime and is one of the supported options you have as a container runtime in Kubernetes in this post-Docker Kubernetes world. I do want to call out that you can use containers created with Docker in containerd.

There are a few steps, but it’s not crazy by any stretch, and that’s part of why the post-Docker-in-Kubernetes world won’t be chaos.
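If you want a quick check that nodes really are on containerd after a change like this, kubectl get nodes -o wide reports each node’s runtime. The same check, sketched with the official Kubernetes Python client (assuming a working kubeconfig; this is illustrative, not from Anthony’s post):

# Sketch: report each node's container runtime via the Kubernetes API.
# Requires: pip install kubernetes, and a kubeconfig with cluster access.
from kubernetes import client, config

config.load_kube_config()  # reads ~/.kube/config by default
v1 = client.CoreV1Api()

for node in v1.list_node().items:
    # Expect something like "containerd://1.4.x" once the runtime has been switched.
    print(node.metadata.name, node.status.node_info.container_runtime_version)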


Running an mlflow Server on Azure

Paul Hernandez configures mlflow on Azure using platform-as-a-service offerings:

It is indisputably true that mlflow came to make life a lot easier, not only for data scientists but also for data engineers and architects, among others. There is a very helpful list of tutorials and examples in the official mlflow docs. You can just download it, open a console, and start using it locally on your computer. This is the fastest way to get started. However, as soon as you progress and introduce mlflow in your team, or you want to use it extensively for yourself, some components should be deployed outside your laptop.

To exercise a deployment setup, and since I have Azure experience, I decided to provision a couple of resources in the cloud to deploy the model registry and store the data produced by the tracking server.

I concur on the power of mlflow.
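As a taste of why a shared deployment is worth the effort: once a remote tracking server and registry exist, any client can log to them with a couple of lines. A minimal sketch (the tracking URI and experiment name are placeholders for whatever your Azure deployment exposes):

# Minimal mlflow client sketch against a remote tracking server.
# Assumes: pip install mlflow; the URI below is a placeholder, not a real endpoint.
import mlflow

mlflow.set_tracking_uri("https://my-mlflow-server.azurewebsites.net")
mlflow.set_experiment("demo-experiment")

with mlflow.start_run():
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("rmse", 0.42)

Runs, parameters, and metrics then land in the remote backend store rather than a local mlruns folder, which is the whole point of moving those components off your laptop.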


Running Spark on Azure Kubernetes Service

Tsuyoshi Matsuzaki walks us through running Apache Spark on Azure Kubernetes Service:

Apache Spark officially includes Kubernetes support, so you can run a Spark job on your own Kubernetes cluster. (See here for the official documentation. Note that the Kubernetes scheduler is currently experimental.)
Especially in Microsoft Azure, you can easily run Spark on cloud-managed Kubernetes, Azure Kubernetes Service (AKS).

In this post, I’ll show you a step-by-step tutorial for running Apache Spark on AKS. In this tutorial, artifacts such as source code, data, and container images are all protected by Azure credentials (keys).

Although managed services for Apache Spark, such as Azure Databricks, Azure Synapse Analytics, and Azure HDInsight, are the best places to run Spark workloads, you get a lot of flexibility by running workloads on managed Kubernetes (AKS), such as spot VM support, the ability to start and stop the cluster, confidential computing (Intel SGX) support, and so on.

Read on to see how. Of these options, though, I’d probably choose Azure Databricks or Azure Synapse Analytics well before the others.
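For a sense of what targeting AKS looks like from code, Spark’s Kubernetes mode points the master at the cluster’s API server and runs executors from a container image. A rough PySpark sketch, with placeholder cluster, image, and storage account values (illustrative only, not lifted from the post):

# Rough sketch: a Spark session aimed at an AKS cluster.
# The API server URL, container image, and storage account/key are all placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("k8s://https://my-aks-cluster.hcp.westeurope.azmk8s.io:443")
    .appName("spark-on-aks-demo")
    .config("spark.kubernetes.container.image", "myregistry.azurecr.io/spark-py:3.0.1")
    .config("spark.executor.instances", "3")
    # Hadoop setting so Spark can read from Azure Blob Storage (placeholder account and key).
    .config("spark.hadoop.fs.azure.account.key.mystorageacct.blob.core.windows.net", "<storage-key>")
    .getOrCreate()
)

df = spark.read.csv("wasbs://data@mystorageacct.blob.core.windows.net/input.csv", header=True)
print(df.count())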


Changing a Kubernetes Cluster to containerd

Andrew Pruski wants to get ahead of the game:

DISCLAIMER – You’d never do this for a production cluster. For those clusters, you’d simply get rid of the existing nodes and bring new ones in on a rolling basis. This blog is just me mucking about with my Raspberry Pi cluster to see if the update can be done in-place without having to rebuild the nodes (as I really didn’t want to have to do that).

Check it out. In addition to the Twitter thread Andrew mentions, the Kubernetes group has a full blog post with more details.


Finding the Physical Path of a SQL Server Backup on a Container

Jack Vamvas is looking for files in all the wrong places:

I’m migrating some SQL Server databases to OpenShift containers. The SQL Server is set up with a persistent disk, with a dedicated persistent disk partition for the SQL Server default backup directory. I don’t have direct access to the underlying files and can only use the command line. How can I get the physical disk device, which will then allow me to create a RESTORE DATABASE statement pointing to the device?

Read on for the answer, including a T-SQL script to find where these files live.
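If all you need is to see where earlier backups were written, one option (not necessarily the one in Jack’s post) is to read the physical device names out of msdb’s backup history. A quick pyodbc sketch, with placeholder connection details:

# Sketch: list physical backup file locations recorded in msdb backup history.
# Assumes pyodbc and ODBC Driver 17 for SQL Server; server name and credentials are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-sql-container,1433;"
    "DATABASE=msdb;UID=sa;PWD=<password>"
)

query = """
SELECT bs.database_name,
       bs.backup_finish_date,
       bmf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bs.media_set_id = bmf.media_set_id
ORDER BY bs.backup_finish_date DESC;
"""

for database_name, finish_date, device in conn.execute(query):
    print(database_name, finish_date, device)

The physical_device_name values give you paths you can plug into a RESTORE DATABASE ... FROM DISK statement.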


Considerations Before using SQL Server on Containers

Joy George Kunjikkur wants you to slow your roll a little:

It is easy to get started on development and simple testing using SQL containers, as discussed in the previous post. But before putting them into production and starting to develop real applications, we have to make sure of at least the things below.

Read on for those considerations. I think they are reasonable and generally agree with the bottom-line conclusion.


Migrating SQL Server Container Images to GitHub

Andrew Pruski has moved some images around:

A couple of months ago, Docker announced that they would be implementing a 6-month retention policy for unused images on Docker Hub.

This was due to kick in on the 1st of November but has now been pushed back until mid-2021.

I’ve had multiple Windows SQL Server container images up on the Docker Hub for years now. It’s been a great platform and I’m very thankful to them for hosting my images.

That being said, I want to make sure that the images I’ve built are always going to be available for the community, so I have pushed my SQL Server images to the GitHub Container Registry.

I guess I should do the same.
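For anyone planning the same move, it boils down to pulling the image, retagging it for ghcr.io, and pushing with a personal access token that has the write:packages scope. A sketch using the Docker SDK for Python, with placeholder image names and username:

# Sketch: retag a Docker Hub image and push it to GitHub Container Registry (ghcr.io).
# Assumes: pip install docker; the user, image names, and GHCR_TOKEN env var are placeholders.
import os
import docker

client = docker.from_env()

# Authenticate to ghcr.io with a GitHub personal access token (write:packages scope).
client.login(username="my-github-user",
             password=os.environ["GHCR_TOKEN"],
             registry="ghcr.io")

# Pull from Docker Hub, retag for ghcr.io, and push.
image = client.images.pull("myrepo/mssql-image", tag="2019")
image.tag("ghcr.io/my-github-user/mssql-image", tag="2019")

for line in client.images.push("ghcr.io/my-github-user/mssql-image",
                               tag="2019", stream=True, decode=True):
    print(line)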
