Exposing Multiple Docker Ports

Steve Jones shows how to expose multiple ports when spinning up a container:

I was recently working with Jenkins in containers. I didn’t want the server process running on my machine all the time, but I did need to allow some communication. Jenkins uses port 8080 by default, but agents need another port.

I figured there was a way to do this, and I found it on Stack Overflow, which is the perfect forum for a question like this. The answer?

You’ll need to click through for the answer.
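For reference, here is a hedged sketch of the general pattern (not necessarily Steve’s exact answer): Docker lets you repeat the -p flag, and 50000 is the usual agent port baked into the official Jenkins image.

# Publish both the Jenkins web UI port and the agent port (50000 is the official image's default)
docker run -d --name jenkins \
  -p 8080:8080 \
  -p 50000:50000 \
  jenkins/jenkins:lts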

Kubernetes on Windows

Elton Stoneman helps us get started with Kubernetes on Windows boxes:

Now you can take older .NET Framework apps and run them in Kubernetes, which is going to help you move them to the cloud and modernize the architecture. You start by running your old monolithic app in a Windows container, then you gradually break features out and run them in .NET Core on Linux containers.

Organizations have been taking that approach with Docker Swarm for a few years now. I cover it in my book Docker on Windows and in my Docker Windows Workshop. It’s a very successful way to do migrations – breaking up monoliths to get the benefits of cloud-native architecture, without a full-on rewrite project.

Now you can do those migrations with Kubernetes. That opens up some interesting new patterns, and the option of running containerized Windows workloads in a managed Kubernetes service in the cloud.

Elton’s not kidding about this support being new. I’m not sure I’d trust it with my production work just yet, but I’m glad to see people working on the problem.
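For context on what running Windows workloads in Kubernetes needs on the cluster side, here’s a hedged sketch of adding a Windows node pool to an existing AKS cluster; the resource group, cluster, and pool names are placeholders, not values from Elton’s post.

# Add a Windows node pool to an existing AKS cluster (names are placeholders)
az aks nodepool add \
  --resource-group myResourceGroup \
  --cluster-name myAKSCluster \
  --name npwin \
  --os-type Windows \
  --node-count 1

Windows workloads then target those nodes with a node selector such as kubernetes.io/os: windows.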

Deploying a Big Data Cluster

Mohammad Darab takes us through the Big Data Cluster deployment process using Azure Data Studio:

I’ve been “playing around” with Big Data Clusters for some time now and CTP 3.2 is way ahead when it comes to streamlining the BDC deployment process. You can check out my 4-part series on deploying BDC on AKS to see how cumbersome the process used to be. New in CTP 3.2, you can deploy a BDC on AKS (an existing cluster OR a new cluster) using an Azure Data Studio notebook. Let’s see how.

Click through for instructions. It was rather smart of Microsoft to release the instructions as a notebook.
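If you want to point the notebook at an existing AKS cluster, a rough Azure CLI sketch for creating one looks like this; the resource names, region, and VM size are assumptions, not values from Mohammad’s post.

# Create a resource group and a small AKS cluster, then pull down its credentials
az group create --name bdc-rg --location eastus
az aks create \
  --resource-group bdc-rg \
  --name bdc-aks \
  --node-count 3 \
  --node-vm-size Standard_D8s_v3 \
  --generate-ssh-keys
az aks get-credentials --resource-group bdc-rg --name bdc-aks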

Kafka Docker on Kubernetes

Bill Ward gives us a step-by-step set of instructions for installing Kafka Docker on Kubernetes:

In this ultimate guide I will give you a simple step-by-step tutorial on installing Kafka Docker on Kubernetes. This post includes a complete video walk-through.

There has been a lot of interest lately in deploying Kafka to a Kubernetes cluster. If you want to take the deep dive yourself, then you’ve found the right article. Now that we have Kafka Docker, deploying a Kafka cluster to Kubernetes is a snap.

This makes it even easier to get started with Kafka in a development environment.
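If you just want a cluster to experiment against and don’t mind a different route from Bill’s walkthrough, a Helm chart is a common shortcut; the Bitnami chart below is my assumption, not what the article uses.

# Install Kafka from the Bitnami Helm chart (Helm 3 syntax)
helm repo add bitnami https://charts.bitnami.com/bitnami
helm repo update
helm install my-kafka bitnami/kafka
# Watch the broker pods come up
kubectl get pods -l app.kubernetes.io/instance=my-kafka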

Spark and dotnet in a Single Container

Ed Elliott shows how you can combine Spark and .NET Core in a single Docker container:

This is quite new syntax in Docker, and you need at least Docker 17.05 (client and daemon). After the image’s “FROM blah” you can specify a name, “core” in this case, and then later you can copy from the first image to the second using “--from=” on the “COPY” command.

In this dockerfile I have added Spark 2.4.3 and the default environment variables we need to get Spark running. If you grab this dockerfile and run “docker build -t dotnet-spark .” you should get an image you can then run, which includes the dependencies for dotnet as well as Spark.

Ed includes all of the scripts needed to test this out, too.
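To make the multi-stage pattern concrete, here’s a minimal sketch of the idea Ed describes; the image tags and paths are illustrative, not taken from his dockerfile.

# First stage, named "core": publish the dotnet app
FROM mcr.microsoft.com/dotnet/core/sdk:2.2 AS core
WORKDIR /src
COPY . .
RUN dotnet publish -c Release -o /app

# Second stage: a Java base for Spark; copy the published app out of the first stage
FROM openjdk:8-jre-slim
COPY --from=core /app /app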

Converting Docker Compose Files with Kompose

Andrew Pruski shows how you can convert a Docker compose file to something which Kubernetes can read using Kompose:

This will spin up one container running SQL Server 2019 CTP 3.1, accept the EULA, set the SA password, and set the default location for the database data/log/backup files using named volumes created on the fly.

Let’s convert this using Kompose and deploy to a Kubernetes cluster.

To get started with Kompose, first install it by following the instructions here. I installed it on my Windows 10 laptop, so I downloaded the binary and added it to my PATH environment variable.

It looks pretty straightforward to use; check it out.
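The conversion itself is short; assuming the compose file is docker-compose.yml in the current directory, something like:

# Convert the compose file to Kubernetes manifests, then deploy and check the result
kompose convert -f docker-compose.yml -o sqlserver-k8s.yaml
kubectl apply -f sqlserver-k8s.yaml
kubectl get pods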

Chaos Engineering and KubeInvaders

Andrew Pruski wants to play a game:

KubeInvaders allows you to play Space Invaders in order to kill pods in Kubernetes and watch new pods be created (this actually might be my favourite GitHub repo of all time).

I demo SQL Server running in Kubernetes a lot so really wanted to get this working in my Azure Kubernetes Service cluster. Here’s how you get this up and running.

I got to see Andrew show it off at SQL Saturday Cork and it was as fun as you’d expect.

Installing SQL Server on Windows Containers

Mohammad Darab shows how you can install SQL Server on a Windows Container:

Today, Microsoft released SQL Server in Windows Containers, currently in the Early Adoption Program here. You can use the same Docker Desktop for Windows, but now you can use Windows Containers to deploy SQL Server (as you will see below).

This is primarily for dev/test environments. This is in no way production ready. Think about a developer asking for a SQL Server instance to run their code against. You can now easily, and quickly, spin up a container running SQL Server. Once they are done, you can remove it or stop it.

The downside is that your Docker Desktop for Windows is now stuck running Windows containers…
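For a rough idea of what “quickly spin up a container” looks like once Docker Desktop is switched to Windows containers: the image name below is a placeholder for whatever the Early Adoption Program gives you, and the environment variable names follow the familiar Linux-image convention, so check the Windows image’s documentation.

# Run a disposable SQL Server container for a developer (<windows-sql-image> is a placeholder)
docker run -d --name sql-dev \
  -e "ACCEPT_EULA=Y" \
  -e "SA_PASSWORD=Str0ngP@ssw0rd!" \
  -p 1433:1433 \
  <windows-sql-image>

# Tear it down when they're finished
docker stop sql-dev
docker rm sql-dev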

Building a Big Data Cluster

Mohammad Darab continues a series on SQL Server Big Data Clusters in Azure Kubernetes Service:

To kick off the Big Data Cluster “Default configuration” creation, we will execute the following PowerShell command:

mssqlctl cluster create

That will first prompt us to accept the license terms. Type y and press Enter.

Mohammad takes us through the default installation, which requires only a few parameters before it can go on its merry way.
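While that runs, it’s handy to watch the pods come up in another window; the namespace typically matches the cluster name you gave mssqlctl (shown as a placeholder here).

# Watch the Big Data Cluster pods start up
kubectl get pods -n <cluster-name> --watch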

Feeding Kubernetes Log Data to Elasticsearch and Kibana

Aayushi Johari shows how you can stand up a Kubernetes cluster and review its log data using Fluentd, Elasticsearch, and Kibana:

In this article, you will learn how to publish Kubernetes cluster events data to Amazon Elasticsearch Service using the Fluentd logging agent. The data will then be viewed using Kibana, an open-source visualization tool for Elasticsearch. Amazon ES comes with integrated Kibana.

We will walk you through the following process:
Creating a Kubernetes cluster
Creating an Amazon ES cluster
Deploying the Fluentd logging agent on the Kubernetes cluster
Visualizing Kubernetes data in Kibana

Click through for the full article.
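For a sense of the Fluentd step, the agent usually runs as a DaemonSet so it lands on every node; here is a hedged sketch, assuming you have the DaemonSet definition saved locally as fluentd-daemonset.yaml.

# Deploy Fluentd as a DaemonSet, then confirm it is running on the nodes
kubectl apply -f fluentd-daemonset.yaml
kubectl get daemonset -n kube-system
# The label below is an assumption based on common Fluentd manifests
kubectl get pods -n kube-system -l k8s-app=fluentd-logging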
