Connecting To Kafka Via SSL

Harikiran Nayak shows how to work with secure Kafka connections:

First, the Kafka broker must be configured to accept client connections over SSL. Please refer to the Apache Kafka documentation to configure your broker. If your Kafka cluster is already SSL-enabled, you can look up the port number in your Kafka broker configuration file (or in the broker logs); look for the listeners=SSL://host.name:port configuration option. To ensure that the Kafka broker is correctly configured to accept SSL connections, run the following command from the same host that SDC is running on. If SDC is running inside a Docker container, log in to that container and run the command there.
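The command itself is in the linked article; as a rough sketch only (hostnames, ports, paths, and passwords below are placeholders, not values from the article), the broker-side SSL listener in server.properties and a quick connectivity check with openssl s_client might look like this:

    # server.properties -- broker-side SSL listener (all paths and passwords are placeholders)
    listeners=SSL://host.name:9093
    ssl.keystore.location=/path/to/kafka.server.keystore.jks
    ssl.keystore.password=changeit
    ssl.key.password=changeit
    ssl.truststore.location=/path/to/kafka.server.truststore.jks
    ssl.truststore.password=changeit

    # Run from the host where SDC runs: confirm the broker answers an SSL handshake on that port
    openssl s_client -connect host.name:9093

If the handshake succeeds, openssl prints the broker's certificate chain; a refused connection or an empty handshake suggests the listener on that port isn't actually serving SSL.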

Read on for more.
