Exposing Kafka on two different networks

The setup described in this article was implemented on CDH 5.7.1 with Kafka 2.0.1.5 installed using parcels.

One of the clusters we are working on has the following network configuration:

  • A “data” network exposing our edge, Kafka and master nodes to the outside world
  • An “internal” network dedicated to the cluster for our worker nodes

We use Kafka for data ingestion and also to send processed data to another system that exposes UIs to the analysts, so we have:

  • A Spark Streaming job consuming Kafka topics from YARN (our “internal” network)
  • The other system’s app consuming Kafka topics from the outside (our “data” network)

Thus, Kafka must be available on two different networks. To do so, the following configuration must be applied to each Kafka broker in the kafka.properties safety valve input, and each Kafka node must be reachable under the same hostname on both networks:
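A minimal sketch of such a safety valve entry, assuming the default PLAINTEXT port 9092 and a placeholder broker hostname kafka01.cluster.local, would be:

  # Bind on every interface so both the “data” and “internal” networks can reach the broker
  listeners=PLAINTEXT://0.0.0.0:9092
  # Advertise the shared hostname, which resolves to the proper IP on each network
  advertised.listeners=PLAINTEXT://kafka01.cluster.local:9092

Because the advertised endpoint is a hostname rather than an IP, the Spark Streaming consumers on the “internal” network and the external application on the “data” network each resolve the broker to an address on their own network.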

Note that Kafka is then listening on every interface instead of just the ones you need. Supposedly, Kafka accepts the following configuration to set specific IP addresses:
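As an illustration, with hypothetical addresses for the “internal” and “data” interfaces, such an attempt would look like:

  # Hypothetical addresses, one listener per network interface
  listeners=PLAINTEXT://192.168.1.11:9092,PLAINTEXT://10.0.0.11:9092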

However, the broker throws an exception on startup, and a variation of “Each listener must have a different protocol” when changing the ports. In this Kafka release a listener is identified by its security protocol, so two PLAINTEXT listeners cannot coexist, whatever IP addresses or ports they use.

That’s it!

About the Author:

Big Data consultant @ Adaltas since 2015, Cesar enjoys discovering stuff and experimenting with new technologies in addition to his day-to-day work.
