
How to Install Apache Kafka (Single Node) on Ubuntu

Apache Kafka is a distributed streaming platform capable of handling trillions of events a day. Initially conceived as a messaging queue, Kafka is based on an abstraction of a distributed commit log. Since being created and open sourced by LinkedIn in 2011, Kafka has quickly evolved from a messaging queue into a full-fledged streaming platform.

What is Kafka?

Kafka is one of those systems that is very simple to describe at a high level, but has an incredible depth of technical detail when you dig deeper. The Kafka documentation does an excellent job of explaining the many design and implementation subtleties in the system, so we will not attempt to explain them all here. In summary, Kafka is a distributed publish-subscribe messaging system that is designed to be fast, scalable, and durable.

In this article I will describe how to install Apache Kafka on Ubuntu.

Follow the below steps to install Apache Kafka.

Step #1: Install Java

To run Apache Kafka you will need Java installed on your system.
Run the commands below to install the default OpenJDK package from Ubuntu's official repositories.

$ sudo apt update
$ sudo apt install default-jdk
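
Once the installation finishes, you can confirm Java is available (the exact version string depends on the OpenJDK release your Ubuntu version ships):

$ java -version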

Step #2 : Download Apache Kafka

Now download the Apache Kafka binaries from the official downloads page (https://kafka.apache.org/downloads).

$ sudo wget http://www-us.apache.org/dist/kafka/1.0.1/kafka_2.12-1.0.1.tgz

After the download completes, extract the archive and move it to /usr/local/kafka.

$ tar xzf kafka_2.12-1.0.1.tgz
$ sudo mv kafka_2.12-1.0.1 /usr/local/kafka
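
If you want to verify the extraction, /usr/local/kafka should contain, among other things, the bin directory with the Kafka scripts and the config directory with the default configuration files:

$ ls /usr/local/kafka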

Step #3: Start Kafka Server

Kafka uses ZooKeeper, so first start a ZooKeeper server on your system. You can use the script bundled with Kafka to start a single-node ZooKeeper instance.

$ cd /usr/local/kafka
$ bin/zookeeper-server-start.sh config/zookeeper.properties
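
This command keeps ZooKeeper running in the foreground and occupies the terminal. Both this script and kafka-server-start.sh also accept a -daemon flag if you prefer to run them in the background (optional, not required for this tutorial):

$ bin/zookeeper-server-start.sh -daemon config/zookeeper.properties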

Now open a new terminal (or run ZooKeeper in the background as shown above) and start the Kafka server.

$ bin/kafka-server-start.sh config/server.properties
...
[2018-02-13 10:47:45,989] INFO Kafka version : 1.0.1 (org.apache.kafka.common.utils.AppInfoParser)
[2018-02-13 10:47:45,995] INFO Kafka commitId : c0518aa65f25317e (org.apache.kafka.common.utils.AppInfoParser)
[2018-02-13 10:47:46,006] INFO [KafkaServer id=0] started (kafka.server.KafkaServer)
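
By default the broker listens on port 9092. If you want to confirm it is up, you can run a quick check from another terminal (ss is available on Ubuntu by default):

$ ss -ltn | grep 9092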

Step #4: Create a Topic in Kafka

Now create a topic called “myTopic” with a single partition and only one replica.

$ bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic myTopic

Created topic "myTopic".

The replication-factor option describes how many copies of the data will be created. As we are running a single instance, keep this value at 1.

Set the partitions option to the number of partitions you want the topic's data to be split into across your brokers. As we are running a single broker, keep this value at 1.
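
To see how the partition and its replica were assigned, you can describe the topic (the exact broker and leader IDs in the output depend on your setup):

$ bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic myTopic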

Run the command below to list the topics on the Kafka server.

$ bin/kafka-topics.sh --list --zookeeper localhost:2181

myTopic

Instead of creating topics manually, you can also configure your brokers to auto-create topics when a non-existent topic is published to, as sketched below.
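
Auto-creation is controlled by broker settings in config/server.properties. A minimal sketch: auto.create.topics.enable is already true by default in Kafka 1.0.1, and num.partitions and default.replication.factor set the defaults used for auto-created topics (the values below simply mirror our single-node setup). Edit the file, for example with nano, and restart the Kafka server afterwards for the change to take effect.

$ nano /usr/local/kafka/config/server.properties

# Create a topic automatically when a producer or consumer references one
# that does not exist yet.
auto.create.topics.enable=true
# Defaults applied to auto-created topics on this single-node setup.
num.partitions=1
default.replication.factor=1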

Step #5: Send Messages to Kafka

The “producer” is the process responsible for putting data into Kafka. Kafka comes with a command line client that takes input from a file or from standard input and sends it out as messages to the Kafka cluster. By default, each line is sent as a separate message.

Now run the producer and then type a few messages into the console to send to the server.

$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic myTopic

>Welcome to kafka
>This is my first topic
>

You can exit this command or keep the terminal running for further testing. Open a new terminal for the Kafka consumer process in the next step.
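
Since the console producer reads from standard input, you can also publish an existing file line by line by redirecting it (messages.txt here is just a hypothetical example file):

$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic myTopic < messages.txt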

Step #6: Using Kafka Consumer

You can use the Kafka command line consumer to read data from the Kafka cluster and display the messages on standard output.

$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic myTopic --from-beginning

Welcome to kafka
This is my first topic
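
The --from-beginning flag replays the topic from its first message. If you drop it, the consumer only shows messages produced after it starts, which is handy when testing the producer and consumer side by side:

$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic myTopic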

Now, if the Kafka producer from Step #5 is still running in another terminal, type some text into that producer terminal; it will immediately appear in the consumer terminal.

