Saturday, April 4, 2020

Setting Up Kafka using Docker Compose



Prerequisites:
1. Docker
2. Git

Note: This document is based on the material presented in https://docs.confluent.io/current/quickstart/ce-docker-quickstart.html. It includes only the steps required to get started with Kafka development.

Steps
1. Clone the confluentinc/examples GitHub repository and check out the 5.4.1-post branch (or whichever is the latest release branch).

git clone https://github.com/confluentinc/examples
cd examples
git checkout 5.4.1-post

2. Navigate to the cp-all-in-one directory inside the examples repository.

cd cp-all-in-one/

3. This directory contains the docker-compose.yml file. Take a backup of the file, then remove the sections and references related to connect, ksql-server, ksql-cli and ksql-datagen. The updated docker-compose.yml is sketched below.
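
Here is a minimal sketch of what the trimmed-down file can look like. It is based on the 5.4.1 cp-all-in-one compose file with the connect, ksql-server, ksql-cli and ksql-datagen services removed, and it leaves out some of the monitoring-related broker settings for brevity, so treat it as a starting point and compare it against the file in your own checkout rather than copying it blindly.

# Minimal sketch: ZooKeeper, Kafka broker, Schema Registry and Control Center only.
# Some metrics/monitoring settings from the original cp-all-in-one file are omitted for brevity.
version: '2'
services:

  zookeeper:
    image: confluentinc/cp-zookeeper:5.4.1
    hostname: zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  broker:
    image: confluentinc/cp-server:5.4.1
    hostname: broker
    container_name: broker
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      # broker:29092 is used by the other containers, localhost:9092 by clients on the host
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092
      # Single-broker setup, so all internal topics use replication factor 1
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1

  schema-registry:
    image: confluentinc/cp-schema-registry:5.4.1
    hostname: schema-registry
    container_name: schema-registry
    depends_on:
      - zookeeper
      - broker
    ports:
      - "8081:8081"
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_CONNECTION_URL: 'zookeeper:2181'

  control-center:
    image: confluentinc/cp-enterprise-control-center:5.4.1
    hostname: control-center
    container_name: control-center
    depends_on:
      - zookeeper
      - broker
      - schema-registry
    ports:
      - "9021:9021"
    environment:
      CONTROL_CENTER_BOOTSTRAP_SERVERS: 'broker:29092'
      CONTROL_CENTER_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      CONTROL_CENTER_SCHEMA_REGISTRY_URL: "http://schema-registry:8081"
      CONTROL_CENTER_REPLICATION_FACTOR: 1
      CONTROL_CENTER_INTERNAL_TOPICS_PARTITIONS: 1
      CONTROL_CENTER_MONITORING_INTERCEPTOR_TOPIC_PARTITIONS: 1
      CONFLUENT_METRICS_TOPIC_REPLICATION: 1
      PORT: 9021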


4. Navigate to the folder where you have the updated docker-compose.yml file.

docker-compose up  --> Downloads the images on the first run and starts ZooKeeper, the Kafka broker, Schema Registry and Control Center. Wait for all the components to start successfully.

If the containers have already been created by a previous run,
 docker-compose start  --> Starts the existing (stopped) containers
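
If you do not want the container logs to take over your terminal, the same services can also be started in detached mode and the logs followed separately. These are standard docker-compose options, shown here only as an optional convenience.

docker-compose up -d  --> Starts all the services in the background
docker-compose logs -f  --> Follows the log output of the running services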

5. Run the following command to verify that the services are up and running.
docker ps

The output should list the zookeeper, broker, schema-registry and control-center containers, each with a status of Up.
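
As an optional smoke test, you can create a topic and push a message through it from inside the broker container. The commands below are a sketch: they assume the broker service is named broker and listens on localhost:9092 inside the container (the defaults in the compose file above), and test-topic is just an example name. Run them from the folder containing docker-compose.yml.

docker-compose exec broker kafka-topics --create --topic test-topic --partitions 1 --replication-factor 1 --bootstrap-server localhost:9092

docker-compose exec broker bash -c "echo 'hello kafka' | kafka-console-producer --broker-list localhost:9092 --topic test-topic"

docker-compose exec broker kafka-console-consumer --bootstrap-server localhost:9092 --topic test-topic --from-beginning --max-messages 1

The consumer should print the message and exit after reading it.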


The following URLs can be used to access Schema Registry and Control Center:
Schema Registry : http://localhost:8081
Control Center : http://localhost:9021
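
Control Center is a browser UI, so it can simply be opened at the URL above. Schema Registry exposes a REST API; a quick way to confirm it is reachable is its subjects endpoint, which returns an empty list ([]) on a fresh cluster because no schemas have been registered yet.

curl http://localhost:8081/subjects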

6. To stop the services  --> docker-compose stop

To stop and remove the containers  --> docker-compose down


