Kafka 2.7 in Docker and Spring Boot | Let’s develop a pub-sub application

Utkarsh Sharma
4 min read · Nov 11, 2021

In this article, I will show you how to create a very basic pub-sub application with Kafka and Spring Boot. You will not have to install Kafka, as we are going to use Docker images to run our Kafka and ZooKeeper instances.

Download the code from https://github.com/codeWithUtkarsh/kafka-sample-with-docker for reference.

Pre-requisite for this exercise

Docker installed on your local machine
Basic understanding of Docker commands
Experience creating applications with Java and Spring Boot
Basic understanding of Pub-Sub Model and Apache Kafka

Start by running your Kafka and ZooKeeper instances

1. Create a docker-compose.yml file
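A minimal docker-compose.yml could look like the sketch below. I am assuming the wurstmeister images here and pre-creating the two topics used later; the exact file in the repository may differ.

version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka:2.13-2.7.0   # Kafka 2.7 image; adjust the tag if needed
    ports:
      - "9092:9092"
    environment:
      KAFKA_ADVERTISED_HOST_NAME: localhost
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # optional: pre-create the topics used later (name:partitions:replicas)
      KAFKA_CREATE_TOPICS: "payment:1:1,cart:1:1"
    depends_on:
      - zookeeper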

2. Run the above file using the docker-compose command

docker-compose up -d
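
You can quickly check that both containers are up before moving on:

docker-compose ps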

3. Test your Kafka connection using Offset Explorer

Download it from https://www.kafkatool.com/download.html.

Add the bootstrap server (localhost:9092 with the setup above) in the connection settings and give the cluster a name.

Once your instances are up and running, we can start creating the Spring Boot application for publishing and consuming message events.

Let’s create the application

Create a new Maven project in the IDE of your choice. After creating the project, follow these steps:

1. Add the required dependencies in pom.xml
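As a rough sketch, the dependencies section could look like this (the springdoc version below is only an example, and most versions are managed by the Spring Boot parent):

<dependencies>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springdoc</groupId>
        <artifactId>springdoc-openapi-ui</artifactId>
        <version>1.5.12</version> <!-- example version -->
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>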

We have added spring-kafka for the Kafka support, spring-boot-starter-web for the Spring packages supporting web application development, springdoc-openapi-ui for API documentation, and spring-boot-starter-test in case we want to write and run tests.

2. Annotate the main class with @EnableKafka
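A minimal main class could look like the sketch below (the class name is a placeholder, not necessarily the one used in the repository):

package com.codeWithUtkarsh;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.EnableKafka;

@EnableKafka            // enables detection of @KafkaListener methods
@SpringBootApplication
public class KafkaSampleApplication {   // placeholder class name

    public static void main(String[] args) {
        SpringApplication.run(KafkaSampleApplication.class, args);
    }
}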

3. Add properties in application.properties
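The properties could look roughly like this; localhost:9092 assumes the Docker setup above, and the group id and String (de)serializers are illustrative choices:

spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=group_id
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer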

4. Write Producer/Sender that will produce message events
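A sketch of such a sender, assuming plain String messages and the two topic names payment and cart (the class in the repository may be structured differently):

package com.codeWithUtkarsh.service;   // package name is an assumption

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class MessageSender {

    private static final String PAYMENT_TOPIC = "payment";
    private static final String CART_TOPIC = "cart";

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    // publish the same message to both topics
    public void send(String message) {
        kafkaTemplate.send(PAYMENT_TOPIC, message);
        kafkaTemplate.send(CART_TOPIC, message);
    }
}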

In the above code, I am using the KafkaTemplate object from the Spring Kafka library to send the message. Notice that I have sent the same message to two different topics (payment and cart).

5. Write Consumer/Receiver that will consume/listen to message events
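A sketch of the listeners, which hand the messages off to the two repositories described below (class names and the group id are assumptions):

package com.codeWithUtkarsh.service;   // package name is an assumption

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

import com.codeWithUtkarsh.repo.CartMessageRepository;
import com.codeWithUtkarsh.repo.PaymentMessageRepository;

@Service
public class MessageReceiver {

    @Autowired
    private PaymentMessageRepository paymentMessageRepository;

    @Autowired
    private CartMessageRepository cartMessageRepository;

    // listens on the 'payment' topic and stores the raw message
    @KafkaListener(topics = "payment", groupId = "group_id")
    public void consumePayment(String message) {
        paymentMessageRepository.save(message);
    }

    // listens on the 'cart' topic and stores only the message size
    @KafkaListener(topics = "cart", groupId = "group_id")
    public void consumeCart(String message) {
        cartMessageRepository.save(message);
    }
}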

Notice that to listen to a topic you can use @KafkaListener. The attributes it takes are the topic name and the groupId. There are other attributes, such as topicPattern, that you can leverage as well.

I created two consumers for the two topics and stored their messages in two different lists.

com.codeWithUtkarsh.repo.PaymentMessageRepository
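A possible sketch of this repository, using a simple in-memory list (the actual class may differ):

package com.codeWithUtkarsh.repo;

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import org.springframework.stereotype.Repository;

@Repository
public class PaymentMessageRepository {

    // messages are stored exactly as they were received
    private final List<String> messages = Collections.synchronizedList(new ArrayList<>());

    public void save(String message) {
        messages.add(message);
    }

    public List<String> getAll() {
        return messages;
    }
}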

As you can see in PaymentMessageRepository, it stores the message as it is in a list.

com.codeWithUtkarsh.repo.CartMessageRepository
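And a possible sketch of the cart repository, which keeps only the message size (again, the actual class may differ):

package com.codeWithUtkarsh.repo;

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import org.springframework.stereotype.Repository;

@Repository
public class CartMessageRepository {

    // only the length of each message is stored, to mimic some processing
    private final List<Integer> messageSizes = Collections.synchronizedList(new ArrayList<>());

    public void save(String message) {
        messageSizes.add(message.length());
    }

    public List<Integer> getAll() {
        return messageSizes;
    }
}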

In CartMessageRepository, we are saving the size of the message to mimic some processing in the real world.

6. Let’s write the Controller now for the demo

com.codeWithUtkarsh.rest.ApacheKafkaWebController
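A sketch of the controller; the request paths and HTTP methods below are illustrative and may differ from the repository:

package com.codeWithUtkarsh.rest;

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import com.codeWithUtkarsh.repo.CartMessageRepository;
import com.codeWithUtkarsh.repo.PaymentMessageRepository;
import com.codeWithUtkarsh.service.MessageSender;

@RestController
public class ApacheKafkaWebController {

    @Autowired
    private MessageSender messageSender;

    @Autowired
    private PaymentMessageRepository paymentMessageRepository;

    @Autowired
    private CartMessageRepository cartMessageRepository;

    // 1. publishes the message to both the 'payment' and 'cart' topics
    @PostMapping("/kafka/produce")
    public String produce(@RequestParam("message") String message) {
        messageSender.send(message);
        return "Message sent to topics 'payment' and 'cart'";
    }

    // 2. returns everything consumed from the 'payment' topic
    @GetMapping("/kafka/payment-messages")
    public List<String> getPaymentMessages() {
        return paymentMessageRepository.getAll();
    }

    // 3. returns the message sizes recorded from the 'cart' topic
    @GetMapping("/kafka/cart-messages")
    public List<Integer> getCartMessages() {
        return cartMessageRepository.getAll();
    }
}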

In the above Controller, I have written three REST APIs. The first one produces a message, which is sent to the two topics we discussed (payment and cart). The second API returns all the messages consumed from the ‘payment’ topic, and the third API returns all the messages consumed from the ‘cart’ topic.

Run your application

Once the application is up, open the Swagger UI at http://localhost:8080/swagger-ui.html

Try these APIs to see the final result. You can download the complete code from https://github.com/codeWithUtkarsh/kafka-sample-with-docker.
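
For example, with the illustrative paths from the controller sketch above, a quick check from the command line could look like this:

curl -X POST "http://localhost:8080/kafka/produce?message=hello"
curl http://localhost:8080/kafka/payment-messages
curl http://localhost:8080/kafka/cart-messages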

Write to me if you are facing any issues at utkarshkviim@gmail.com or
https://www.linkedin.com/in/utkarsh-sharma-53362836/
