Prerequisites
- JDK 17 or later
- Maven
- Docker and Docker Compose installed on your machine
- IDE (IntelliJ IDEA, Eclipse, etc.)
Step 1: Set Up a Spring Boot Project
Use Spring Initializr to create a new project with the following configuration:
- Project: Maven Project
- Language: Java
- Spring Boot: 3.2.x
- Dependencies: Spring Web, Spring for Apache Kafka
Download and unzip the project, then open it in your IDE.
Example Spring Boot Application
Create a simple Spring Boot application that interacts with Kafka.
1.1 Application Class
package com.example.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
1.2 Kafka Producer Configuration
Create a configuration class for the Kafka producer in the com.example.demo.config package.
package com.example.demo.config;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

import java.util.HashMap;
import java.util.Map;

@Configuration
public class KafkaProducerConfig {

    // Resolved from application.properties locally, or from the
    // SPRING_KAFKA_BOOTSTRAP_SERVERS environment variable inside Docker
    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // We send plain String messages, so StringSerializer matches the
        // ProducerFactory<String, String> type (JsonSerializer is only needed
        // for object payloads)
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
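To see what a serializer actually does, here is a minimal JDK-only sketch of the idea behind Kafka's StringSerializer. The SimpleSerializer interface below is a simplified stand-in for illustration only; the real contract is org.apache.kafka.common.serialization.Serializer<T>.

```java
import java.nio.charset.StandardCharsets;

// Simplified stand-in for Kafka's serializer contract (illustration only;
// the real interface is org.apache.kafka.common.serialization.Serializer<T>)
interface SimpleSerializer<T> {
    byte[] serialize(String topic, T data);
}

// Mirrors what StringSerializer does: encode the value as UTF-8 bytes
class SimpleStringSerializer implements SimpleSerializer<String> {
    @Override
    public byte[] serialize(String topic, String data) {
        return data == null ? null : data.getBytes(StandardCharsets.UTF_8);
    }
}

public class SerializerDemo {
    public static void main(String[] args) {
        SimpleSerializer<String> serializer = new SimpleStringSerializer();
        byte[] bytes = serializer.serialize("test_topic", "HelloKafka");
        System.out.println(bytes.length);                              // 10
        System.out.println(new String(bytes, StandardCharsets.UTF_8)); // HelloKafka
    }
}
```

The broker itself only ever stores and forwards these raw bytes; the producer's serializer and the consumer's deserializer must agree on the format.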
1.3 Kafka Consumer Configuration
Create a configuration class for the Kafka consumer in the com.example.demo.config package.
package com.example.demo.config;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;

import java.util.HashMap;
import java.util.Map;

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        configProps.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");
        // ErrorHandlingDeserializer wraps the real deserializers so a bad
        // record fails gracefully instead of crashing the listener container
        configProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
        configProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
        // Delegate to StringDeserializer, matching the plain String messages
        // the producer sends
        configProps.put(ErrorHandlingDeserializer.KEY_DESERIALIZER_CLASS, StringDeserializer.class.getName());
        configProps.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, StringDeserializer.class.getName());
        return new DefaultKafkaConsumerFactory<>(configProps);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
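The idea behind ErrorHandlingDeserializer is worth seeing in isolation: it delegates to a real deserializer and converts a deserialization failure into a recoverable result instead of an exception, so one poison record cannot put the consumer into an endless crash loop. The sketch below is a JDK-only simplification (the real class also records the failure in a record header; the errorHandling helper and the Function-based "deserializer" are illustrative stand-ins, not Spring Kafka APIs).

```java
import java.nio.charset.StandardCharsets;
import java.util.function.Function;

// Simplified illustration of the idea behind Spring Kafka's
// ErrorHandlingDeserializer: wrap a delegate and turn deserialization
// failures into a null result instead of a thrown exception.
public class ErrorHandlingDemo {

    static <T> Function<byte[], T> errorHandling(Function<byte[], T> delegate) {
        return bytes -> {
            try {
                return delegate.apply(bytes);
            } catch (Exception ex) {
                return null; // the real class reports the failure via headers
            }
        };
    }

    public static void main(String[] args) {
        // A delegate that only understands integer payloads
        Function<byte[], Integer> intDeserializer =
                bytes -> Integer.parseInt(new String(bytes, StandardCharsets.UTF_8));
        Function<byte[], Integer> safe = errorHandling(intDeserializer);

        System.out.println(safe.apply("42".getBytes(StandardCharsets.UTF_8)));   // 42
        System.out.println(safe.apply("oops".getBytes(StandardCharsets.UTF_8))); // null
    }
}
```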
1.4 Kafka Producer Service
Create a service class for the Kafka producer in the com.example.demo.service package.
package com.example.demo.service;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducerService {

    private static final String TOPIC = "test_topic";

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendMessage(String message) {
        kafkaTemplate.send(TOPIC, message);
    }
}
1.5 Kafka Consumer Service
Create a service class for the Kafka consumer in the com.example.demo.service package.
package com.example.demo.service;

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaConsumerService {

    @KafkaListener(topics = "test_topic", groupId = "group_id")
    public void consume(String message) {
        System.out.println("Consumed message: " + message);
    }
}
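The groupId matters: all consumers that share a group split the topic's partitions among themselves, so each message is delivered to exactly one member of the group. The JDK-only sketch below illustrates the idea with a simple round-robin split; real Kafka uses pluggable assignors (range, round-robin, cooperative-sticky), so the exact mapping may differ.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Simplified sketch of how one consumer group splits a topic's partitions:
// each partition is owned by exactly one group member at a time.
public class GroupAssignmentDemo {

    static Map<String, List<Integer>> assign(List<String> members, int partitions) {
        Map<String, List<Integer>> assignment = new LinkedHashMap<>();
        members.forEach(m -> assignment.put(m, new ArrayList<>()));
        for (int p = 0; p < partitions; p++) {
            // Round-robin for illustration; Kafka's default assignors differ
            assignment.get(members.get(p % members.size())).add(p);
        }
        return assignment;
    }

    public static void main(String[] args) {
        // Two instances of KafkaConsumerService in group "group_id",
        // reading a hypothetical 6-partition topic
        System.out.println(assign(List.of("consumer-1", "consumer-2"), 6));
        // → {consumer-1=[0, 2, 4], consumer-2=[1, 3, 5]}
    }
}
```

This is why scaling the service to more instances than the topic has partitions leaves the extra instances idle.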
1.6 REST Controller
Create a MessageController class in the com.example.demo.controller package to send messages to Kafka.
package com.example.demo.controller;

import com.example.demo.service.KafkaProducerService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MessageController {

    @Autowired
    private KafkaProducerService kafkaProducerService;

    @PostMapping("/send")
    public String sendMessage(@RequestParam("message") String message) {
        kafkaProducerService.sendMessage(message);
        return "Message sent to Kafka successfully";
    }
}
1.7 application.properties Configuration
Configure your application to use Kafka. In the src/main/resources directory, create or update the application.properties file.
# src/main/resources/application.properties
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=group_id
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
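Note that inside Docker we will override spring.kafka.bootstrap-servers with the SPRING_KAFKA_BOOTSTRAP_SERVERS environment variable. That works because of Spring Boot's relaxed binding: a property name maps to an environment variable by replacing dots and dashes with underscores and uppercasing. A small JDK-only sketch of that naming rule (illustrative; real relaxed binding handles more cases, such as indexed properties):

```java
import java.util.Locale;

// Illustrates Spring Boot's relaxed-binding rule for environment variables:
// replace '.' and '-' with '_' and uppercase the result. This is why
// SPRING_KAFKA_BOOTSTRAP_SERVERS in docker-compose.yml overrides
// spring.kafka.bootstrap-servers from application.properties.
public class RelaxedBindingDemo {

    static String toEnvVar(String propertyName) {
        return propertyName
                .replace('.', '_')
                .replace('-', '_')
                .toUpperCase(Locale.ROOT);
    }

    public static void main(String[] args) {
        System.out.println(toEnvVar("spring.kafka.bootstrap-servers"));
        // → SPRING_KAFKA_BOOTSTRAP_SERVERS
    }
}
```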
Step 2: Create Docker Compose Configuration
Docker Compose allows you to define and run multi-container Docker applications. You will create a docker-compose.yml file to define the services for Kafka, Zookeeper, and your Spring Boot application.
2.1 Create docker-compose.yml
Create a docker-compose.yml file in the root directory of your project.
version: '3.8'

services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.0.1
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - "2181:2181"

  kafka:
    image: confluentinc/cp-kafka:7.0.1
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Two listeners: "kafka:29092" for containers on the Compose network,
      # "localhost:9092" for clients running on the host machine.
      # Advertising only "localhost:9092" would break the app container's
      # connection, because "localhost" inside a container is the container itself.
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:29092,PLAINTEXT_HOST://0.0.0.0:9092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  app:
    image: demo-app
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "8080:8080"
    environment:
      SPRING_KAFKA_BOOTSTRAP_SERVERS: kafka:29092
    depends_on:
      - kafka
Explanation:
- zookeeper: Defines the Zookeeper service required by Kafka. ZOOKEEPER_CLIENT_PORT is the port Kafka uses to connect to it.
- kafka: Defines the Kafka broker. depends_on ensures the Zookeeper service is started before Kafka, and environment configures the broker, including the listeners it advertises to containers and to host clients.
- app: Defines the Spring Boot application service. depends_on ensures the Kafka service is started before the application, and environment points the application at the broker over the Compose network.
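One caveat about depends_on: it only controls container start order, not readiness. The Kafka container can take several seconds after starting before it accepts connections. The Kafka client retries on its own, but if you ever need an explicit readiness check, a crude approach is to poll the broker's TCP port. The helper below is a JDK-only sketch of that idea (in practice a Compose healthcheck is the cleaner solution):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Polls a TCP port until something is listening or the timeout expires.
// Illustrates why depends_on alone is not a readiness guarantee.
public class WaitForPort {

    static boolean waitForPort(String host, int port, long timeoutMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(host, port), 500);
                return true; // something is accepting connections
            } catch (IOException notReadyYet) {
                try {
                    Thread.sleep(250); // back off briefly before retrying
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return false;
                }
            }
        }
        return false; // nothing listening within the timeout
    }

    public static void main(String[] args) {
        System.out.println(waitForPort("localhost", 9092, 3_000));
    }
}
```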
2.2 Create a Dockerfile
Create a Dockerfile in the root directory of your project.
# Use the official OpenJDK base image
FROM openjdk:17-jdk-alpine

# Set the working directory inside the container
WORKDIR /app

# Copy the built jar file into the container
COPY target/demo-0.0.1-SNAPSHOT.jar app.jar

# Expose port 8080
EXPOSE 8080

# Run the application
ENTRYPOINT ["java", "-jar", "app.jar"]
Step 3: Build and Run the Application
3.1 Build the Jar File
Run the following command to build the jar file of your Spring Boot application:
./mvnw clean package
3.2 Build and Run Docker Compose
Run the following command to build and start the Docker Compose services:
docker-compose up --build
3.3 Verify the Application
Open a web browser or a tool like Postman and test the endpoint:
- Send a message to Kafka:
  - URL: http://localhost:8080/send?message=HelloKafka
  - Method: POST
Check the console output to see the consumed message:
Consumed message: HelloKafka
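If you prefer code to Postman, the same call can be made with the JDK's built-in HttpClient. This sketch assumes the application is running on localhost:8080 as configured above; the SendMessageClient class and its send helper are names invented for this example.

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Issues the same POST /send?message=... request that Postman would,
// using the HttpClient shipped with JDK 11+.
public class SendMessageClient {

    static String send(String baseUrl, String message) throws Exception {
        String url = baseUrl + "/send?message="
                + URLEncoder.encode(message, StandardCharsets.UTF_8);
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .POST(HttpRequest.BodyPublishers.noBody())
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        return response.body(); // the controller returns a confirmation string
    }

    public static void main(String[] args) throws Exception {
        System.out.println(send("http://localhost:8080", "HelloKafka"));
    }
}
```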
Conclusion
In this tutorial, you have learned how to set up and run a Spring Boot application with Apache Kafka using Docker Compose. We covered:
- Setting up a Spring Boot project with Kafka.
- Creating Kafka producer and consumer configurations.
- Creating services to produce and consume Kafka messages.
- Creating a Dockerfile for the Spring Boot application.
- Creating a docker-compose.yml file to define the services.
- Building and running the application using Docker Compose.
By following these steps, you can easily manage and deploy your Spring Boot application and its dependencies using Docker Compose, enabling seamless interaction with Apache Kafka.