Integrate Kafka with Java
💡 Introduction
In modern enterprise applications, real-time communication between distributed systems is crucial.
Apache Kafka, a powerful distributed event streaming platform, enables high-throughput, fault-tolerant, event-driven architectures.
In this blog, we’ll see how to integrate Kafka with both Spring Boot and classic Spring Framework, using practical examples for Producer and Consumer applications.
⚙️ 1️⃣ What Is Apache Kafka?
Apache Kafka is a distributed event streaming platform designed for:
- Publishing and subscribing to data streams
- Processing real-time data
- Connecting systems using reliable message queues

Kafka is often used for:

- Microservice communication
- Log aggregation
- Real-time analytics
- BPM and workflow triggers (like jBPM event integration)
🧩 2️⃣ Core Concepts
| Component | Description |
|---|---|
| Producer | Sends (publishes) messages to Kafka topics |
| Consumer | Reads messages from Kafka topics |
| Topic | A category or feed name to which messages are published |
| Broker | Kafka server that stores and serves messages |
| Zookeeper | Coordinates Kafka brokers (not required in newer Kafka versions that use KRaft mode) |
🏗️ 3️⃣ Kafka Setup (Local or Docker)
🐳 Using Docker
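For local development, a single-node broker is enough. A minimal sketch using the official apache/kafka image (the container name and port mapping are assumptions; adjust for your setup):

```bash
# Run a single-node Kafka broker in KRaft mode (no Zookeeper needed)
docker run -d --name kafka -p 9092:9092 apache/kafka:latest
```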
🧾 Create a Topic
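Once the broker is up, create a topic for the examples below. The topic name demo-topic is just an example, and the script path assumes the apache/kafka image; on a local install, run kafka-topics.sh from Kafka's bin directory instead:

```bash
# Create the example topic with 3 partitions on the local broker
docker exec -it kafka /opt/kafka/bin/kafka-topics.sh \
  --create --topic demo-topic \
  --bootstrap-server localhost:9092 \
  --partitions 3 --replication-factor 1
```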
🧠 4️⃣ Kafka Integration with Spring Boot
🧱 Step 1: Add Dependencies
Add these to your pom.xml:
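A typical dependency set for this post looks like the following (versions are managed by the Spring Boot parent; spring-boot-starter-web is only needed for the REST endpoint in Step 5):

```xml
<!-- Spring for Apache Kafka: provides KafkaTemplate and @KafkaListener -->
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>

<!-- Spring Web: used for the REST controller that publishes messages -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
```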
📦 Step 2: Application Configuration
🧩 application.yml
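A minimal configuration sketch; the broker address, group id, and String (de)serializers match the examples in this post and should be adjusted to your environment:

```yaml
spring:
  kafka:
    bootstrap-servers: localhost:9092   # local broker from the Docker setup above
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
    consumer:
      group-id: demo-group              # example consumer group
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
```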
💬 Step 3: Create a Producer Service
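A minimal producer sketch; the class and topic names (KafkaProducerService, demo-topic) are illustrative:

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducerService {

    private static final String TOPIC = "demo-topic"; // example topic name

    private final KafkaTemplate<String, String> kafkaTemplate;

    // Spring Boot auto-configures the KafkaTemplate from application.yml
    public KafkaProducerService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(String message) {
        // Publish the message asynchronously to the topic
        kafkaTemplate.send(TOPIC, message);
    }
}
```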
🎧 Step 4: Create a Consumer Listener
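A matching consumer sketch; the topic and group id mirror the configuration above:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaConsumerService {

    // Invoked for every record published to demo-topic
    @KafkaListener(topics = "demo-topic", groupId = "demo-group")
    public void listen(String message) {
        System.out.println("Received message: " + message);
    }
}
```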
🚀 Step 5: REST Controller to Publish Message
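A small controller sketch that calls the producer service defined above; the endpoint path and parameter name match the test URL used in the next step:

```java
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/kafka")
public class KafkaController {

    private final KafkaProducerService producerService;

    public KafkaController(KafkaProducerService producerService) {
        this.producerService = producerService;
    }

    // POST /api/kafka/publish?message=HelloKafka
    @PostMapping("/publish")
    public String publish(@RequestParam String message) {
        producerService.sendMessage(message);
        return "Message published: " + message;
    }
}
```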
✅ Test:
Run the app, then send:
POST http://localhost:8080/api/kafka/publish?message=HelloKafka
Observe the logs in both Producer and Consumer consoles.
⚙️ 5️⃣ Integration with Classic Spring (Non-Boot)
If you’re using Spring Framework without Boot, define beans manually in your XML or Java configuration.
🧾 kafka-config.xml
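A minimal XML sketch that wires the producer side by hand; the broker address and serializers are the same assumptions used in the Spring Boot example:

```xml
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <!-- Producer factory with the settings Spring Boot would otherwise auto-configure -->
    <bean id="producerFactory"
          class="org.springframework.kafka.core.DefaultKafkaProducerFactory">
        <constructor-arg>
            <map>
                <entry key="bootstrap.servers" value="localhost:9092"/>
                <entry key="key.serializer"
                       value="org.apache.kafka.common.serialization.StringSerializer"/>
                <entry key="value.serializer"
                       value="org.apache.kafka.common.serialization.StringSerializer"/>
            </map>
        </constructor-arg>
    </bean>

    <!-- KafkaTemplate built on top of the producer factory -->
    <bean id="kafkaTemplate" class="org.springframework.kafka.core.KafkaTemplate">
        <constructor-arg ref="producerFactory"/>
    </bean>
</beans>
```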
Then inject and use KafkaTemplate the same way as in Spring Boot:
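For example (the class name LegacyKafkaSender is illustrative):

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class LegacyKafkaSender {

    // Injected from the kafkaTemplate bean declared in kafka-config.xml
    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void send(String message) {
        // Same API as in the Spring Boot example; only the bean wiring differs
        kafkaTemplate.send("demo-topic", message);
    }
}
```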
🔄 6️⃣ Two-Way Communication (Request-Reply)
Kafka supports request-reply using correlation IDs. Example:
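One way to do this with Spring Kafka is ReplyingKafkaTemplate, which sets a correlation ID header on each request and matches the reply back to the waiting caller. The sketch below is a minimal configuration; the topic names request-topic and reply-topic and the group id are assumptions:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;
import org.springframework.kafka.requestreply.ReplyingKafkaTemplate;

@Configuration
public class RequestReplyConfig {

    // The reply container listens on reply-topic; the template correlates each
    // incoming reply with its original request via the correlation ID header.
    @Bean
    public ReplyingKafkaTemplate<String, String, String> replyingKafkaTemplate(
            ProducerFactory<String, String> pf,
            ConcurrentKafkaListenerContainerFactory<String, String> factory) {
        ConcurrentMessageListenerContainer<String, String> replyContainer =
                factory.createContainer("reply-topic");
        replyContainer.getContainerProperties().setGroupId("reply-group");
        return new ReplyingKafkaTemplate<>(pf, replyContainer);
    }
}
```

On the calling side, replyingKafkaTemplate.sendAndReceive(new ProducerRecord<>("request-topic", message)) returns a RequestReplyFuture; calling get() on it blocks until the correlated reply arrives.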
Consumers can reply to another topic:
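On the consumer side, @SendTo routes the listener's return value to the reply topic, and Spring Kafka copies the correlation ID from the request so the sender can match it. Note that the listener container factory needs a reply template configured (factory.setReplyTemplate(kafkaTemplate)); the names below are the same assumptions as above:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.handler.annotation.SendTo;
import org.springframework.stereotype.Service;

@Service
public class RequestReplyConsumer {

    // The return value is published to reply-topic with the request's correlation ID
    @KafkaListener(topics = "request-topic", groupId = "demo-group")
    @SendTo("reply-topic")
    public String handle(String request) {
        return "Processed: " + request;
    }
}
```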
🧰 7️⃣ Key Tips for Production
| Area | Best Practice |
|---|---|
| ✅ Error Handling | Use DefaultErrorHandler (formerly SeekToCurrentErrorHandler) for retries |
| 🔁 Idempotency | Set enable.idempotence=true (add transactions for end-to-end exactly-once semantics) |
| 🔐 Security | Use SASL/SSL configs for cloud brokers |
| ⚙️ Monitoring | Integrate Prometheus / Micrometer with Spring Actuator |
| 📦 Batch Processing | Use @KafkaListener(batch = "true") for high throughput (see the sketch below) |
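As an example of the last row, a batch listener sketch (the batch attribute requires Spring Kafka 2.8+; topic and group names are the examples used earlier):

```java
import java.util.List;

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class BatchConsumerService {

    // Receives the whole poll result at once instead of one record per call,
    // which cuts per-message overhead on high-throughput topics
    @KafkaListener(topics = "demo-topic", groupId = "demo-batch-group", batch = "true")
    public void listen(List<String> messages) {
        messages.forEach(m -> System.out.println("Batch record: " + m));
    }
}
```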
📈 8️⃣ Architecture Diagram
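In text form, the flow covered in this post looks roughly like this (component and topic names match the examples above):

```
REST Client
    |  POST /api/kafka/publish
    v
Spring Boot Producer (KafkaTemplate)
    |  send("demo-topic", message)
    v
Kafka Broker (topic: demo-topic)
    |  poll
    v
Spring Boot Consumer (@KafkaListener)
```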
🧠 9️⃣ Common Errors
| Error | Reason | Fix |
|---|---|---|
| bootstrap broker disconnected | Kafka not running | Start the Kafka broker (port 9092) |
| TimeoutException | Wrong broker endpoint | Check the bootstrap-servers URL |
| SerializationException | Wrong serializer | Match key/value serializer types |
| Consumer rebalance loop | No group.id or topic mismatch | Add group-id in the consumer config |
👉 Watch “Integrate Apache Kafka with Java” in Action:
🎬 A step-by-step demo video coming soon on YouTube: Learn IT with Shikha
🏁 Conclusion
🎯 By integrating Apache Kafka with Spring Boot and Spring Framework, you can build event-driven microservices that are scalable, reliable, and fast.
This setup helps your apps:
- Stream real-time data
- Connect asynchronously
- Handle millions of events with minimal latency
💬 Kafka + Spring = Reactive, Resilient, and Real-time Architecture 🚀
💼 Professional Support Available
If you are facing issues in real projects related to enterprise backend development or workflow automation, I provide paid consulting, production debugging, project support, and focused trainings.
Technologies covered include Java, Spring Boot, PL/SQL, Azure, and workflow automation (jBPM, Camunda BPM, RHPAM).
📧 Contact: ishikhanirankari@gmail.com | info@realtechnologiesindia.com
🌐 Website: IT Trainings | Digital metal podium