Integrate Apache Kafka with jBPM

💡 Introduction
In modern enterprise systems, event-driven architecture is essential for building reactive, scalable, and loosely coupled workflows.
⚙️ jBPM and 📡 Apache Kafka together enable seamless communication between business processes and external systems in real time.

Kafka acts as the 🧩 message broker while jBPM executes 🧠 business process logic based on produced or consumed events.


🔍 Why Integrate jBPM with Kafka

| Feature | ⚙️ jBPM | 📡 Kafka |
|---|---|---|
| Workflow engine | Yes | No |
| Event streaming | No | Yes |
| Persistence & audit | Built-in | Optional |
| Real-time communication | Possible via JMS/REST | Native |
| Scalability / Decoupling | Moderate | Excellent |

Together they enable:

  • 🔄 Real-time process triggers

  • 📬 Event notifications between systems

  • 🧱 Decoupled producer-consumer setup

  • 🚀 Highly scalable automation


🏗️ Architecture Overview

Producer App → Kafka Topic → jBPM Consumer → Process Execution
Process Execution → jBPM Producer → Kafka Topic → Other Microservices
  • 🏭 Producer: sends process events (e.g. “Order Completed”)

  • 🎧 Consumer: listens for messages and starts new process instances


⚙️ Implementation Steps

1️⃣ Setup Kafka

docker run -d --name zookeeper -p 2181:2181 zookeeper

docker run -d --name kafka -p 9092:9092 \
  -e KAFKA_ZOOKEEPER_CONNECT=host.docker.internal:2181 \
  -e KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://localhost:9092 \
  wurstmeister/kafka

kafka-topics.sh --create --topic jbpm-events --bootstrap-server localhost:9092

2️⃣ Add Kafka Dependency

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>3.7.0</version>
</dependency>

3️⃣ Kafka Producer Class 💬

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class KafkaEventProducer {

    private final KafkaProducer<String, String> producer;
    private final String topic = "jbpm-events";

    public KafkaEventProducer() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
    }

    public void sendEvent(String key, String message) {
        producer.send(new ProducerRecord<>(topic, key, message));
        System.out.println("Sent to Kafka: " + message);
    }

    // Close the producer on application shutdown to flush pending records.
    public void close() {
        producer.close();
    }
}

You can call this class inside a Service Task or Task Event Listener in jBPM.
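The sendEvent method above takes a plain string, so in practice you will usually serialize process variables into a structured payload first. Here is a minimal sketch; the JSON shape and the field names orderId and status are illustrative assumptions, not part of the jBPM or Kafka API (a real project would typically use a JSON library such as Jackson):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class EventPayloads {

    // Build a small JSON string from process variables.
    // Field names (orderId, status) are hypothetical; use your own schema.
    public static String orderEvent(String orderId, String status) {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("orderId", orderId);
        fields.put("status", status);
        StringBuilder sb = new StringBuilder("{");
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (sb.length() > 1) {
                sb.append(",");
            }
            sb.append("\"").append(e.getKey()).append("\":\"")
              .append(e.getValue()).append("\"");
        }
        return sb.append("}").toString();
    }

    public static void main(String[] args) {
        // → {"orderId":"ORD-1001","status":"COMPLETED"}
        System.out.println(orderEvent("ORD-1001", "COMPLETED"));
    }
}
```

The resulting string can then be handed to the producer, e.g. kafkaEventProducer.sendEvent(orderId, payload).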

4️⃣ Kafka Consumer Class 🎧

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.kie.api.runtime.KieSession;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class KafkaEventConsumer {

    private final KieSession ksession;

    public KafkaEventConsumer(KieSession ksession) {
        this.ksession = ksession;
    }

    public void startConsumer() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "jbpm-group");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList("jbpm-events"));

        new Thread(() -> {
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("Received message: " + record.value());
                    // Example: start a process instance with the message as a variable
                    ksession.startProcess("com.sample.MyProcess",
                            Collections.singletonMap("inputMessage", record.value()));
                }
            }
        }).start();
    }
}


5️⃣ Use Kafka in BPMN Flow

  • Add a Service Task named “Send to Kafka”.

  • Bind its Java class to KafkaEventProducer.

  • Pass process variables (such as orderId and status) as parameters.

  • You can also attach a Signal Event in your BPMN that starts a process whenever a Kafka message arrives (via the consumer thread).
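Whether an incoming Kafka message should start a new process instance or signal a running one usually depends on the message itself. A minimal routing sketch follows; the key naming convention ("order-new", "order-update") and the decision rule are assumptions for illustration, and the actual ksession.startProcess / ksession.signalEvent calls would go where the comments indicate:

```java
public class KafkaMessageRouter {

    public enum Action { START_PROCESS, SIGNAL_EVENT, IGNORE }

    // Decide what to do with a Kafka record based on its key.
    // The key prefixes used here are hypothetical.
    public static Action route(String key) {
        if (key == null) {
            return Action.IGNORE;
        }
        if (key.startsWith("order-new")) {
            // here: ksession.startProcess("com.sample.MyProcess", vars)
            return Action.START_PROCESS;
        }
        if (key.startsWith("order-update")) {
            // here: ksession.signalEvent("order-updated", payload)
            return Action.SIGNAL_EVENT;
        }
        return Action.IGNORE;
    }
}
```

Calling this from the consumer loop keeps the poll thread free of BPMN-specific branching.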


🧪 End-to-End Example

🛒 When an order is placed:

  • A producer microservice publishes it to Kafka

  • jBPM consumes it → starts “Order Validation” workflow → sends back result

Result: Event-driven automation between microservices and BPM engine.


📈 Monitoring & Scaling

  • Use Kafka consumer groups for horizontal scaling of jBPM consumers.

  • Use OpenTelemetry or Prometheus to monitor Kafka and jBPM performance.

  • Configure offset commits carefully to avoid duplicate process starts.
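On the offset point: a common approach is to disable auto-commit and advance offsets only after startProcess has succeeded, so a crash between poll and process start does not silently drop or duplicate work. A sketch of the relevant consumer settings (the property keys are standard Kafka consumer configuration; the commit placement described in the comment is one possible policy, not the only one):

```java
import java.util.Properties;

public class ConsumerOffsetConfig {

    public static Properties manualCommitProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "jbpm-group");
        // Disable auto-commit so offsets are only advanced after the
        // process instance has actually been started.
        props.put("enable.auto.commit", "false");
        // On first start (or after the committed offset expires),
        // begin from the earliest available record.
        props.put("auto.offset.reset", "earliest");
        return props;
    }

    // In the poll loop: after ksession.startProcess(...) returns without
    // error, call consumer.commitSync() to mark the record as processed.
}
```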


🔐 Security & Reliability

  • 🔒 Use SSL/SASL for secure Kafka connections

  • ⚖️ Enable transactional producers for exactly-once semantics

  • 💾 Persist message offsets or process correlation IDs in a database for resilience.
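For the SSL/SASL bullet above, the client-side configuration looks roughly like this. The host name, truststore path, passwords, and the choice of SCRAM mechanism are all placeholders; match them to your broker's actual security setup:

```java
import java.util.Properties;

public class SecureKafkaConfig {

    public static Properties saslSslProps() {
        Properties props = new Properties();
        // Placeholder broker address; port 9093 is a common TLS listener choice.
        props.put("bootstrap.servers", "broker.example.com:9093");
        // Encrypt traffic and authenticate via SASL.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-512");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                        + "username=\"jbpm-client\" password=\"changeit\";");
        // Truststore containing the broker's CA certificate (placeholder path).
        props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks");
        props.put("ssl.truststore.password", "changeit");
        return props;
    }
}
```

The same Properties can be merged into both the producer and consumer configurations shown earlier.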


👉 Watch Integrate Apache Kafka with jBPM in Action

Here's a quick video to help you understand the integration better: Coming soon


🏁 Conclusion

By integrating Apache Kafka with jBPM, you transform static workflows into dynamic, event-driven processes.

This combination supports real-time decision making, scalable automation, and seamless microservice orchestration — a perfect fit for modern cloud architectures.


💼 Professional Support Available

If you are facing issues in real projects related to enterprise backend development or workflow automation, I provide paid consulting, production debugging, project support, and focused trainings.

Technologies covered include Java, Spring Boot, PL/SQL, Azure, and workflow automation (jBPM, Camunda BPM, RHPAM).

📧 Contact: ishikhanirankari@gmail.com | info@realtechnologiesindia.com

