Kafka Integration with Liferay (Complete Enterprise Guide)

Modern portals must communicate with multiple backend systems in real time.

Instead of synchronous APIs, enterprises now use event-driven architecture.

That’s where Apache Kafka fits: it gives Liferay a scalable, asynchronous messaging backbone.

This guide explains how to connect Liferay with Kafka for scalable messaging.


📌 Why Integrate Kafka with Liferay?

Traditional portal integration:

User Action → REST Call → Wait Response → Slow UI

Event-Driven Integration:

User Action → Publish Event → Async Processing → Fast UI

Benefits:

  • Loose coupling

  • High scalability

  • Retry capability

  • Real-time updates
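
To make the "Publish Event → Async Processing → Fast UI" idea concrete, here is a minimal, self-contained Java sketch that stands in for Kafka with an in-memory queue (illustration only — no broker or Kafka API involved). The user action returns immediately while a background consumer does the work:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

public class AsyncFlowDemo {

    // In-memory stand-in for a Kafka topic (illustration only)
    public static final BlockingQueue<String> topic = new LinkedBlockingQueue<>();
    public static final List<String> processed =
        Collections.synchronizedList(new ArrayList<>());

    // "Publish Event" step: returns immediately, so the UI stays fast
    public static void publish(String event) {
        topic.add(event);
    }

    public static void main(String[] args) throws Exception {
        // Background consumer plays the "Async Processing" role
        ExecutorService consumer = Executors.newSingleThreadExecutor();
        consumer.submit(() -> {
            try {
                while (true) {
                    processed.add("handled:" + topic.take());
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        publish("user-registered:42"); // user action returns immediately
        Thread.sleep(200);             // give the consumer time to run
        System.out.println(processed);
        consumer.shutdownNow();
    }
}
```

The same decoupling is what Kafka provides, but durably and across processes.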


🖼️ Architecture Overview

(Diagram: the Liferay portal publishes events to Kafka topics; downstream microservices consume them independently.)


🧠 Integration Use Cases

Common enterprise scenarios:

  • User registration triggers CRM update

  • Document upload triggers processing

  • Workflow approval triggers billing

  • Notification service events

  • Audit logging
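
All of these scenarios benefit from a consistent event shape. As a sketch (the field names here are illustrative assumptions, not a Liferay or Kafka standard), a tiny helper can build a minimal JSON envelope for any of the events above:

```java
public final class EventEnvelope {

    private EventEnvelope() {
    }

    // Builds a minimal JSON event; field names are illustrative only.
    // Real projects would use a JSON library and a versioned schema.
    public static String toJson(String eventType, long entityId, long timestampMillis) {
        return "{\"eventType\":\"" + eventType + "\","
             + "\"entityId\":" + entityId + ","
             + "\"timestamp\":" + timestampMillis + "}";
    }
}
```

Example: EventEnvelope.toJson("USER_REGISTERED", 42L, 1700000000000L) produces a small payload suitable for publishing.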


🛠 Step 1: Add Kafka Dependency (OSGi Module)

Create a Liferay OSGi module.

Add the Kafka client Gradle dependency:

implementation 'org.apache.kafka:kafka-clients:3.6.0'
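
Note: kafka-clients is not always resolvable as an OSGi bundle at deploy time. If the module is built with the Liferay Workspace Gradle plugins, one option (an assumption about your build setup, not a requirement) is to embed the jar inside the bundle with compileInclude instead:

```groovy
dependencies {
    // compileInclude embeds kafka-clients inside the OSGi bundle,
    // avoiding unresolved-package errors when the module is deployed
    compileInclude group: "org.apache.kafka", name: "kafka-clients", version: "3.6.0"
}
```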

🛠 Step 2: Kafka Producer in Liferay

Example: publish a user registration event.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaProducerUtil {

    private static final String TOPIC = "user-events";

    public static void publish(String message) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Note: creating a producer per call keeps the example simple;
        // reuse a single producer instance in production.
        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>(TOPIC, message));
        }
    }
}

🛠 Step 3: Trigger Event from Liferay Model Listener

Example: After user creation.

import com.liferay.portal.kernel.exception.ModelListenerException;
import com.liferay.portal.kernel.model.BaseModelListener;
import com.liferay.portal.kernel.model.ModelListener;
import com.liferay.portal.kernel.model.User;

import org.osgi.service.component.annotations.Component;

@Component(immediate = true, service = ModelListener.class)
public class UserCreateListener extends BaseModelListener<User> {

    @Override
    public void onAfterCreate(User user) throws ModelListenerException {
        String payload = "{ \"userId\": " + user.getUserId() + " }";
        KafkaProducerUtil.publish(payload);
    }
}

Now every new user registration publishes an event to the user-events topic.


🖼️ Event Flow

(Diagram: user action → model listener → Kafka producer → topic → consumer microservices.)


🛠 Step 4: Kafka Consumer (Microservice)

Spring Boot consumer example:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class UserEventConsumer {

    @KafkaListener(topics = "user-events", groupId = "crm-service")
    public void consume(String message) {
        System.out.println("Received: " + message);
    }
}

Backend systems now process events asynchronously.


🔁 Retry & Fault Tolerance

Kafka's consumer model provides:

  • Consumer offset tracking

  • Message replay from stored offsets

  • Retry by re-consuming unprocessed messages

The portal never blocks on downstream processing.
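
On top of Kafka's redelivery of uncommitted messages, a consumer often retries a failing handler a few times before giving up. A minimal sketch in plain Java (this helper is illustrative, not a Kafka or Spring API):

```java
import java.util.function.Supplier;

public class RetryHelper {

    // Retries a task up to maxAttempts with a fixed backoff between attempts.
    // If every attempt fails, the last exception is rethrown.
    public static <T> T withRetry(Supplier<T> task, int maxAttempts, long backoffMillis)
            throws InterruptedException {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return task.get();
            } catch (RuntimeException e) {
                last = e;
                Thread.sleep(backoffMillis);
            }
        }
        throw last;
    }
}
```

In practice, exhausted retries would route the message to a dead-letter topic rather than being dropped.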


🔐 Security Considerations

In production, use:

  • SASL authentication

  • SSL encryption

  • Topic authorization

Never expose the broker publicly.
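
As a sketch, these are typical Kafka client settings for SASL over TLS (broker address, credentials, and file paths below are placeholders — load real secrets from a vault, never hard-code them):

```java
import java.util.Properties;

public class SecureKafkaConfig {

    // Typical client-side settings for SASL_SSL; all values are placeholders
    public static Properties secureProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker.internal:9093");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-512");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.scram.ScramLoginModule required "
                + "username=\"portal\" password=\"<secret>\";");
        props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks");
        props.put("ssl.truststore.password", "<secret>");
        return props;
    }
}
```

These properties would be merged into the producer configuration from Step 2.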


🏆 Best Practices

✔ Send lightweight events
✔ Avoid large payloads
✔ Use JSON schema
✔ Separate topics per domain
✔ Handle idempotency in consumers
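
Because Kafka delivers at-least-once by default, the last point matters: a consumer may see the same event twice. A minimal in-memory sketch of idempotent handling (real systems would track processed IDs in a database or cache, not a JVM-local set):

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class IdempotentHandler {

    // Tracks processed event IDs; JVM-local, for illustration only
    private final Set<String> processedIds = ConcurrentHashMap.newKeySet();

    // Runs the action only the first time an event ID is seen.
    // Returns true if handled, false if it was a duplicate and skipped.
    public boolean handleOnce(String eventId, Runnable action) {
        if (!processedIds.add(eventId)) {
            return false; // already seen: skip the redelivered duplicate
        }
        action.run();
        return true;
    }
}
```

This assumes each event carries a stable unique ID, which is one more reason to use a consistent JSON schema.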


⚠️ Common Mistakes

❌ Calling backend API directly from portal
❌ Large payload events
❌ No retry handling
❌ Blocking UI thread


🎯 Real Enterprise Flow

User registers → Liferay publishes event
CRM consumes → creates customer
Email service consumes → sends welcome mail
Analytics consumes → logs activity

Fully decoupled.


🎯 Conclusion

Kafka + Liferay enables:

  • Real-time integration

  • Scalable portals

  • Reliable messaging

  • Microservice architecture

This event-driven approach is widely recommended for modern enterprise portals.



💼 Professional Support Available

If you are facing issues in real projects related to enterprise backend development or workflow automation, I provide paid consulting, production debugging, project support, and focused trainings.

Technologies covered include Java, Spring Boot, PL/SQL, Azure, CMS and workflow automation (jBPM, Camunda BPM, RHPAM).

📧 Contact: ishikhanirankari@gmail.com | info@realtechnologiesindia.com


