How to Connect a BE Service to Local Kafka
Background
In the Navida Pro backend architecture, services communicate using:
- HTTP (synchronous)
- Kafka (asynchronous)
Kafka is primarily used for non-blocking asynchronous communication and persistent event processing in higher environments. Initially, each microservice managed its own Kafka utility code, which led to duplication and inconsistency.
To address this, all reusable Kafka components were centralized and migrated to the Open Contract library.
What Kafka Utilities Are in Open Contract?
Open Contract now provides standardized Kafka support for all services, including:
- Common Kafka producer and consumer configurations
- Shared serializers and deserializers
- Utility methods for message publishing
- Property-driven behavior switches for different environments
This ensures consistency and reusability, and simplifies Kafka onboarding across services.
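For illustration, a shared publishing utility of this kind might look like the sketch below. The class and method names are hypothetical and do not show the actual Open Contract API; the sketch only illustrates the shape of a centralized producer helper built on Spring Kafka's KafkaTemplate.

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

// Hypothetical sketch of a centralized publishing utility;
// the real Open Contract classes and names may differ.
@Component
public class EventPublisher {

    private final KafkaTemplate<String, Object> kafkaTemplate;

    public EventPublisher(KafkaTemplate<String, Object> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Publishes an event to the given topic, keyed for partition affinity.
    public void publish(String topic, String key, Object payload) {
        kafkaTemplate.send(topic, key, payload);
    }
}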
How to Run Kafka Locally for Development
Step 1: Download and Extract Kafka
Download a Kafka release from the official Apache Kafka downloads page and extract it locally.
Step 2: Start Zookeeper and Kafka Server
# Start Zookeeper
bin/zookeeper-server-start.sh config/zookeeper.properties
# In a new terminal, start the Kafka server
bin/kafka-server-start.sh config/server.properties
Kafka is now up and running on your machine.
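To confirm the broker is reachable from a JVM process, a small check with Kafka's AdminClient can help. This is an optional sketch; it assumes the kafka-clients dependency is on the classpath and that the broker listens on the default localhost:9092.

import java.util.Properties;
import java.util.Set;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

// Optional sanity check: lists the topics on the local broker.
// Assumes the default listener on localhost:9092.
public class LocalKafkaCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (AdminClient admin = AdminClient.create(props)) {
            Set<String> topics = admin.listTopics().names().get();
            System.out.println("Connected. Topics: " + topics);
        }
    }
}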
Step 3: Configure Your Spring Boot Service
In your application.properties or application.yml, set the following properties:
spring.profiles.active=local
aok.data.process.method=kafka-implementation
Why spring.profiles.active=local?
This setting tells your application to use the local Kafka configuration. Higher environments require certificates and keys, which are not required or configured locally. A custom configuration bean in Open Contract ensures the correct Kafka setup is loaded when the local profile is active.
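As an illustration of what such a profile-specific bean can look like, the sketch below defines a plain-text producer factory that is only active under the local profile. The class name and property values are hypothetical; the actual bean lives in Open Contract and may differ.

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

// Hypothetical sketch: local-only Kafka configuration without SSL
// certificates or keys. The real bean is provided by Open Contract.
@Configuration
@Profile("local")
public class LocalKafkaConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // No SSL settings here; higher environments add certificates and keys.
        return new DefaultKafkaProducerFactory<>(config);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}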
Why aok.data.process.method=kafka-implementation?
We have a conditional check in the business logic to switch between:
- data-process-implementation → uses Feign client (sync)
- kafka-implementation → uses Kafka (async)
Set this property to kafka-implementation to ensure all write operations go through Kafka.
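One way to wire such a property-driven switch is with Spring's @ConditionalOnProperty, as in the minimal sketch below. The interface and class names are hypothetical and the actual conditional check in the service's business logic may be implemented differently.

import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical sketch: selects the write path based on aok.data.process.method.
@Configuration
public class DataProcessConfig {

    @Bean
    @ConditionalOnProperty(name = "aok.data.process.method", havingValue = "kafka-implementation")
    public DataProcessor kafkaDataProcessor() {
        return new KafkaDataProcessor();   // async writes via Kafka
    }

    @Bean
    @ConditionalOnProperty(name = "aok.data.process.method", havingValue = "data-process-implementation")
    public DataProcessor feignDataProcessor() {
        return new FeignDataProcessor();   // sync writes via Feign client
    }
}

// Hypothetical interface and implementations, included only so the sketch compiles.
interface DataProcessor { void write(String payload); }
class KafkaDataProcessor implements DataProcessor {
    public void write(String payload) { /* publish to Kafka (async) */ }
}
class FeignDataProcessor implements DataProcessor {
    public void write(String payload) { /* call downstream service via Feign (sync) */ }
}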
Summary for Local Kafka Setup
- Integrate Open Contract (see integration guide)
- Start local Kafka server (Zookeeper + Kafka broker)
- Ensure these properties are configured:
spring.profiles.active=local
aok.data.process.method=kafka-implementation
Once these steps are complete, your service will successfully connect to Kafka running on your local machine.
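To verify the end-to-end wiring, a quick smoke test is to send one message through the injected KafkaTemplate on startup, for example from a temporary CommandLineRunner. This sketch assumes spring-kafka is on the classpath and a KafkaTemplate bean is available (via Open Contract or Spring Boot auto-configuration); the topic name "local-test-topic" is illustrative.

import org.springframework.boot.CommandLineRunner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.kafka.core.KafkaTemplate;

// Optional smoke test for the local profile: sends one message on startup.
// Remove it once the local Kafka connection is confirmed.
@Configuration
@Profile("local")
public class KafkaSmokeTest {

    @Bean
    public CommandLineRunner sendTestMessage(KafkaTemplate<String, String> kafkaTemplate) {
        return args -> kafkaTemplate.send("local-test-topic", "hello from local setup");
    }
}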
Final Note
Centralizing Kafka logic into Open Contract has simplified Kafka integration across all backend services. It enables:
- Easier onboarding for developers
- Cleaner codebase
- Environment-specific flexibility
- Improved debugging and observability