If you’re building event-driven systems with Apache Kafka, you must think about data contracts early.
This post shows a practical, end-to-end Spring Boot example using:
- Apache Kafka
- Confluent Schema Registry
- Avro serialization
- PostgreSQL
- Docker Compose
👉 Full source code:
🔗 https://github.com/mathias82/kafka-schema-registry-spring-demo
🧠 Why Schema Registry + Avro?
JSON works… until it doesn’t.
Common problems in Kafka-based systems:
- breaking consumers when producers change payloads
- no schema versioning
- unclear data contracts between teams
Avro + Schema Registry solves this by:
- enforcing schema compatibility
- allowing safe schema evolution
- decoupling producers from consumers
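To make the contract idea concrete, here is a sketch of what the users.v1 value schema could look like, expressed with Avro's Java SchemaBuilder. The repo ships its schemas as Avro files; the field names below simply mirror the sample payload used later in this post, and the namespace is an assumption:

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

public class UserSchemaSketch {

    // Value schema for the users.v1 topic; in the demo this would normally live
    // in an .avsc file and be compiled into a Java class by the Avro Maven plugin.
    static final Schema USER_V1 = SchemaBuilder.record("User")
            .namespace("com.example.users")   // assumed namespace
            .fields()
            .requiredString("id")
            .requiredString("email")
            .requiredString("firstName")
            .requiredString("lastName")
            .requiredBoolean("isActive")
            .requiredInt("age")
            .endRecord();
}
```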
This demo shows how to do it the right way with Spring Boot.
🏗️ Architecture Overview
```text
Client (Postman)
      |
      v
Spring Boot Producer (REST)
      |
      v
Kafka Topic (users.v1)
      |
      v
Spring Boot Consumer
      |
      v
PostgreSQL
```
- Producer exposes POST /users
- Payload is converted to an Avro record
- Message is published to Kafka
- Consumer deserializes Avro and persists data to PostgreSQL
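A minimal sketch of the producer side of that flow (controller, DTO, and class names are illustrative, not the repo's actual code). The generated Avro class User comes from the schema, and KafkaTemplate publishes it to users.v1:

```java
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/users")
public class UserController {

    private final KafkaTemplate<String, User> kafkaTemplate; // User = Avro-generated class

    public UserController(KafkaTemplate<String, User> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping
    public ResponseEntity<Void> createUser(@RequestBody UserRequest request) {
        // Map the incoming JSON onto the generated Avro record
        User user = User.newBuilder()
                .setId(request.id())
                .setEmail(request.email())
                .setFirstName(request.firstName())
                .setLastName(request.lastName())
                .setIsActive(request.isActive())
                .setAge(request.age())
                .build();

        // Key by user id so events for the same user stay on the same partition
        kafkaTemplate.send("users.v1", request.id(), user);
        return ResponseEntity.accepted().build();
    }

    // Hypothetical request DTO mirroring the curl payload shown below
    public record UserRequest(String id, String email, String firstName,
                              String lastName, boolean isActive, int age) {}
}
```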
✨ What This Demo Includes
- Spring Boot Kafka Producer (Avro)
- Spring Boot Kafka Consumer (Avro)
- Confluent Schema Registry
- PostgreSQL persistence using Spring Data JPA
- Schema evolution with backward compatibility
- Docker Compose for local development
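The PostgreSQL side is a thin Spring Data JPA layer. A minimal sketch, assuming a simple users table (entity, column, and repository names are assumptions; check the repo for the real mapping):

```java
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Table;
import org.springframework.data.jpa.repository.JpaRepository;

@Entity
@Table(name = "users")
class UserEntity {

    @Id
    private String id;
    private String email;
    private String firstName;
    private String lastName;
    private boolean active;
    private int age;

    // getters and setters omitted for brevity
}

// Spring Data JPA generates the implementation at runtime
interface UserRepository extends JpaRepository<UserEntity, String> {
}
```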
🐳 Local Setup (Kafka + Schema Registry + PostgreSQL)
Prerequisites
- Java 21
- Maven
- Docker & Docker Compose
Start infrastructure
```bash
docker compose up -d
```
Services started:
- Kafka → localhost:29092
- Schema Registry → http://localhost:8081
- PostgreSQL → localhost:5432
▶️ Run the Applications
Consumer
```bash
cd consumer-app
mvn spring-boot:run
```
Listens to users.v1 and persists messages to PostgreSQL.
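Under the hood this is essentially a @KafkaListener that maps the Avro record onto the JPA entity sketched earlier and saves it. A sketch, assuming the consumer uses Confluent's KafkaAvroDeserializer with specific.avro.reader=true so it receives the generated User class (names are assumptions):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class UserEventListener {

    private final UserRepository repository;

    public UserEventListener(UserRepository repository) {
        this.repository = repository;
    }

    @KafkaListener(topics = "users.v1", groupId = "users-consumer")
    public void onUserEvent(User event) {
        // Map the Avro-generated record onto the JPA entity and persist it
        UserEntity entity = new UserEntity();
        entity.setId(event.getId().toString());
        entity.setEmail(event.getEmail().toString());
        entity.setFirstName(event.getFirstName().toString());
        entity.setLastName(event.getLastName().toString());
        entity.setActive(event.getIsActive());
        entity.setAge(event.getAge());
        repository.save(entity);
    }
}
```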
Producer
```bash
cd producer-app
mvn spring-boot:run
```
Exposes the POST /users REST endpoint on http://localhost:8080.
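The interesting part of the producer configuration is the Avro serializer pointing at the Schema Registry. The demo most likely wires this through application.yml; expressed as a Java sketch (values match the local Docker Compose setup above, and User is the Avro-generated class):

```java
import java.util.HashMap;
import java.util.Map;

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, User> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:29092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        // Tells the Avro serializer where to register and fetch schemas
        props.put("schema.registry.url", "http://localhost:8081");
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, User> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
```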
📬 Produce an Event
```bash
curl -X POST http://localhost:8080/users \
  -H "Content-Type: application/json" \
  -d '{
    "id": "u-1",
    "email": "user@test.com",
    "firstName": "John",
    "lastName": "Doe",
    "isActive": true,
    "age": 30
  }'
```
You’ll see:
- Avro schema registered (or validated)
- Message published to Kafka
- Consumer saving the record to PostgreSQL
🔄 Schema Evolution (The Important Part)
Avro allows safe schema evolution as long as its compatibility rules are respected.
Example:
- Add a new optional field
- Provide a default value
- Keep compatibility set to BACKWARD
Schema Registry ensures:
- old consumers keep working
- new producers don’t break the system
This demo is designed to show real-world schema evolution, not toy examples.
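To illustrate what that evolution step could look like, here is the SchemaBuilder sketch from earlier with one new optional field (the field itself is hypothetical; in the demo you would make the same change in the .avsc file):

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

public class UserSchemaEvolutionSketch {

    // Same record as before, plus one optional field. optionalString builds a
    // union of null and string with a null default, so a consumer using this new
    // schema can still read records written without the field -- BACKWARD compatible.
    static final Schema USER_V2 = SchemaBuilder.record("User")
            .namespace("com.example.users")   // assumed namespace
            .fields()
            .requiredString("id")
            .requiredString("email")
            .requiredString("firstName")
            .requiredString("lastName")
            .requiredBoolean("isActive")
            .requiredInt("age")
            .optionalString("phoneNumber")    // new, hypothetical field with default null
            .endRecord();
}
```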
☁️ Confluent Cloud Ready
The project also supports Confluent Cloud via Spring profiles:
- SASL/SSL
- Schema Registry API keys
- use.latest.version=true
- auto.register.schemas=false
This makes the setup a good fit for CI/CD pipelines, where schemas are registered ahead of time rather than by the application at runtime.
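A hedged sketch of what such a Confluent Cloud profile might configure (the profile name, property placement, and credentials handling in the repo may differ; placeholders stand in for real keys and endpoints):

```java
import java.util.HashMap;
import java.util.Map;

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
@Profile("confluent-cloud") // assumed profile name
public class ConfluentCloudProducerConfig {

    @Bean
    public ProducerFactory<String, User> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "<your-bootstrap-server>");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);

        // SASL/SSL authentication against the Kafka cluster
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"<api-key>\" password=\"<api-secret>\";");

        // Schema Registry endpoint + API keys
        props.put("schema.registry.url", "<your-schema-registry-url>");
        props.put("basic.auth.credentials.source", "USER_INFO");
        props.put("basic.auth.user.info", "<sr-api-key>:<sr-api-secret>");

        // Schemas are registered out-of-band (e.g. by CI), so don't auto-register,
        // and always serialize against the latest registered version
        props.put("auto.register.schemas", false);
        props.put("use.latest.version", true);

        return new DefaultKafkaProducerFactory<>(props);
    }
}
```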
🔗 Source Code
👉 GitHub repository:
https://github.com/mathias82/kafka-schema-registry-spring-demo
Includes:
- Docker Compose
- Avro schemas
- Producer & Consumer apps
- PostgreSQL setup
- Postman collection
🧩 Who Is This For?
- Java & Spring Boot developers
- Kafka users moving beyond JSON
- Teams building event-driven microservices
- Anyone learning Schema Registry + Avro
⭐ Final Thoughts
This is a production-style Kafka example, not a hello-world.
If you’re serious about:
- schema contracts
- backward compatibility
- safe evolution
- real persistence
then this demo will save you a lot of trial and error.
👉 Star the repo if it helped you
👉 Fork it and adapt it to your own system