DEV Community

Discussion on: Why Event Sourcing is a microservice communication anti-pattern

Nik Todorov

Hi Oliver,
I would like to hear your thoughts on "Event streaming as the source of truth":
thoughtworks.com/radar/techniques/...
I found it in the book "Designing Event-Driven Systems" by Ben Stopford, where he introduces the concept of "data on the outside" and explains how CQRS and Event Sourcing, implemented with Kafka Streams and materialized views, can help you share data across your organization - I assume he means outside the service's bounded context, but I could be wrong.
From the book:
"Summary
This chapter introduced the analogy that stream processing can be viewed as a database turned inside out, or unbundled. In this analogy, responsibility for data storage (the log) is segregated from the mechanism used to query it (the Stream Processing API). This makes it possible to create views and embed them exactly where they are needed—in another application, in another geography, or on another platform. There are two main drivers for pushing data to code in this way:
• As a performance optimization, by making data local
• To decouple the data in an organization, but keep it close to a single shared source of truth
So at an organizational level, the pattern forms a kind of database of databases where a single repository of event data is used to feed many views, and each view can flex with the needs of that particular team.
"

That sounds like the opposite of what you are sharing, but that could just be me being new to Event Sourcing.
What I'm struggling to understand is the comparison of the persistence layers: a normal relational database, IMO, will be the bottleneck if you want to share data between services - you get all the monolith problems of evolving a shared schema, etc.
But with something like Kafka, where you have a very scalable event store plus a schema registry (validation and versioning), the most painful of those problems go away.
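To make the versioning point concrete, here is a minimal sketch of the kind of "upcasting" a versioned schema makes possible - a consumer normalizes old event versions to the latest shape before applying them. The event type, fields, and the v1-to-v2 change are all made up for illustration:

```python
# Hypothetical sketch: upcasting versioned events so one shared stream
# can serve both old and new consumers. Event names/fields are invented.

def upcast(event: dict) -> dict:
    """Normalize any known schema version to the latest (v2) shape."""
    if event["version"] == 1:
        # Assumed change: v1 had a single "name" field, v2 splits it.
        first, _, last = event["name"].partition(" ")
        return {
            "version": 2,
            "type": event["type"],
            "first_name": first,
            "last_name": last,
        }
    return event  # already latest version

v1_event = {"version": 1, "type": "CustomerRegistered", "name": "Ada Lovelace"}
print(upcast(v1_event))
# {'version': 2, 'type': 'CustomerRegistered', 'first_name': 'Ada', 'last_name': 'Lovelace'}
```

In a real Kafka setup the schema registry would enforce that new versions stay compatible, so consumers only ever need upcasters for versions they know about.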
Even if you share only business events, you still need a good event schema versioning solution - and if you have one, maybe that eliminates the need to keep the event sourcing local-only. If you expose the stream, any new interested consumer/service gets access either to a fast snapshot or to the full history, and can consume it without asking the owning service for data (no need to know how to call that service).

Compare that with the alternative: if the new actor has access only to the business events and needs additional data elements, it has to query the service (REST, etc.) for them, which makes it aware of the service and how to communicate with it. And if the service does not expose the data elements the new service needs, it has to introduce an API change. Worst case, if the data is needed as event-sourced history - a log for replaying (AI, etc.) - then you have to expose your internal ES store rather than an API.
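The "new consumer replays the full history" idea can be sketched like this - a fold over the event log that builds a local materialized view, with no call to the owning service. The account events and the in-memory list (standing in for a Kafka topic read from the earliest offset) are assumptions for the sake of the example:

```python
# Hypothetical sketch: a new consumer builds its own materialized view
# by replaying the full event history, instead of querying the owning
# service's REST API. The in-memory list stands in for a Kafka topic
# consumed from the earliest offset.

history = [
    {"type": "AccountOpened",  "account": "a1", "balance": 0},
    {"type": "MoneyDeposited", "account": "a1", "amount": 50},
    {"type": "MoneyDeposited", "account": "a1", "amount": 25},
    {"type": "MoneyWithdrawn", "account": "a1", "amount": 10},
]

def build_view(events: list[dict]) -> dict:
    """Fold the event log into a balance-per-account snapshot."""
    view: dict[str, int] = {}
    for e in events:
        if e["type"] == "AccountOpened":
            view[e["account"]] = e["balance"]
        elif e["type"] == "MoneyDeposited":
            view[e["account"]] += e["amount"]
        elif e["type"] == "MoneyWithdrawn":
            view[e["account"]] -= e["amount"]
    return view

print(build_view(history))  # {'a1': 65}
```

Each team's view can fold the same log differently (balances here, fraud features elsewhere), which is exactly the "database of databases" shape Stopford describes.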

Maybe I'm missing something and your opinion is welcome!