Event-driven microservices with Kafka (3 Part Series)
In the previous post of the Event-driven microservices with Kafka series (see here), I showed how to extend asynchronous event-driven communication all the way from Kafka to the Web frontend, passing through the Java backend.
In that proof-of-concept application, which processes money transfers, one Kafka topic receives incoming transfer messages, and a second topic both stores the account balances state (no database needed) and broadcasts changes (i.e., pushes notifications from the backend to the frontend).
The first topic (called "transfers") and the second topic (called "account-balances") are connected through Kafka Streams.
In this post we bring Uber Cadence into the mix to manage the state of the application (i.e., to keep the account balances updated); Cadence thus replaces Kafka Streams.
Cadence is an orchestration/workflow engine which, unlike most other workflow engines out there (e.g., Zeebe, Camunda, and many more), does not rely on BPMN (or any other XML format) to define workflows.
Instead, it uses regular programming languages such as Go and Java, which is a huge advantage, as Cadence's lead developer eloquently explains in this comment.
A detailed introduction to Cadence is out of the scope of this post; you are encouraged to do your own (even if superficial) research and to watch the companion video for this post. I will, however, mention its main advantages:
- It makes your application easily scalable
- It hugely simplifies error handling
- It comes with an admin Web UI and CLI
In short, it takes care of most of the heavy lifting inherent to a distributed application.
Let's go through the data flow to get an overview of the application's functionality.
A new service, called the "Cadence transfers recording" service, listens for incoming transfer messages on a Kafka topic.
If the message represents the act of opening a new account, the service creates a new workflow by using the corresponding Cadence Java client API method.
The Cadence server assigns the execution of the new workflow to one of the previously started Cadence workers. One important detail to note is that the id of the workflow matches the account id, i.e., there is one workflow per account.
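To make this concrete, here is a minimal sketch of what the per-account workflow and its start call could look like with the Cadence Java client. The interface, method, task-list, and timeout values are all hypothetical illustrations, not taken from the actual PoC code:

```java
import java.time.Duration;

import com.uber.cadence.client.WorkflowClient;
import com.uber.cadence.client.WorkflowOptions;
import com.uber.cadence.workflow.QueryMethod;
import com.uber.cadence.workflow.SignalMethod;
import com.uber.cadence.workflow.WorkflowMethod;

public class WorkflowStarter {

    // One workflow per account: the main method keeps the workflow alive,
    // signals mutate its state, and queries read it.
    public interface AccountWorkflow {
        @WorkflowMethod
        void open(String accountId);       // long-running main method

        @SignalMethod
        void applyTransfer(double amount); // async state change

        @QueryMethod
        double getBalance();               // sync read, no state change
    }

    // Called by the "Cadence transfers recording" service when an
    // "open account" message arrives on the "transfers" topic.
    public static void startAccountWorkflow(WorkflowClient client, String accountId) {
        WorkflowOptions options = new WorkflowOptions.Builder()
                .setTaskList("account-balances-task-list")      // hypothetical name
                .setWorkflowId(accountId)                       // workflow id == account id
                .setExecutionStartToCloseTimeout(Duration.ofDays(365))
                .build();
        AccountWorkflow workflow =
                client.newWorkflowStub(AccountWorkflow.class, options);
        WorkflowClient.start(workflow::open, accountId);        // non-blocking start
    }
}
```

Setting the workflow id explicitly is what enforces the one-workflow-per-account invariant: a second start attempt with the same id is rejected by the Cadence server.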
If the incoming Kafka message represents a money transfer which changes the balance of an existing account, the Cadence transfers recording service calls the required method (which must be annotated with @SignalMethod) on the workflow whose id matches the account id.
That triggers the remote execution of the code on one of the Cadence workers, which, after updating the account balance as required, sends a Kafka message to the "account-balances" topic to announce the event.
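Stripped of the Cadence wiring, the state each workflow keeps is essentially a running balance. A pure-Java sketch of the logic the signal handler would execute (class and method names are hypothetical):

```java
// Pure-Java sketch of the per-account state; in the real workflow this logic
// would live inside the @SignalMethod handler, followed by publishing the new
// balance to the "account-balances" Kafka topic.
public class AccountState {
    private double balance;

    // Applies a transfer amount (positive = credit, negative = debit)
    // and returns the new balance to be broadcast.
    public double applyTransfer(double amount) {
        balance += amount;
        return balance;
    }

    public double getBalance() {
        return balance;
    }
}
```

On the caller side, the recording service would deliver the signal through a stub obtained by workflow id, along the lines of `client.newWorkflowStub(AccountWorkflow.class, accountId).applyTransfer(amount)` (again, hypothetical names).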
Next, the "transfers websockets" service will pull that message from the "account-balances" Kafka topic and broadcast it to the interested Angular Web clients through websockets.
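One plausible shape for that Kafka-to-websocket bridge, assuming the service is built on Spring Boot with spring-kafka and STOMP over websockets (the actual wiring in the PoC may differ, and all names here are hypothetical):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.simp.SimpMessagingTemplate;
import org.springframework.stereotype.Component;

@Component
public class BalanceBroadcaster {

    private final SimpMessagingTemplate messagingTemplate;

    public BalanceBroadcaster(SimpMessagingTemplate messagingTemplate) {
        this.messagingTemplate = messagingTemplate;
    }

    // Consumes balance-change events from Kafka and pushes them to every
    // Angular client subscribed to the websocket destination.
    @KafkaListener(topics = "account-balances")
    public void onBalanceChange(String event) {
        messagingTemplate.convertAndSend("/topic/account-balances", event);
    }
}
```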
In addition to this asynchronous/push communication, an Angular client may query the "transfers websockets" service synchronously to get the current balance or the history of an account (i.e., the list of transfers for that account).
To fulfill those requests, the "transfers websockets" service queries the Cadence server by calling the corresponding method, which must be annotated with @QueryMethod.
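Assuming a hypothetical AccountWorkflow interface whose getBalance() is annotated with @QueryMethod, the synchronous lookup boils down to obtaining a stub by workflow id (which equals the account id) and calling the query method:

```java
import com.uber.cadence.client.WorkflowClient;
import com.uber.cadence.workflow.QueryMethod;
import com.uber.cadence.workflow.WorkflowMethod;

public class BalanceQueries {

    // Minimal, hypothetical view of the account workflow interface.
    public interface AccountWorkflow {
        @WorkflowMethod
        void open(String accountId);

        @QueryMethod
        double getBalance();
    }

    // Called by the "transfers websockets" service to serve a synchronous
    // balance request coming from an Angular client.
    public static double currentBalance(WorkflowClient client, String accountId) {
        // The workflow id equals the account id, so the stub addresses
        // exactly one running workflow.
        AccountWorkflow workflow =
                client.newWorkflowStub(AccountWorkflow.class, accountId);
        return workflow.getBalance(); // executes the @QueryMethod remotely
    }
}
```

A query does not mutate workflow state, which is why it is safe to serve read traffic this way while signals keep arriving.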
This is a high-level diagram of the PoC application. I am sorry if it looks a little messy (graphic design is not my strongest skill), but it should help you get an overview of the components and their interactions.