Leandro Lima
Using Apache Kafka with Node.js: A Tutorial on Building Event-Driven Applications

In today's digital landscape, building event-driven applications that can handle real-time data is crucial. Apache Kafka, a distributed streaming platform, has become a popular choice for implementing such applications. In this tutorial, we will explore how to integrate Apache Kafka with Node.js to build scalable and reliable event-driven systems.

Why Apache Kafka?

Apache Kafka provides a robust foundation for building event-driven architectures. It offers several advantages that make it a preferred choice for handling real-time data:

  • Scalability: Kafka allows you to scale your applications horizontally, handling high message throughput with ease.
  • Reliability: With its distributed nature and built-in fault tolerance, Kafka ensures that your data is delivered reliably.
  • Durability: Kafka persists messages on disk, enabling you to retain data for a specified period or size.
  • Real-time: Kafka's publish-subscribe model allows applications to react in real-time to events and updates.

Prerequisites

Before we dive into the implementation, let's ensure that we have the necessary prerequisites:

  1. Node.js: Make sure you have Node.js installed on your machine. You can download it from the official Node.js website.

  2. Apache Kafka: Set up an Apache Kafka cluster or use an existing one. You can follow the official documentation for installation and configuration instructions.
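The examples in this tutorial publish to a topic named my-topic. If your cluster is not configured to auto-create topics, you can create it with the kafka-topics tool that ships with Kafka. This is a minimal sketch for a local single-broker setup; the script path and broker address are assumptions about your installation:

```shell
# Assumes Kafka's bin/ directory is on your PATH and a broker listening at localhost:9092.
kafka-topics.sh --create \
  --topic my-topic \
  --bootstrap-server localhost:9092 \
  --partitions 1 \
  --replication-factor 1
```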

Setting Up the Project

Let's start by setting up our Node.js project:

  1. Create a new directory for your project:

mkdir kafka-demo
cd kafka-demo

  2. Initialize a new Node.js project:

npm init -y

  3. Install the required dependencies:

npm install kafka-node

Producing Messages

To produce messages to a Kafka topic, we need to create a producer. Open a new file called producer.js and add the following code:

const kafka = require('kafka-node');
const Producer = kafka.Producer;
// Connects to a broker at localhost:9092 by default;
// pass { kafkaHost: 'host:port' } to point at another cluster.
const client = new kafka.KafkaClient();
const producer = new Producer(client);

producer.on('ready', () => {
  const payloads = [
    { topic: 'my-topic', messages: 'Hello Kafka!' },
    { topic: 'my-topic', messages: 'This is a test message.' }
  ];

  producer.send(payloads, (err, data) => {
    if (err) {
      console.error('Error producing messages:', err);
    } else {
      console.log('Messages sent:', data);
    }
    process.exit();
  });
});

producer.on('error', (err) => {
  console.error('Error connecting to Kafka:', err);
  process.exit(1);
});

In this code, we use the kafka-node package to create a Kafka client and producer. We listen for the 'ready' event to ensure the producer is ready to send messages. We define an array of payloads that contains the topic and messages we want to produce. Finally, we call the send method to send the messages to Kafka.
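Real producers usually send structured data rather than plain strings. A common pattern is to JSON-serialize an event object into the payload shape that kafka-node's send method expects. The helper below is a minimal sketch; the function name and event fields are illustrative, not part of the kafka-node API:

```javascript
// Build a kafka-node payload from a plain event object.
// producer.send expects an array of objects shaped like { topic, messages, key? }.
function toPayload(topic, event) {
  return {
    topic,
    key: event.id,                   // used for partition assignment when a keyed partitioner is configured
    messages: JSON.stringify(event)  // Kafka messages are bytes; serialize to a string here
  };
}

const payload = toPayload('my-topic', { id: 'order-42', status: 'created' });
console.log(payload.messages); // a JSON string, ready for producer.send([payload], callback)
```

Serializing on the way in (and parsing on the way out, as in the consumer below) keeps the wire format explicit and makes it easy to evolve the event shape later.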

To run the producer, execute the following command:

node producer.js

Consuming Messages

Now, let's create a consumer to receive and process messages from the Kafka topic. Open a new file called consumer.js and add the following code:

const kafka = require('kafka-node');
const Consumer = kafka.Consumer;
const client = new kafka.KafkaClient();
// Reads from partition 0 by default; autoCommit is on unless disabled in the options.
const consumer = new Consumer(client, [{ topic: 'my-topic' }], { autoCommit: true });

consumer.on('message', (message) => {
  console.log('Received message:', message.value);
});

consumer.on('error', (err) => {
  console.error('Error connecting to Kafka:', err);
  process.exit(1);
});

In this code, we create a Kafka consumer that subscribes to the 'my-topic' topic. We listen for the 'message' event, which is triggered whenever a new message is received. We simply log the received message to the console.
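In practice, a consumer usually needs to deserialize message.value (delivered as a string by default in kafka-node) and guard against malformed input so one bad message doesn't crash the process. Here is a small sketch of such a handler; the parseEvent helper is a hypothetical name, not part of kafka-node:

```javascript
// Parse a message value as JSON, returning null instead of throwing on bad input.
function parseEvent(value) {
  try {
    return JSON.parse(value);
  } catch (err) {
    console.error('Skipping malformed message:', value);
    return null;
  }
}

// Usage inside the 'message' handler:
// consumer.on('message', (message) => {
//   const event = parseEvent(message.value);
//   if (event) console.log('Received event:', event);
// });
```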

To run the consumer, execute the following command:

node consumer.js

Conclusion

In this tutorial, we explored how to use Apache Kafka with Node.js to build event-driven applications. Apache Kafka's scalability, reliability, and real-time capabilities make it an excellent choice for handling high volumes of real-time data.


Start building your event-driven applications with Apache Kafka and Node.js today!
