Esther Adebayo for IntegrationOS


Stream messages from Stripe to Kafka

Data grows exponentially across every application. That's why it's crucial to have a seamless way to send event data from different applications into a central event streaming platform like Apache Kafka.

In today's post, we'll walk through how to easily send messages or events from Stripe into Kafka using Buildable's Node SDK.

Prerequisites

Before we dive in, you'll need a few things to complete this tutorial:

  • A Buildable account
  • A Stripe account
  • Confluent for Apache Kafka
  • A Kafka server

Creating a Stripe connection

Inside your Buildable account, we need to create a Stripe connection so Stripe can push messages into Buildable. If you don't have a Buildable account yet, don't worry: you can get started for free.

Navigate to the Connection tab and hit the + New button. You'll see a list of third-party apps; scroll down and select Stripe.

Give your connection a name, e.g., Stripe connection, and type in your Stripe API key.

Next, hit Connect and subscribe to the following events:

  • customer.created
  • customer.deleted
  • customer.updated

Creating a Stripe connection on Buildable

Setting up your Buildable Secret Key

We also need to create a Buildable secret key to use in our Node.js app.

To do this, head over to the Settings page. From the left panel, select Secret Keys and create a new one.

Copy your secret key because we'll be using it shortly.

Setting up your Buildable Secret Key

Creating a Node.js client

Go over to your Confluent account, and in a new or existing cluster, create a new topic called integrations.

To connect to our topic, select Configure a client and choose Node.js as the language.

Now, create a Kafka cluster API key. Call this key Buildable and download the file.

Creating a Node.js client

Notice that the config variables show up. We'll be using them in a bit, so keep them handy.

Adding environment variables

Now, it's time to add our environment variables. In our Node.js project, create a .env file and define the environment variables used in our app.

Do you remember the secret key you created earlier in Buildable? We'll paste it here as our BUILDABLE_SECRET_KEY.

You'll also add the Kafka variables that showed up after you created your Kafka cluster API key.
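As a sketch, the .env file might look like this (all values below are illustrative placeholders; use the credentials from your own Buildable and Confluent accounts):

```shell
# Buildable
BUILDABLE_SECRET_KEY=your-buildable-secret-key

# Kafka (from the Confluent client config)
KAFKA_CLIENT_ID=buildable-stripe-client
KAFKA_BROKER=your-bootstrap-server:9092
KAFKA_USERNAME=your-confluent-api-key
KAFKA_PASSWORD=your-confluent-api-secret
```

If you load these with the dotenv package (npm install dotenv), call require("dotenv").config() at the top of your entry file so they're available on process.env.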

Environmental variables

Connecting to the Kafka Server

To use Kafka, we need to connect to our Kafka server. Let's install the kafkajs npm package by running npm install kafkajs.

We then create a new Kafka instance, specifying the clientId, brokers, ssl and sasl options.

Finally, we create and connect to the Kafka producer, which publishes messages to Kafka.



const { Kafka } = require("kafkajs");

// Connect to Kafka
const kafka = new Kafka({
  clientId: process.env.KAFKA_CLIENT_ID,
  brokers: [process.env.KAFKA_BROKER],
  ssl: true,
  sasl: {
    mechanism: "plain",
    username: process.env.KAFKA_USERNAME,
    password: process.env.KAFKA_PASSWORD,
  },
});

// Create the producer
const producer = kafka.producer();

// Connect to the producer (note: await must be called inside an
// async function, as the final code's main() does)
await producer.connect();



Listening to messages from Stripe

Since messages from Stripe are being sent first to Buildable, we need to install Buildable's Node SDK into our app by running npm install @buildable/messages.

Next, let's define the exact events we subscribed to in Buildable: "customer.created", "customer.deleted" and "customer.updated".

We also configure a listener that listens to the messages from Stripe.



// Init the Buildable client
const { createClient } = require("@buildable/messages");
const client = createClient(process.env.BUILDABLE_SECRET_KEY);

// Select events to listen on
const events = [
  "customer.created",
  "customer.deleted",
  "customer.updated"
];

const listenerConfig = {
  platform: "stripe",
  label: "my-stripe-connection",
  txKey: "kafka.message.sent",
};




Publishing messages to Kafka

We're at a good point to set up the function that sends the messages to Kafka. Let's create an async function called main that listens to all our events and publishes them to Kafka.



async function main() {
  // Listen to all events
  events.forEach((eventName) =>
    client.on(eventName, async ({ payload, event }) => {

      // Create Kafka message...      

    }, listenerConfig)
  );
}

main().catch(console.error);



To publish messages to the Kafka cluster, we use the producer.send method with the topic and messages parameters.



await producer.send({
  topic: "integrations",
  messages: [
    {
      key: "unique-key",
      value: JSON.stringify({
        event: "customer.created",
        payload: {},
      }),
    },
  ],
});


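To make the shape of the published record concrete, here's a small sketch of a helper that builds a message in the same { event, payload } format used above (the helper name is illustrative, not part of the SDK or kafkajs):

```javascript
// Build one entry for producer.send's messages array, serializing
// the event name and payload into a JSON string value.
function buildKafkaMessage(key, eventName, payload) {
  return {
    key,
    value: JSON.stringify({ event: eventName, payload }),
  };
}

const message = buildKafkaMessage("cus_123", "customer.created", {
  email: "ada@example.com",
});
console.log(message.value);
// {"event":"customer.created","payload":{"email":"ada@example.com"}}
```

Because the value is a plain JSON string, any downstream Kafka consumer can JSON.parse it to recover the event name and payload.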

With that, we've completed this tutorial 🎉

Here's the final code:



// Init the SDK
const { Kafka } = require("kafkajs");
const { createClient } = require("@buildable/messages");

const client = createClient(process.env.BUILDABLE_SECRET_KEY);

// Create Kafka Client
const kafka = new Kafka({
  clientId: process.env.KAFKA_CLIENT_ID,
  brokers: [process.env.KAFKA_BROKER],
  ssl: true,
  sasl: {
    mechanism: "plain",
    username: process.env.KAFKA_USERNAME,
    password: process.env.KAFKA_PASSWORD,
  },
});

// Select the Kafka topic
const topic = "integrations";

// Select the events to listen on
const events = [
  "customer.created", 
  "customer.deleted", 
  "customer.updated"
];

const listenerConfig = {
  platform: "stripe",
  label: "my-stripe-connection",
  txKey: "kafka.message.sent",
};

async function main() {
  // Create and connect to Kafka producer
  const producer = kafka.producer();
  await producer.connect();

  // Listen to all events
  events.forEach((eventName) =>
    client.on(eventName, async ({ payload, event }) => {
      // Create Kafka message
      return await producer.send({
        topic,
        messages: [
          {
            key: event.key,
            value: JSON.stringify({
              event: `${listenerConfig.platform}.${listenerConfig.label}.${eventName}`,
              payload,
            }),
          },
        ],
      })
    }, listenerConfig)
  );
}

main().catch(console.error);


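Note how the final code namespaces the published event name by prefixing the listener's platform and label to the Stripe event name. As a quick sketch of that composition (the function name is illustrative):

```javascript
// Compose the namespaced event name written into the Kafka message value:
// <platform>.<label>.<stripe event name>
function composeEventName(listenerConfig, eventName) {
  return `${listenerConfig.platform}.${listenerConfig.label}.${eventName}`;
}

console.log(
  composeEventName(
    { platform: "stripe", label: "my-stripe-connection" },
    "customer.created"
  )
);
// stripe.my-stripe-connection.customer.created
```

This namespacing lets a single Kafka topic carry events from multiple connections while keeping them distinguishable.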

If you trigger an event, for example by creating a customer in the Stripe Dashboard (or by running stripe trigger customer.created with the Stripe CLI), you should now see this event show up in Kafka.

Simplify your Kafka streaming process with Buildable

Streaming messages from your third-party applications into Kafka via Buildable doesn't have to be complex. In this tutorial, we learned how to easily send Stripe messages to Kafka, but you can connect any other application to Buildable and stream directly to Kafka.

Hopefully, this example gave you the insights you need to stream messages from any application into Kafka. With a free-tier Buildable account, you can begin to stream messages from multiple platforms straight into Kafka, so go ahead and explore!

If you have any questions, please reach out via Twitter or join our Discord community. Till next time, keep building with Buildable!
