Phil Nash for DataStax

Posted on • Originally published at datastax.com


How to Stream Responses from the Langflow API in Node.js

Building flows and AI agents in Langflow is one of the fastest ways to experiment with generative AI. Once you've built your flow, you’ll want to integrate it into your own application. Langflow exposes an API for this; we’ve written before about how to use it in Node.js. We've also seen that streaming GenAI outputs makes for a better user experience. So today, we're going to combine the two and show you how to stream results from your Langflow flows in Node.js.


Using the Langflow client

The easiest way to use the Langflow API is with the @datastax/langflow-client npm module. You can get started with the client by installing the module with npm:

npm install @datastax/langflow-client

The Langflow client can be used with both self-hosted and DataStax-hosted Langflow. You can see in-depth examples of how to set it up for either version of Langflow in this blog post. But the quick version is that, whichever version you're using, you start by importing the client:

import { LangflowClient } from "@datastax/langflow-client";

For self-hosted Langflow you need the URL where you’re hosting Langflow and, if you've set up user authorisation, an API key. You then initialise the client with both:

const baseURL = "http://localhost:7860";
const apiKey = "YOUR_API_KEY";
const client = new LangflowClient({ baseURL, apiKey });

For DataStax-hosted Langflow, you need your Langflow ID and to generate an API key. Then you create a client with the following code:

const langflowId = "YOUR_LANGFLOW_ID";
const apiKey = "YOUR_API_KEY";
const client = new LangflowClient({ langflowId, apiKey });

Streaming with the Langflow client

To stream through the API, you need a flow that's set up for streaming responses. A streaming flow needs a model component with streaming capabilities and the stream flag turned on, connected to a chat output.

A screenshot of a Langflow flow with a chat input and prompt that are both connected to the OpenAI model. The OpenAI model component has the Stream setting enabled and it is connected to a Chat Output component.

If you don't already have a flow, you can use the basic prompting flow as an example.

Once you have your flow in place, open the API modal and get the flow ID.

A screenshot of the Langflow API modal. It shows the API URL and points out that the flow ID can be found in the URL after api/v1/run.

With the flow ID and the Langflow client, you can create a flow object:

const flowId = "YOUR_FLOW_ID";
const flow = client.flow(flowId);

To stream a response from the flow, you can use the [stream function](https://www.npmjs.com/package/@datastax/langflow-client#streaming). The response is a ReadableStream that you can iterate over asynchronously.

const response = await flow.stream("Hello, how are you?");
for await (const event of response) {
  console.log(event);
}

There are three types of event that the stream emits; this is what each of them means (there's a sketch handling all three after the list):

  • add_message: a message has been added to the chat. It can refer to a human input message or a response from an AI.

  • token: a token has been emitted as part of a message being generated by the model.

  • end: all tokens have been returned; this event will also contain the same full response that you get from a non-streaming request.
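
As a sketch, here's one way to handle all three event types in a single loop. The token event's data.chunk property is used in the examples below; the exact shape of data for the add_message and end events isn't shown in this post, so inspect the events from your own flow before relying on it.

const response = await flow.stream("Hello, how are you?");
for await (const event of response) {
  switch (event.event) {
    case "add_message":
      // A human input message or an AI response was added to the chat
      break;
    case "token":
      // Print each chunk of generated text as it arrives,
      // without adding a newline between chunks
      process.stdout.write(event.data.chunk);
      break;
    case "end":
      // All tokens have been returned; event.data carries the full
      // response (shape assumed here, check it in your own flow)
      console.log("\nStream complete");
      break;
  }
}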

If you want to log out just the text from a flow response, you can do the following:

const response = await flow.stream("Hello, how are you?");
for await (const event of response) {
  if (event.event === "token") {
    console.log(event.data.chunk);
  }
}

An animation of a terminal program running the code that logs each chunk. It logs its response one word at a time, but quickly.

The stream function takes all the same arguments as the run function, so you can provide tweaks for your components, too.
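
For example, here's a sketch of passing tweaks when streaming, assuming the same tweaks option that run accepts. The component ID and field are hypothetical placeholders; copy the real ones from your flow's API modal.

const response = await flow.stream("Hello, how are you?", {
  tweaks: {
    // Hypothetical component ID and field; use the names from
    // your own flow's API modal
    "OpenAIModel-abc12": { temperature: 0.2 },
  },
});
for await (const event of response) {
  if (event.event === "token") {
    process.stdout.write(event.data.chunk);
  }
}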

Integrating with Express

If you want to call the Langflow API from an Express server and stream the response on to your own front-end, you can do the following:

app.get("/stream", async (_req, res) => {
  res.set("Content-Type", "text/plain");
  res.set("Transfer-Encoding", "chunked");

  const response = await flow.stream("Hello, how are you?");

  for await (const event of response) {
    if (event.event === "token") {
      res.write(event.data.chunk);
    }
  }

  res.end();
});

We explored how you can handle a stream on the front-end in this blog post.
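As a minimal sketch of that front-end side, you can read the chunked response from the Express route above with the Fetch API's stream reader. The #output element is a hypothetical placeholder for wherever you render the text.

const response = await fetch("/stream");
const reader = response.body.getReader();
const decoder = new TextDecoder();
const output = document.querySelector("#output");

while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  // Decode each chunk and append it to the page as it arrives
  output.append(decoder.decode(value, { stream: true }));
}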

Stream your flows

Langflow enables you to rapidly build, experiment with, and deploy GenAI applications, and with the JavaScript Langflow client you can easily stream those responses in your own JavaScript applications.

Please do try out the Langflow client, and if you have any issues, raise them on the GitHub repo. If you're looking for more inspiration for building AI agents with Langflow, check out these posts that cover how to build an agent that can manage your calendar with Langflow and Composio or see how you can build local agents with Langflow and Ollama.
