In this tutorial, we’ll explore how to implement streaming responses in a Next.js application. We’ll use JavaScript generators (`function*` and `yield`) and the `ReadableStream` API to handle streaming data from a server to a client. This approach is particularly useful for scenarios like chatbots, real-time data updates, or any application where data is generated incrementally.
We’ll break down the concepts of `function*`, `yield`, and `ReadableStream`, and explain how they work together to enable streaming. By the end of this tutorial, you’ll have a clear understanding of how to implement streaming in your Next.js applications.
For more tutorials and updates, feel free to visit my personal portfolio arfat.app or check out my GitHub profile at https://github.com/arfat-xyz.
What is Streaming?
Streaming is a technique where data is sent in chunks (small pieces) from the server to the client, rather than sending the entire response at once. This is useful for:
- Real-time updates: Sending data as it becomes available.
- Large datasets: Avoiding memory issues by processing data incrementally.
- Improved user experience: Providing immediate feedback to users.
In our example, we’ll stream responses from a chatbot API that generates text incrementally.
Key Concepts
1. Generators (`function*`)
Generators are special functions in JavaScript that can pause and resume their execution. They are defined using the `function*` syntax. When called, a generator returns an iterator, which can be used to control the execution of the function.
- Why use generators?
  - They allow you to produce a sequence of values over time.
  - They are memory-efficient because they generate values on demand.
Example:

```typescript
function* simpleGenerator() {
  yield "Hello";
  yield "World";
}

const generator = simpleGenerator();
console.log(generator.next().value); // "Hello"
console.log(generator.next().value); // "World"
```
2. `yield`
The `yield` keyword is used inside a generator to pause the function and return a value. When the generator is resumed, it continues execution from where it was paused.
- Why use `yield`?
  - It allows you to produce values incrementally.
  - It works seamlessly with iterators and asynchronous operations.
3. `ReadableStream`
The `ReadableStream` API is part of the Web Streams API, which provides a standard interface for streaming data. It allows you to read chunks of data as they become available.
- Why use `ReadableStream`?
  - It is designed for handling streaming data efficiently.
  - It integrates well with modern web APIs like `fetch`.
Implementation: Streaming in Next.js
Let’s break down the implementation into two parts: the server-side API route and the client-side React hook.
Server-Side: API Route
The server-side code handles the streaming logic. It uses a generator (`function*`) to produce chunks of data and a `ReadableStream` to send those chunks to the client.

```typescript
```
```typescript
import { NextResponse } from "next/server";
import { routeErrorHandler } from "@/lib/api-response";
import { inputSchema } from "@/lib/zod-validation";
import { mistralClient } from "@/utils/mistal";
import { EventStream } from "@mistralai/mistralai/lib/event-streams";
import { CompletionEvent } from "@mistralai/mistralai/models/components";

export async function POST(request: Request) {
  try {
    // Parse the JSON request body
    const body = await request.json();

    // Validate the input using Zod schema
    const { inputText: content } = inputSchema.parse(body);

    // Request a streaming response from the Mistral API
    const chatResponse = await mistralClient.chat.stream({
      model: "mistral-large-latest",
      messages: [
        {
          role: "system",
          content:
            "You are a friendly cheese connoisseur. When asked about cheese, reply concisely and humorously.",
        },
        { role: "user", content }, // User's input
      ],
      temperature: 0.7, // Controls randomness in the response
      responseFormat: { type: "text" }, // Response format
    });

    // Convert the generator into a ReadableStream
    const stream = iteratorToStream(makeIterator(chatResponse));

    // Return the stream as a Next.js response
    return new NextResponse(stream, {
      headers: {
        "Content-Type": "text/plain", // Set the content type for the stream
      },
    });
  } catch (error) {
    // Handle errors and return a formatted error response
    console.log("Error", { error });
    return routeErrorHandler(error);
  }
}

// Generator function to produce chunks of data
async function* makeIterator(chatResponse: EventStream<CompletionEvent>) {
  const encoder = new TextEncoder(); // Encoder to convert strings to Uint8Array

  // Iterate over the streaming response
  for await (const chunk of chatResponse) {
    const newChunk = chunk?.data.choices[0].delta.content as string;
    yield encoder.encode(newChunk); // Yield each chunk as a Uint8Array
  }

  // Signal the end of the stream
  yield encoder.encode("END_STREAM");
}

// Convert a generator into a ReadableStream
function iteratorToStream(iterator: AsyncGenerator<Uint8Array>) {
  return new ReadableStream({
    async pull(controller) {
      // Get the next value from the generator
      const { value, done } = await iterator.next();
      if (done) {
        // Close the stream if the generator is done
        controller.close();
      } else {
        // Enqueue the value (chunk) into the stream
        controller.enqueue(value);
      }
    },
  });
}
```
Client-Side: React Hook
The client-side code reads the stream from the server and updates the UI incrementally.
```typescript
"use client";
import { useState } from "react";
import toast from "react-hot-toast";

function useStreamResponseHook(api: string) {
  const [responses, setResponses] = useState(""); // Store the streamed responses
  const [isLoading, setIsLoading] = useState(false); // Track loading state

  const startStream = async (inputText: string) => {
    try {
      setIsLoading(true);
      setResponses(""); // Clear previous responses

      // Send a POST request to the API
      const response = await fetch(api, {
        method: "POST",
        body: JSON.stringify({ inputText }),
        headers: {
          "Content-Type": "application/json",
        },
      });

      // Check if the response body is readable
      if (!response.body) {
        toast.error("Readable streams are not supported");
        throw new Error("Readable streams are not supported");
      }

      // Get a reader for the response stream
      const reader = response.body.getReader();

      // Reuse one decoder so multi-byte characters split across chunks
      // are decoded correctly
      const decoder = new TextDecoder("utf-8");

      // Function to read chunks from the stream
      const read = async () => {
        const { done, value } = await reader.read();
        if (done) {
          setIsLoading(false); // Stop loading when the stream ends
          return;
        }

        // Decode the chunk and update the responses, stripping the
        // END_STREAM marker in case it arrives concatenated with content
        const text = decoder.decode(value, { stream: true });
        setResponses((prev) => prev + text.replace("END_STREAM", ""));

        read(); // Continue reading the next chunk
      };

      read(); // Start reading the stream
    } catch (error) {
      console.log("Error from useStreamResponse hook", { error });
      toast.error("An error occurred while streaming the response.");
      setIsLoading(false);
    }
  };

  return { responses, isLoading, startStream };
}

export default useStreamResponseHook;
```
How It Works
- Server-Side:
  - The API route receives a request, validates the input, and fetches a streaming response from the Mistral API.
  - A generator (`makeIterator`) is used to produce chunks of data from the streaming response.
  - The `iteratorToStream` function converts the generator into a `ReadableStream`, which is sent to the client.
- Client-Side:
  - The `useStreamResponseHook` hook sends a request to the API and reads the stream using a `ReadableStreamDefaultReader`.
  - Each chunk is decoded and appended to the `responses` state, which updates the UI in real time.
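The whole pipeline can be exercised end to end without a server. The sketch below mirrors the route's `makeIterator`/`iteratorToStream` pair and the hook's reader loop, but replaces the Mistral API with a plain array of chunks (`demo` and the sample strings are ours, purely for illustration):

```typescript
const encoder = new TextEncoder();
const decoder = new TextDecoder("utf-8");

// "Server side": yield each chunk as encoded bytes, as makeIterator does.
async function* makeIterator(chunks: string[]) {
  for (const chunk of chunks) {
    yield encoder.encode(chunk);
  }
}

// Wrap the generator in a ReadableStream, as iteratorToStream does.
function iteratorToStream(iterator: AsyncGenerator<Uint8Array>) {
  return new ReadableStream({
    async pull(controller) {
      const { value, done } = await iterator.next();
      if (done) controller.close();
      else controller.enqueue(value);
    },
  });
}

// "Client side": read and decode chunks until the stream closes.
async function readAll(stream: ReadableStream): Promise<string> {
  const reader = stream.getReader();
  let text = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text;
}

async function demo() {
  const stream = iteratorToStream(makeIterator(["Cheese ", "is ", "great"]));
  console.log(await readAll(stream)); // "Cheese is great"
}

demo();
```

Swapping the array for the Mistral `EventStream` and moving `readAll` behind `fetch` gives you exactly the server route and hook above.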
Why Use This Approach?
- Efficiency: Data is processed and sent incrementally, reducing memory usage.
- Real-Time Updates: Users see updates as soon as they are available.
- Scalability: Works well for large datasets or long-running processes.
Conclusion
By combining `function*`, `yield`, and `ReadableStream`, you can implement efficient and scalable streaming in your Next.js applications. This approach is particularly useful for real-time applications like chatbots, live data feeds, or any scenario where incremental updates are required. With the explanations and code provided, you should now be able to implement streaming in your own projects. Happy coding! 🚀
Feel free to connect with me on LinkedIn or reach out via email at [arfatrahman08@gmail.com](mailto:arfatrahman08@gmail.com) for more insights and updates.