# Why the Vercel AI SDK Is a Game-Changer for Web Developers
Building AI features into web apps used to mean wrestling with streaming responses, managing chat state, handling different LLM providers, and figuring out server-sent events. I watched a team spend two weeks building a chat interface from scratch before discovering that Vercel had already solved all of it.
The Vercel AI SDK is a free, open-source toolkit that makes it trivial to build AI-powered user interfaces. It works with React, Next.js, Svelte, Vue, and Node.js.
## What You Get

- **`ai` (core)** — unified API for OpenAI, Anthropic, Google, Mistral, and more
- **`ai/react`** — React hooks for chat, completion, and object generation
- **`ai/rsc`** — React Server Components streaming support
- **Streaming by default** — tokens appear as they are generated
- **Structured output** — generate typed JSON objects, not just text
- **Tool calling** — let the LLM call your functions
## Quick Setup

```bash
npx create-next-app@latest my-ai-app
cd my-ai-app
npm install ai @ai-sdk/openai
```
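The OpenAI provider reads its key from the `OPENAI_API_KEY` environment variable by default, so add it to `.env.local` before starting the dev server:

```bash
# .env.local — Next.js loads this automatically; never commit it
OPENAI_API_KEY=sk-...
```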
## Build a Streaming Chat in 2 Files

Server route (`app/api/chat/route.ts`):
```ts
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o-mini"),
    system: "You are a helpful coding assistant.",
    messages,
  });

  return result.toDataStreamResponse();
}
```
Client component (`app/page.tsx`):
```tsx
"use client";

import { useChat } from "ai/react";

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } =
    useChat();

  return (
    <div className="max-w-2xl mx-auto p-4">
      <div className="space-y-4 mb-4">
        {messages.map((m) => (
          <div
            key={m.id}
            className={m.role === "user" ? "text-right" : "text-left"}
          >
            <span className="font-bold">
              {m.role === "user" ? "You" : "AI"}
            </span>
            <p>{m.content}</p>
          </div>
        ))}
      </div>
      <form onSubmit={handleSubmit} className="flex gap-2">
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Ask anything..."
          className="flex-1 border rounded p-2"
        />
        <button
          type="submit"
          disabled={isLoading}
          className="bg-blue-500 text-white px-4 py-2 rounded"
        >
          Send
        </button>
      </form>
    </div>
  );
}
```
That is it. You have a fully streaming chat interface with proper state management, loading indicators, and error handling.
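The error handling is surfaced through the same hook: `useChat` also returns an `error` object and a `reload()` helper. A minimal retry banner for the component above might look like this (a sketch to adapt into your own markup, not a drop-in file):

```tsx
// Inside the Chat component: also destructure error and reload.
const { messages, input, handleInputChange, handleSubmit, isLoading, error, reload } =
  useChat();

// Render above the form when a request fails.
{error && (
  <div className="text-red-600">
    Something went wrong.{" "}
    <button type="button" onClick={() => reload()} className="underline">
      Retry
    </button>
  </div>
)}
```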
## Structured Output (Generate Typed Objects)
This is incredibly powerful — generate structured data, not just text:
```ts
import { openai } from "@ai-sdk/openai";
import { generateObject } from "ai";
import { z } from "zod";

const { object } = await generateObject({
  model: openai("gpt-4o-mini"),
  schema: z.object({
    recipe: z.object({
      name: z.string(),
      ingredients: z.array(
        z.object({
          item: z.string(),
          amount: z.string(),
        })
      ),
      steps: z.array(z.string()),
      prepTime: z.number().describe("Prep time in minutes"),
    }),
  }),
  prompt: "Generate a recipe for chocolate chip cookies",
});

console.log(object.recipe.name);
// "Classic Chocolate Chip Cookies"
console.log(object.recipe.ingredients);
// [{ item: "flour", amount: "2.25 cups" }, ...]
```
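If you want the object to render progressively instead of arriving all at once, the SDK also ships a `streamObject` variant. A minimal sketch with a trimmed-down schema (the full recipe schema from above would work the same way):

```ts
import { openai } from "@ai-sdk/openai";
import { streamObject } from "ai";
import { z } from "zod";

const { partialObjectStream } = streamObject({
  model: openai("gpt-4o-mini"),
  schema: z.object({ steps: z.array(z.string()) }),
  prompt: "Generate a recipe for chocolate chip cookies",
});

// Each iteration yields a progressively more complete partial object,
// so the UI can fill in as tokens arrive.
for await (const partial of partialObjectStream) {
  console.log(partial.steps?.length ?? 0, "steps so far");
}
```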
## Tool Calling
Let the AI call your functions:
```ts
import { openai } from "@ai-sdk/openai";
import { generateText, tool } from "ai";
import { z } from "zod";

const result = await generateText({
  model: openai("gpt-4o-mini"),
  tools: {
    getWeather: tool({
      description: "Get current weather for a city",
      parameters: z.object({
        city: z.string(),
      }),
      execute: async ({ city }) => {
        // Call your weather API
        return { temp: 72, condition: "sunny", city };
      },
    }),
    searchProducts: tool({
      description: "Search product catalog",
      parameters: z.object({
        query: z.string(),
        maxPrice: z.number().optional(),
      }),
      execute: async ({ query, maxPrice }) => {
        // Query your database
        return [{ name: "Widget", price: 9.99 }];
      },
    }),
  },
  prompt: "What is the weather in Tokyo and find me products under $20?",
});
```
## Switch Providers in One Line
```ts
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { google } from "@ai-sdk/google";

// Just swap the model — everything else stays the same
const result = await streamText({
  model: anthropic("claude-sonnet-4-20250514"), // or openai("gpt-4o") or google("gemini-pro")
  messages,
});
```
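One pattern this unlocks: keep the model choice in configuration rather than code. The helper below is hypothetical (not part of the SDK) — it parses a `provider:model` string, such as an `AI_MODEL` env var, and you would map the resulting provider name onto `openai(...)`, `anthropic(...)`, or `google(...)` yourself:

```typescript
// Hypothetical helper — parses "provider:model" specs so deployments can
// switch providers via an env var instead of a code change.
function parseModelSpec(spec: string): { provider: string; modelId: string } {
  const sep = spec.indexOf(":");
  // No prefix: treat the whole string as an OpenAI model id.
  if (sep === -1) return { provider: "openai", modelId: spec };
  return { provider: spec.slice(0, sep), modelId: spec.slice(sep + 1) };
}
```

For example, `parseModelSpec("anthropic:claude-sonnet-4-20250514")` returns `{ provider: "anthropic", modelId: "claude-sonnet-4-20250514" }`.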
## Why Vercel AI SDK vs Rolling Your Own
| Feature | DIY | Vercel AI SDK |
|---|---|---|
| Streaming chat UI | 2-3 days | 30 minutes |
| Provider switching | Major refactor | 1 line change |
| Structured output | Custom parsing | Type-safe with Zod |
| Error handling | Manual | Built-in |
| Loading states | Manual | Automatic |
## The Bottom Line
The Vercel AI SDK removes all the friction from building AI-powered web applications. Streaming, state management, multi-provider support, structured output, and tool calling — all free, all production-ready.
Start here: sdk.vercel.ai
💡 Need web scraping or data extraction? Check out my Apify actors or email me at spinov001@gmail.com for custom solutions!