What is Vercel AI SDK?
Vercel AI SDK is an open-source TypeScript toolkit for building AI-powered applications. It provides a unified API to work with any LLM provider (OpenAI, Anthropic, Google, Mistral, Groq) and React hooks for streaming AI responses in your frontend.
Why Vercel AI SDK?
- Free and open-source — MIT license, no vendor lock-in
- Unified API — same code works with OpenAI, Anthropic, Google, Mistral, Groq
- Streaming built-in — real-time AI responses with React hooks
- Structured output — type-safe AI responses with Zod schemas
- Tool calling — unified tool/function calling across all providers
- Next.js optimized — seamless integration with App Router
Quick Start
```bash
npx create-next-app@latest my-ai-app
cd my-ai-app
npm install ai @ai-sdk/openai
```
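Before wiring up routes, you can smoke-test the install with a minimal non-streaming call. A sketch, assuming `OPENAI_API_KEY` is exported in your shell (the filename is arbitrary):

```typescript
// sanity-check.ts — verify the SDK and your API key work.
// Run with: npx tsx sanity-check.ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const { text } = await generateText({
  model: openai('gpt-4-turbo'),
  prompt: 'Say hello in one short sentence.',
});

console.log(text);
```

If this prints a greeting, your key and install are good; everything below builds on the same `model` + `messages`/`prompt` shape.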
Server-Side: Route Handler
```typescript
// app/api/chat/route.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4-turbo'),
    messages,
    system: 'You are a helpful DevOps assistant.',
  });

  return result.toDataStreamResponse();
}
```
Client-Side: Chat UI
```tsx
// app/page.tsx
'use client';
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat();

  return (
    <div className="max-w-2xl mx-auto p-4">
      {messages.map(m => (
        <div key={m.id} className={m.role === 'user' ? 'text-blue-600' : 'text-gray-800'}>
          <strong>{m.role}:</strong> {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Ask about DevOps..."
          className="w-full p-2 border rounded"
          disabled={isLoading}
        />
      </form>
    </div>
  );
}
```
Switch Providers with One Line
```typescript
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';
import { google } from '@ai-sdk/google';
import { mistral } from '@ai-sdk/mistral';

// Same code, different provider
const result = streamText({
  model: anthropic('claude-3-5-sonnet-20241022'),
  // model: openai('gpt-4-turbo'),
  // model: google('gemini-1.5-pro'),
  // model: mistral('mistral-large-latest'),
  messages,
});
```
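That one-line swap also works at runtime. A hypothetical helper (not part of the SDK) that picks the provider from an environment variable, so you can A/B test without a deploy:

```typescript
// lib/model.ts — hypothetical: choose the provider via AI_PROVIDER.
// Model IDs mirror the examples above; adjust to taste.
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';

export function pickModel() {
  switch (process.env.AI_PROVIDER) {
    case 'anthropic':
      return anthropic('claude-3-5-sonnet-20241022');
    default:
      return openai('gpt-4-turbo');
  }
}
```

Then every `streamText` call takes `model: pickModel()` and the provider becomes a config concern, not a code change.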
Structured Output with Zod
```typescript
import { generateObject } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const { object } = await generateObject({
  model: openai('gpt-4-turbo'),
  schema: z.object({
    name: z.string(),
    pros: z.array(z.string()),
    cons: z.array(z.string()),
    rating: z.number().min(1).max(10),
    recommendation: z.string(),
  }),
  prompt: 'Review ArgoCD for GitOps deployments',
});

// object is fully typed!
console.log(object.pros); // string[]
```
Tool Calling
```typescript
import { streamText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const result = streamText({
  model: openai('gpt-4-turbo'),
  messages,
  tools: {
    getWeather: tool({
      description: 'Get weather for a city',
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => {
        // Placeholder endpoint — swap in your weather API of choice
        const res = await fetch(`https://api.weather.com/${city}`);
        return res.json();
      },
    }),
    searchDocs: tool({
      description: 'Search documentation',
      parameters: z.object({ query: z.string() }),
      execute: async ({ query }) => {
        // searchIndex is your own search implementation
        return searchIndex(query);
      },
    }),
  },
});
```
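One gotcha: by default the model stops after producing its first tool call, without reading the result. To let it call a tool and then answer with what came back, allow multiple steps. A minimal sketch with `maxSteps` and a trivial tool (the `getTime` tool and prompt are illustrative):

```typescript
import { streamText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const messages = [{ role: 'user' as const, content: 'What time is it on the server?' }];

const result = streamText({
  model: openai('gpt-4-turbo'),
  messages,
  maxSteps: 5, // let the model call tools, read the results, and respond
  tools: {
    getTime: tool({
      description: 'Get the current server time',
      parameters: z.object({}),
      execute: async () => ({ now: new Date().toISOString() }),
    }),
  },
});
```

With `maxSteps` set, the SDK feeds tool results back to the model automatically until it produces a final text answer or hits the cap.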
Vercel AI SDK vs Alternatives
| Feature | Vercel AI SDK | LangChain.js | OpenAI SDK |
|---|---|---|---|
| Multi-provider | Yes | Yes | OpenAI only |
| React hooks | Built-in | None | None |
| Streaming | Native | Manual | Manual |
| Type safety | Full TypeScript | Partial | Good |
| Structured output | Zod schemas | Parsers | JSON mode |
| Bundle size | ~15KB | ~200KB+ | ~50KB |
Real-World Impact
A SaaS startup built its AI chat feature on the raw OpenAI SDK; switching to Anthropic for better reasoning meant rewriting the entire integration. After migrating to Vercel AI SDK, they A/B test across four providers by changing one line, and development time for new AI features dropped from weeks to days.
Building AI-powered web apps? I help teams ship production AI features fast. Contact spinov001@gmail.com or explore my data tools on Apify.