DEV Community

Alex Spinov

Vercel AI SDK Has a Free Toolkit for Building AI-Powered UIs in React

Building a ChatGPT-like interface means handling streaming responses, managing conversation state, dealing with function calling, and connecting to different LLM providers. The Vercel AI SDK handles all of this in a few lines of code.

What Vercel AI SDK Gives You for Free

  • useChat hook — streaming chat UI in one line
  • useCompletion hook — text completion with streaming
  • streamText / generateText — server-side LLM calls
  • Tool calling — let LLMs execute functions
  • Multi-provider — OpenAI, Anthropic, Google, Mistral, Ollama, and more
  • Structured output — type-safe JSON from LLMs with Zod schemas
  • Works with Next.js, SvelteKit, Nuxt, SolidStart, Express

Quick Start

```shell
npm install ai @ai-sdk/openai
```
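
One setup detail the snippets below assume: the `openai()` provider reads your API key from the `OPENAI_API_KEY` environment variable by default. A tiny fail-fast guard can make a missing key obvious at startup (hypothetical helper, not part of the SDK):

```typescript
// Hypothetical helper (not part of the AI SDK): fail fast at startup if a
// required environment variable is missing.
function requireEnv(
  name: string,
  env: Record<string, string | undefined> = process.env,
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// e.g. before serving the first request:
// const apiKey = requireEnv('OPENAI_API_KEY');
```

Failing at boot with a clear message beats a cryptic 401 from the provider on the first chat request.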

Chat Interface in 10 Lines

Server (Next.js API route):

```typescript
// app/api/chat/route.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o'),
    messages
  });

  return result.toDataStreamResponse();
}
```

Client (React component):

```tsx
'use client';
// Note: in AI SDK 4+, import this hook from the separate
// @ai-sdk/react package instead of 'ai/react'.
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map(m => (
        <div key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
        <button>Send</button>
      </form>
    </div>
  );
}
```

That's a fully functional chat interface: responses stream in token by token as the model generates them.
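
Under the hood, `useChat` POSTs to the route, then reads the response body incrementally and re-renders as chunks arrive. A simplified sketch of that consumption step (plain TypeScript; the real SDK additionally parses a framed data-stream protocol rather than raw text):

```typescript
// Roughly what useChat does with the response: read the body stream chunk
// by chunk and append each decoded piece to the message being rendered.
async function accumulateStream(
  stream: ReadableStream<Uint8Array>,
  onDelta: (text: string) => void,
): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let full = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    const delta = decoder.decode(value, { stream: true });
    full += delta;
    onDelta(delta); // in React, this is where a setState triggers a re-render
  }
  return full;
}
```

This is why the UI updates mid-sentence: each chunk fires a state update instead of waiting for the whole completion.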

Structured Output (Type-Safe AI Responses)

```typescript
import { openai } from '@ai-sdk/openai';
import { generateObject } from 'ai';
import { z } from 'zod';

const { object } = await generateObject({
  model: openai('gpt-4o'),
  schema: z.object({
    recipe: z.object({
      name: z.string(),
      ingredients: z.array(z.object({
        item: z.string(),
        amount: z.string()
      })),
      steps: z.array(z.string()),
      cookingTime: z.number().describe('in minutes')
    })
  }),
  prompt: 'Generate a recipe for chocolate chip cookies'
});

// object is FULLY TYPED:
console.log(object.recipe.name); // TypeScript knows this is string
console.log(object.recipe.cookingTime); // TypeScript knows this is number
```
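
Under the hood, `generateObject` prompts the model for JSON and validates it against your Zod schema before returning, which is what makes the result trustworthy at both runtime and the type level. A hand-rolled sketch of that validation step (plain TypeScript, no Zod; field names match the recipe schema above):

```typescript
// What generateObject's validation step does, roughly: parse the model's
// raw JSON and reject anything that doesn't match the expected shape.
// (Hand-rolled guard for two fields; the real SDK delegates this to Zod.)
interface Recipe {
  name: string;
  cookingTime: number; // minutes
}

function parseRecipe(raw: string): Recipe {
  const data = JSON.parse(raw);
  const r = data?.recipe;
  if (typeof r?.name !== 'string' || typeof r?.cookingTime !== 'number') {
    throw new Error('Model output did not match schema');
  }
  return { name: r.name, cookingTime: r.cookingTime };
}
```

The point of the schema is exactly this guard: malformed model output throws instead of silently flowing through your app as `any`.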

Tool Calling (LLMs That Take Action)

```typescript
import { openai } from '@ai-sdk/openai';
import { generateText, tool } from 'ai';
import { z } from 'zod';

const result = await generateText({
  model: openai('gpt-4o'),
  // Allow follow-up steps so the model can turn tool results into a final
  // answer (without this, generateText stops after the first round of calls).
  maxSteps: 3,
  tools: {
    weather: tool({
      description: 'Get weather for a city',
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => {
        // fetchWeather is your own app code — any async function works here
        const data = await fetchWeather(city);
        return `${city}: ${data.temp}°F, ${data.condition}`;
      }
    }),
    searchProducts: tool({
      description: 'Search product catalog',
      parameters: z.object({ query: z.string(), maxPrice: z.number().optional() }),
      execute: async ({ query, maxPrice }) => {
        return await db.products.search(query, { maxPrice });
      }
    })
  },
  prompt: "What's the weather in Tokyo and find me umbrellas under $30"
});
```

The LLM decides which tools to call, the SDK executes them, and the results are fed back to the model so it can combine them into a coherent answer.
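
The control flow behind that is a simple loop: on each turn the model either requests tool calls or emits final text, and tool results are appended to the conversation before the next turn. A minimal sketch with no SDK imports (`runToolLoop` is an illustrative name, not an SDK API; the "model" here is any function):

```typescript
// Sketch of the loop generateText runs for tool calls: ask the model,
// execute any requested tools, feed the results back, repeat.
type ToolCall = { tool: string; args: Record<string, unknown> };
type ModelTurn = { toolCalls: ToolCall[] } | { text: string };
type ToolFn = (args: Record<string, unknown>) => Promise<string>;

async function runToolLoop(
  model: (history: string[]) => ModelTurn,
  tools: Record<string, ToolFn>,
  prompt: string,
  maxSteps = 3,
): Promise<string> {
  const history: string[] = [prompt];
  for (let step = 0; step < maxSteps; step++) {
    const turn = model(history);            // ask the model what to do next
    if ('text' in turn) return turn.text;   // final answer: we're done
    for (const call of turn.toolCalls) {    // otherwise execute each tool...
      const fn = tools[call.tool];
      if (!fn) throw new Error(`Unknown tool: ${call.tool}`);
      history.push(`tool:${call.tool} -> ${await fn(call.args)}`); // ...and record the result
    }
  }
  return history[history.length - 1];       // step budget exhausted
}
```

This is also why `maxSteps` matters in the real SDK: the model needs at least one extra turn after the tool results come back to write its combined answer.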

Swap Providers in One Line

```typescript
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';
import { google } from '@ai-sdk/google';
import { ollama } from 'ollama-ai-provider';

// Same code, different models:
streamText({ model: openai('gpt-4o'), messages });
streamText({ model: anthropic('claude-sonnet-4-20250514'), messages });
streamText({ model: google('gemini-2.0-flash'), messages });
streamText({ model: ollama('llama3'), messages }); // Local!
```
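
One way to take advantage of this is to make the provider a configuration value rather than a hardcoded import. A hypothetical helper that maps a provider name to a default model id (ids copied from the snippet above; not an SDK API):

```typescript
// Hypothetical config helper: map a provider name (e.g. read from an env
// var or a settings table) to a default model id for that provider.
type Provider = 'openai' | 'anthropic' | 'google' | 'ollama';

const DEFAULT_MODEL: Record<Provider, string> = {
  openai: 'gpt-4o',
  anthropic: 'claude-sonnet-4-20250514',
  google: 'gemini-2.0-flash',
  ollama: 'llama3',
};

function defaultModel(provider: Provider): string {
  return DEFAULT_MODEL[provider];
}

// e.g.: streamText({ model: openai(defaultModel('openai')), messages });
```

With this in place, switching your whole app from a hosted model to local Ollama is a one-line config change instead of a code change.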

The Verdict

Vercel AI SDK is the fastest way to build AI-powered UIs. Streaming, tool calling, structured output, multi-provider support — all with TypeScript type safety. If you're building AI features in a web app, this is your starting point.


Need help building AI-powered data pipelines or web scrapers? I build custom solutions. Reach out: spinov001@gmail.com

Check out my awesome-web-scraping collection — 400+ tools for extracting web data.
