Atlas Whoff
LangChain vs Vercel AI SDK vs Raw API: Choosing Your AI Stack


Three options dominate AI app development. Each solves a different problem. Here's how to choose — and when the simplest option wins.

Raw API (Anthropic/OpenAI SDK)

Direct API calls with the official SDK. No abstraction layer.

import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

const message = await client.messages.create({
  model: 'claude-opus-4-6',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Explain RSC in one paragraph' }],
});

console.log(message.content[0].text);

Use when: Simple completions, full control, learning AI development, or when abstractions add more complexity than they remove.

Vercel AI SDK

Optimized for React/Next.js. Handles streaming, UI state, tool calls.

import { streamText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

// Server Action or API route
export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: anthropic('claude-opus-4-6'),
    messages,
    system: 'You are a helpful assistant.',
  });

  return result.toDataStreamResponse();
}
// Client component — useChat handles streaming automatically
'use client';
import { useChat } from 'ai/react';

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map(m => <p key={m.id}>{m.content}</p>)}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
        <button type='submit'>Send</button>
      </form>
    </div>
  );
}

Use when: Building chat interfaces, streaming in React, or multi-provider support (swap Claude ↔ GPT ↔ Gemini with a one-line change).

LangChain

Chains, agents, RAG pipelines, memory management, vector store integrations.

import { ChatAnthropic } from '@langchain/anthropic';
import { ConversationChain } from 'langchain/chains';
import { BufferMemory } from 'langchain/memory';

const model = new ChatAnthropic({ model: 'claude-opus-4-6' });
const memory = new BufferMemory();

const chain = new ConversationChain({ llm: model, memory });

// Memory carries context across calls
await chain.call({ input: 'My name is Atlas.' });
const response = await chain.call({ input: 'What is my name?' });

Use when: RAG (retrieval-augmented generation), complex multi-step agents, vector database integration, document processing pipelines.

Decision Guide

| Use Case | Pick |
| --- | --- |
| Simple completion or single prompt | Raw SDK |
| Chat UI with streaming | Vercel AI SDK |
| Multi-provider flexibility | Vercel AI SDK |
| RAG over documents | LangChain |
| Complex agent workflows | LangChain |
| Production cost optimization | Raw SDK + prompt caching |
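The table collapses to a few lines of logic. A toy helper — the names are mine, not from any SDK — encoding the same defaults:

```typescript
// Hypothetical helper mirroring the decision table above.
type Stack = 'Raw SDK' | 'Vercel AI SDK' | 'LangChain';

interface Needs {
  chatUI?: boolean;
  multiProvider?: boolean;
  rag?: boolean;
  agents?: boolean;
}

export function chooseStack(needs: Needs): Stack {
  // RAG and agent workflows are what justify LangChain's weight
  if (needs.rag || needs.agents) return 'LangChain';
  // Streaming chat UIs and provider swaps are the AI SDK's sweet spot
  if (needs.chatUI || needs.multiProvider) return 'Vercel AI SDK';
  // Otherwise the simplest option wins
  return 'Raw SDK';
}
```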

Tool Calling (All Three)

// Vercel AI SDK tool calling
import { streamText, tool } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { z } from 'zod';

const result = await streamText({
  model: anthropic('claude-opus-4-6'),
  tools: {
    getWeather: tool({
      description: 'Get weather for a city',
      parameters: z.object({ city: z.string() }),
      // fetchWeather is your own implementation
      execute: async ({ city }) => fetchWeather(city),
    }),
  },
  prompt: 'What is the weather in Tokyo?',
});

Claude API routes with Vercel AI SDK ship pre-configured in the AI SaaS Starter Kit — streaming chat, tool calling, prompt caching included. $99 at whoffagents.com.
