
Serhii Kalyna

Posted on • Originally published at kalyna.pro

How to Use Claude API with Node.js (Complete Guide, 2026)

The official Anthropic SDK supports Node.js and TypeScript out of the box. This guide covers everything from installation to streaming, tool use, and production best practices.


Installation

npm install @anthropic-ai/sdk
export ANTHROPIC_API_KEY="your-key-here"

First Message

import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic();

const response = await client.messages.create({
  model: "claude-sonnet-4-6",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Explain async/await in JavaScript in 3 sentences." }],
});

const block = response.content[0];
if (block.type === "text") console.log(block.text);

CommonJS:

const Anthropic = require("@anthropic-ai/sdk");
const client = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

Streaming

For long responses, stream tokens as they arrive:

const stream = client.messages.stream({
  model: "claude-sonnet-4-6",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Write a short story about a developer." }],
});

for await (const event of stream) {
  if (event.type === "content_block_delta" && event.delta.type === "text_delta") {
    process.stdout.write(event.delta.text);
  }
}

const finalMessage = await stream.finalMessage();
console.log("\nUsage:", finalMessage.usage);
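The delta-handling logic in the loop above can be pulled out into a pure function, which makes it easy to unit-test without hitting the API. The event shapes below mirror the SDK's `content_block_delta` stream events; the `collectText` helper itself is my own sketch, not part of the SDK:

```typescript
// Minimal shapes of the streaming events this helper understands
// (modeled on the SDK's stream events; other event types are ignored).
type StreamEvent =
  | { type: "content_block_delta"; delta: { type: "text_delta"; text: string } }
  | { type: "content_block_delta"; delta: { type: "input_json_delta"; partial_json: string } }
  | { type: "message_start" }
  | { type: "message_stop" };

// Fold a sequence of stream events into the full response text.
function collectText(events: StreamEvent[]): string {
  let out = "";
  for (const e of events) {
    if (e.type === "content_block_delta" && e.delta.type === "text_delta") {
      out += e.delta.text;
    }
  }
  return out;
}
```

The same accumulator works inside the `for await` loop, so the rendering logic and the network loop stay separate.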

Conversation History

import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic();
const history: Anthropic.MessageParam[] = [];

async function chat(userMessage: string): Promise<string> {
  history.push({ role: "user", content: userMessage });

  const response = await client.messages.create({
    model: "claude-sonnet-4-6",
    max_tokens: 1024,
    system: "You are a helpful coding assistant.",
    messages: history,
  });

  const first = response.content[0];
  const reply = first.type === "text" ? first.text : "";
  history.push({ role: "assistant", content: reply });
  return reply;
}

console.log(await chat("What is a closure?"));
console.log(await chat("Show me an example."));
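Note that `history` grows without bound, and every call resends all of it as input tokens. A simple mitigation is to keep only the most recent exchanges. The helper below is my own sketch (not part of the SDK): it trims to the last `maxTurns` user/assistant pairs and keeps the array starting on a user message, which the Messages API requires:

```typescript
type Msg = { role: "user" | "assistant"; content: string };

// Keep at most the last `maxTurns` user/assistant exchanges.
// Assumes history alternates user, assistant, user, assistant, ...
function trimHistory(history: Msg[], maxTurns: number): Msg[] {
  const keep = maxTurns * 2; // each turn = one user + one assistant message
  const trimmed = history.slice(-keep);
  // The Messages API requires the conversation to start with a user message.
  while (trimmed.length > 0 && trimmed[0].role !== "user") {
    trimmed.shift();
  }
  return trimmed;
}
```

Call it before each request (`messages: trimHistory(history, 10)`). For anything serious you would summarize dropped turns rather than discard them, but the sliding window is a reasonable default.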

Tool Use (Function Calling)

import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic();

const tools: Anthropic.Tool[] = [
  {
    name: "get_weather",
    description: "Get the current weather for a city",
    input_schema: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name" },
      },
      required: ["city"],
    },
  },
];

async function getWeather(city: string): Promise<string> {
  const res = await fetch(`https://wttr.in/${encodeURIComponent(city)}?format=3`);
  return res.text();
}

async function runWithTools(question: string): Promise<string> {
  const messages: Anthropic.MessageParam[] = [{ role: "user", content: question }];

  while (true) {
    const response = await client.messages.create({
      model: "claude-sonnet-4-6",
      max_tokens: 1024,
      tools,
      messages,
    });

    if (response.stop_reason === "end_turn") {
      const textBlock = response.content.find((b) => b.type === "text");
      return textBlock?.text ?? "";
    }

    if (response.stop_reason === "tool_use") {
      messages.push({ role: "assistant", content: response.content });

      const toolResults: Anthropic.ToolResultBlockParam[] = [];
      for (const block of response.content) {
        if (block.type === "tool_use") {
          const input = block.input as { city: string };
          const result = await getWeather(input.city);
          toolResults.push({ type: "tool_result", tool_use_id: block.id, content: result });
        }
      }
      messages.push({ role: "user", content: toolResults });
      continue;
    }

    // Any other stop_reason (e.g. "max_tokens") would make this loop spin forever.
    throw new Error(`Unexpected stop_reason: ${response.stop_reason}`);
  }
}

console.log(await runWithTools("What's the weather in Kyiv?"));
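With more than one tool, hard-coding a `getWeather` call in the loop stops scaling. One common pattern is a dispatch map from tool name to handler, which the `tool_use` branch can consult generically. This is a sketch of my own, not SDK machinery; the `get_time` tool and the stub handlers are hypothetical:

```typescript
// Map tool names to async handlers. Inputs arrive as untyped JSON from the
// model, so each handler validates or casts its own input.
type ToolHandler = (input: Record<string, unknown>) => Promise<string>;

const handlers: Record<string, ToolHandler> = {
  get_weather: async (input) => `weather for ${String(input.city)}`, // stub for illustration
  get_time: async () => new Date().toISOString(),                    // hypothetical second tool
};

async function dispatchTool(name: string, input: Record<string, unknown>): Promise<string> {
  const handler = handlers[name];
  if (!handler) {
    // Returning an error string as the tool result lets the model recover
    // instead of crashing the loop.
    return `Error: unknown tool "${name}"`;
  }
  return handler(input);
}
```

Inside the loop, the per-block code then collapses to `const result = await dispatchTool(block.name, block.input as Record<string, unknown>);`.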

Prompt Caching

Cache long system prompts to cut input costs: cached reads are billed at roughly a tenth of the normal input-token price, a saving of up to 90%. Note that prompts below the model's minimum cacheable length are not cached at all.

const response = await client.messages.create({
  model: "claude-sonnet-4-6",
  max_tokens: 1024,
  system: [
    {
      type: "text",
      text: "You are an expert TypeScript developer with deep knowledge of Node.js...",
      cache_control: { type: "ephemeral" },
    },
  ],
  messages: [{ role: "user", content: "How do I handle backpressure in streams?" }],
});

console.log("Cache read tokens:", response.usage.cache_read_input_tokens);
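If several endpoints share the same long system prompt, it helps to build the cached block in one place. The helper below is my own sketch; it just produces the `system` array shape the API expects, with `cache_control` on the block:

```typescript
// Shape of a system prompt block as the Messages API accepts it.
type SystemBlock = {
  type: "text";
  text: string;
  cache_control?: { type: "ephemeral" };
};

// Wrap a system prompt so its prefix is cached across requests.
function cachedSystem(text: string): SystemBlock[] {
  return [{ type: "text", text, cache_control: { type: "ephemeral" } }];
}
```

Then each call site is just `system: cachedSystem(LONG_PROMPT)`, and the caching policy lives in one function.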

Error Handling

import Anthropic, { APIError, RateLimitError, APIConnectionError } from "@anthropic-ai/sdk";

const client = new Anthropic();

async function safeCreate(content: string): Promise<string | null> {
  try {
    const response = await client.messages.create({
      model: "claude-sonnet-4-6",
      max_tokens: 512,
      messages: [{ role: "user", content }],
    });
    const block = response.content[0];
    return block.type === "text" ? block.text : null;
  } catch (err) {
    if (err instanceof RateLimitError) {
      console.error("Rate limited — back off and retry");
    } else if (err instanceof APIConnectionError) {
      console.error("Network error:", err.message);
    } else if (err instanceof APIError) {
      console.error(`API error ${err.status}:`, err.message);
    } else {
      throw err; // not an API error, so don't swallow it
    }
    return null;
  }
}
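The rate-limit branch above logs "back off and retry" without actually doing it. A generic retry wrapper with exponential backoff and jitter is straightforward; this is my own sketch, and it retries on any error, so in practice you would add a predicate that matches only `RateLimitError` and transient failures:

```typescript
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

// Retry `fn` up to `maxAttempts` times, doubling the delay each time
// and adding a little jitter so concurrent clients don't retry in lockstep.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (attempt === maxAttempts - 1) break;
      const delay = baseDelayMs * 2 ** attempt + Math.random() * 100;
      await sleep(delay);
    }
  }
  throw lastErr;
}
```

Usage: `await withRetry(() => safeCreate("Hello"))`. Keep in mind the SDK already retries some failures itself (configurable via the client's `maxRetries` option), so keep application-level retries conservative.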

Express.js Integration

import express from "express";
import Anthropic from "@anthropic-ai/sdk";

const app = express();
app.use(express.json());
const client = new Anthropic();

app.post("/chat", async (req, res) => {
  const { message } = req.body;
  if (!message) return res.status(400).json({ error: "message required" });

  try {
    const response = await client.messages.create({
      model: "claude-sonnet-4-6",
      max_tokens: 1024,
      messages: [{ role: "user", content: message }],
    });
    const block = response.content[0];
    res.json({ reply: block.type === "text" ? block.text : "" });
  } catch (err) {
    // Express 4 does not forward rejected promises to its error handler,
    // so an uncaught failure here would leave the request hanging.
    console.error(err);
    res.status(502).json({ error: "upstream error" });
  }
});

app.listen(3000, () => console.log("Server running on :3000"));
