DEV Community

Atlas Whoff


Building with Claude API: Streaming, Tool Use, and System Prompts


The Anthropic API is powerful and well-designed. Here's how to use its key features effectively.

Basic Setup

npm install @anthropic-ai/sdk
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});
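The client also accepts retry and timeout options at construction time. A minimal sketch, assuming server-side use; the specific values here are illustrative, not recommendations:

```typescript
import Anthropic from '@anthropic-ai/sdk';

// maxRetries and timeout are SDK-level client options; the values shown
// are illustrative, not recommendations.
const client = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
  maxRetries: 3,   // retry transient failures (e.g. 429, 5xx)
  timeout: 60_000, // per-request timeout in milliseconds
});
```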

Simple Message

const response = await client.messages.create({
  model: 'claude-sonnet-4-6',
  max_tokens: 1024,
  messages: [{
    role: 'user',
    content: 'Explain async/await in one paragraph',
  }],
});

const text = response.content[0].type === 'text' ? response.content[0].text : '';
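The one-liner above only reads the first content block, but a response can contain several. A small helper that concatenates every text block is safer; `extractText` is a hypothetical helper for illustration, not part of the SDK:

```typescript
// Minimal shape of a response content block, matching what the
// Messages API returns.
type ContentBlock =
  | { type: 'text'; text: string }
  | { type: 'tool_use'; id: string; name: string; input: unknown };

// Hypothetical helper: join every text block, skipping tool_use blocks.
function extractText(content: ContentBlock[]): string {
  return content
    .filter((b): b is { type: 'text'; text: string } => b.type === 'text')
    .map(b => b.text)
    .join('');
}
```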

System Prompts

const response = await client.messages.create({
  model: 'claude-sonnet-4-6',
  max_tokens: 2048,
  system: `You are a senior TypeScript engineer reviewing code for a SaaS product.
Focus on: correctness, security vulnerabilities, and performance.
Return JSON: [{ severity: 'error'|'warning'|'info', message: string, line: number }]`,
  messages: [{ role: 'user', content: codeToReview }],
});
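Even with a JSON-only instruction, models sometimes wrap output in markdown fences, so parse defensively. A sketch; `parseReview` is a hypothetical helper that strips fences and validates the shape the system prompt asked for:

```typescript
type ReviewIssue = {
  severity: 'error' | 'warning' | 'info';
  message: string;
  line: number;
};

// Hypothetical helper: strip an optional markdown code fence, parse,
// and keep only entries that match the expected shape.
function parseReview(raw: string): ReviewIssue[] {
  const stripped = raw.trim().replace(/^```(?:json)?\s*/, '').replace(/```\s*$/, '');
  const data = JSON.parse(stripped);
  if (!Array.isArray(data)) throw new Error('expected a JSON array');
  return data.filter(
    (x): x is ReviewIssue =>
      ['error', 'warning', 'info'].includes(x?.severity) &&
      typeof x?.message === 'string' &&
      typeof x?.line === 'number',
  );
}
```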

Streaming Responses

// Stream to avoid waiting for long completions
const stream = client.messages.stream({
  model: 'claude-sonnet-4-6',
  max_tokens: 4096,
  messages: [{ role: 'user', content: prompt }],
});

// Server-Sent Events to browser
for await (const event of stream) {
  if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
    res.write(`data: ${JSON.stringify({ text: event.delta.text })}\n\n`);
  }
}

const finalMessage = await stream.getFinalMessage();
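The SSE framing above can be factored into a tiny helper, and it is worth aborting the stream when the browser disconnects so you stop paying for tokens nobody reads. A sketch assuming an Express-style `req`/`res`; `sseEvent` is a hypothetical helper, and the `abort()` call assumes the SDK's stream helper exposes one:

```typescript
// Hypothetical helper: frame a text chunk as a Server-Sent Event.
// Each event is a "data:" line followed by a blank line.
function sseEvent(text: string): string {
  return `data: ${JSON.stringify({ text })}\n\n`;
}

// Usage in the streaming loop (Express-style req/res assumed):
//
// req.on('close', () => stream.abort()); // stop generating on disconnect
// for await (const event of stream) {
//   if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
//     res.write(sseEvent(event.delta.text));
//   }
// }
```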

Tool Use

const tools = [{
  name: 'get_weather',
  description: 'Get current weather for a city',
  input_schema: {
    type: 'object' as const,
    properties: {
      city: { type: 'string', description: 'City name' },
    },
    required: ['city'],
  },
}];

const response = await client.messages.create({
  model: 'claude-sonnet-4-6',
  max_tokens: 4096,
  tools,
  messages: [{ role: 'user', content: "What's the weather in Tokyo?" }],
});

if (response.stop_reason === 'tool_use') {
  const toolUse = response.content.find(b => b.type === 'tool_use');
  if (toolUse?.type === 'tool_use') {
    const weatherData = await getWeather(toolUse.input as { city: string });

    // Send the tool result back. Pass the same tools array again so the
    // API can resolve the tool_use block in the conversation history.
    const final = await client.messages.create({
      model: 'claude-sonnet-4-6',
      max_tokens: 1024,
      tools,
      messages: [
        { role: 'user', content: "What's the weather in Tokyo?" },
        { role: 'assistant', content: response.content },
        { role: 'user', content: [{
          type: 'tool_result' as const,
          tool_use_id: toolUse.id,
          content: JSON.stringify(weatherData),
        }]},
      ],
    });
  }
}
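The single round-trip above generalizes to an agentic loop: keep calling the API until the model stops asking for tools. A sketch under stated assumptions; `runTool` and the handler registry are hypothetical helpers, not part of the SDK:

```typescript
type ToolHandler = (input: unknown) => Promise<string>;

// Hypothetical dispatcher: run one tool call against a registry of
// handlers. Unknown tools and handler failures return an error string
// instead of throwing, so the model can see the error and recover.
async function runTool(
  handlers: Record<string, ToolHandler>,
  name: string,
  input: unknown,
): Promise<string> {
  const handler = handlers[name];
  if (!handler) return `Error: unknown tool "${name}"`;
  try {
    return await handler(input);
  } catch (err) {
    return `Error: ${err instanceof Error ? err.message : String(err)}`;
  }
}

// Sketch of the loop (client, tools, messages, handlers from above):
//
// while (true) {
//   const resp = await client.messages.create({ model, max_tokens: 4096, tools, messages });
//   if (resp.stop_reason !== 'tool_use') break;
//   messages.push({ role: 'assistant', content: resp.content });
//   const results = [];
//   for (const block of resp.content) {
//     if (block.type === 'tool_use') {
//       results.push({
//         type: 'tool_result',
//         tool_use_id: block.id,
//         content: await runTool(handlers, block.name, block.input),
//       });
//     }
//   }
//   messages.push({ role: 'user', content: results });
// }
```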

Model Selection

| Model | Speed | Cost | Best For |
|---|---|---|---|
| claude-haiku-4-5-20251001 | Fastest | Cheapest | Classification, routing |
| claude-sonnet-4-6 | Fast | Mid | Most tasks |
| claude-opus-4-6 | Slower | Most expensive | Complex reasoning |

Start with Sonnet. Move to Haiku for high-volume simple tasks, and to Opus only for complex reasoning that genuinely needs it.
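That advice can be captured in a small routing helper. A sketch; the task categories and the mapping are illustrative, and the model IDs are the ones from the table above:

```typescript
type Task = 'classification' | 'routing' | 'general' | 'complex-reasoning';

// Illustrative mapping: Haiku for high-volume simple tasks, Sonnet as
// the default, Opus only where the reasoning depth justifies the cost.
function pickModel(task: Task): string {
  switch (task) {
    case 'classification':
    case 'routing':
      return 'claude-haiku-4-5-20251001';
    case 'complex-reasoning':
      return 'claude-opus-4-6';
    default:
      return 'claude-sonnet-4-6';
  }
}
```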

Claude API integration with streaming, tool use, and agentic patterns is what powers the Workflow Automator MCP: real automation workflows built on the same foundation.
