## TL;DR
OpenRouter provides a unified API to access 200+ AI models — GPT-4, Claude, Gemini, Llama, Mistral, and more — through one OpenAI-compatible endpoint. Some models are free, and you can switch providers without changing code.
## What Is OpenRouter?
OpenRouter is an AI model router:
- 200+ models — GPT-4o, Claude 3.5, Gemini, Llama 3, Mistral, etc.
- One API — OpenAI-compatible format
- Free models — several open-source models available for free
- Automatic fallbacks — if one provider is down, routes to another
- Usage tracking — monitor spend across all models
- Pay-per-use — no subscriptions, pay only for what you use
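The automatic fallbacks work through OpenRouter's documented `models` parameter: you send an ordered array of backup models alongside the primary `model`, and if the primary errors or is unavailable, OpenRouter retries down the list. The OpenAI SDK's types don't include this OpenRouter-specific field, so here's a sketch using plain `fetch` (model slugs reused from this post):

```typescript
// Build an OpenRouter request body with fallback models. `models` is an
// OpenRouter-specific field, not part of the standard OpenAI API.
function buildFallbackRequest(prompt: string) {
  return {
    model: "openai/gpt-4o", // primary choice
    models: [
      "anthropic/claude-sonnet-4-20250514", // first fallback
      "mistralai/mistral-large",            // second fallback
    ],
    messages: [{ role: "user", content: prompt }],
  };
}

// Send it with plain fetch, since the SDK types omit `models`:
async function askWithFallback(prompt: string) {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildFallbackRequest(prompt)),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```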
## Quick Start

```bash
npm install openai # OpenRouter's API is OpenAI-compatible
```

```typescript
import OpenAI from "openai";

const openrouter = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: "sk-or-YOUR_KEY",
});

const completion = await openrouter.chat.completions.create({
  model: "anthropic/claude-sonnet-4-20250514",
  messages: [
    { role: "user", content: "Explain quantum computing in 3 sentences" },
  ],
});

console.log(completion.choices[0].message.content);
```
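One optional extra OpenRouter supports: the `HTTP-Referer` and `X-Title` headers, which attribute your traffic to your app on the openrouter.ai rankings. With the OpenAI SDK they go in `defaultHeaders`; the URL and app name below are placeholders, not real values:

```typescript
// Build OpenRouter's optional attribution headers. These credit requests
// to your app on the public openrouter.ai rankings.
function attributionHeaders(siteUrl: string, appName: string) {
  return {
    "HTTP-Referer": siteUrl, // your app's URL (placeholder here)
    "X-Title": appName,      // display name on openrouter.ai
  };
}

// Usage with the SDK (sketch):
// const openrouter = new OpenAI({
//   baseURL: "https://openrouter.ai/api/v1",
//   apiKey: process.env.OPENROUTER_API_KEY,
//   defaultHeaders: attributionHeaders("https://your-app.example.com", "My AI App"),
// });
```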
## Switch Models Without Changing Code

```typescript
// Just change the model string!
const models = [
  "openai/gpt-4o",
  "anthropic/claude-sonnet-4-20250514",
  "google/gemini-pro-1.5",
  "meta-llama/llama-3.1-405b-instruct",
  "mistralai/mistral-large",
];

async function askAI(model: string, prompt: string) {
  const response = await openrouter.chat.completions.create({
    model,
    messages: [{ role: "user", content: prompt }],
  });
  return response.choices[0].message.content;
}

// Compare outputs from different models
for (const model of models) {
  console.log(`\n--- ${model} ---`);
  console.log(await askAI(model, "Write a haiku about coding"));
}
```
## Free Models

```typescript
// These models are FREE on OpenRouter:
const freeModels = [
  "meta-llama/llama-3.2-3b-instruct:free",
  "mistralai/mistral-7b-instruct:free",
  "google/gemma-2-9b-it:free",
  "huggingfaceh4/zephyr-7b-beta:free",
];

// Use free models for testing, prototyping, or simple tasks
const response = await openrouter.chat.completions.create({
  model: "meta-llama/llama-3.2-3b-instruct:free",
  messages: [{ role: "user", content: "Hello!" }],
});
```
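A common pattern that follows from this: pick a free model in development and a paid one in production with a single helper. A minimal sketch, assuming a standard `NODE_ENV`-style switch (the slugs are from the lists above):

```typescript
// Choose a model slug based on the environment: free for dev/test,
// paid for production traffic.
function selectModel(env: string | undefined): string {
  return env === "production"
    ? "openai/gpt-4o"                          // paid, higher quality
    : "meta-llama/llama-3.2-3b-instruct:free"; // free, fine for prototyping
}

// const model = selectModel(process.env.NODE_ENV);
```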
## Streaming

```typescript
const stream = await openrouter.chat.completions.create({
  model: "anthropic/claude-sonnet-4-20250514",
  messages: [{ role: "user", content: "Write a short story" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || "");
}
```
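If you need the full text rather than printing chunk-by-chunk, you can accumulate the deltas into one string. A small sketch (the `Chunk` type mirrors the streaming shape used above, trimmed to the fields we read):

```typescript
// Minimal view of a streaming chunk: only the fields we actually read.
type Chunk = { choices: { delta?: { content?: string } }[] };

// Concatenate every delta into the complete response text.
async function collect(stream: AsyncIterable<Chunk>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content ?? "";
  }
  return text;
}
```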
## Function Calling

```typescript
const response = await openrouter.chat.completions.create({
  model: "openai/gpt-4o",
  messages: [{ role: "user", content: "What's the weather in Tokyo?" }],
  tools: [
    {
      type: "function",
      function: {
        name: "get_weather",
        description: "Get current weather for a location",
        parameters: {
          type: "object",
          properties: {
            location: { type: "string" },
          },
          required: ["location"],
        },
      },
    },
  ],
});
```
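The snippet above only declares the tool. When the model decides to use it, the response carries a `tool_calls` entry that your code must execute, then send back as a `tool`-role message. A sketch of that second step (`getWeather` is a placeholder stub, not a real weather API):

```typescript
// Shape of a tool call as returned in the chat completion response.
// Note: `arguments` arrives as a JSON string, not an object.
type ToolCall = {
  id: string;
  function: { name: string; arguments: string };
};

// Placeholder implementation -- swap in a real weather lookup.
function getWeather(location: string): string {
  return `Sunny, 22°C in ${location}`;
}

// Run the requested function and build the follow-up `tool` message,
// which gets appended to `messages` and sent back to the model.
function handleToolCall(call: ToolCall) {
  const args = JSON.parse(call.function.arguments);
  let result: string;
  switch (call.function.name) {
    case "get_weather":
      result = getWeather(args.location);
      break;
    default:
      result = `Unknown tool: ${call.function.name}`;
  }
  return { role: "tool" as const, tool_call_id: call.id, content: result };
}
```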
## Python

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-YOUR_KEY",
)

response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello!"}],
)

print(response.choices[0].message.content)
```
## OpenRouter vs Direct APIs
| Feature | OpenRouter | OpenAI Direct | Anthropic Direct |
|---|---|---|---|
| Models | 200+ | OpenAI only | Claude only |
| One API | Yes | Own format | Own format |
| Free models | Yes | No | No |
| Fallbacks | Automatic | Manual | Manual |
| Usage dashboard | All models | OpenAI only | Anthropic only |
| Pricing | Pass-through + small fee | Direct | Direct |
## Resources

Building AI apps with web data? My Apify scraping tools extract data from any website — feed it to any AI model through OpenRouter. Questions? Email spinov001@gmail.com.