
brian austin

I built a $2/month Claude API proxy — here's the curl command

Last month I was spending $20/month on ChatGPT Plus and another $20/month on Claude Pro. For a side project developer, that's $40/month gone before I write a single line of code.

So I built a proxy. Here's how to use it in 60 seconds.

The curl command

```bash
curl https://api.simplylouie.com/v1/messages \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-opus-4-5",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Hello, Claude!"}
    ]
  }'
```

That's it. Claude's API, flat-rate, $2/month.
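
Assuming the proxy passes responses through unchanged, you should get the standard Anthropic Messages API response shape back, something like this (the `id` and token counts here are illustrative):

```json
{
  "id": "msg_...",
  "type": "message",
  "role": "assistant",
  "model": "claude-opus-4-5",
  "content": [
    {"type": "text", "text": "Hello! How can I help you today?"}
  ],
  "stop_reason": "end_turn",
  "usage": {"input_tokens": 10, "output_tokens": 12}
}
```

The `content` array and `usage` counts are the same fields the official SDKs expose.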

Why I built this

Anthropic's direct API pricing is per-token. Claude Opus costs $15 per million input tokens. If you're a developer running experiments, debugging sessions, or building side projects, the bill adds up fast.
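
To see how fast, here's a back-of-envelope sketch. The $15/M input rate is the figure above; the $75/M output rate is an assumption for illustration:

```python
# Rough per-session cost at direct per-token Opus pricing.
INPUT_PER_MTOK = 15.00   # $ per 1M input tokens (stated above)
OUTPUT_PER_MTOK = 75.00  # $ per 1M output tokens (assumed for illustration)

def session_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one session at per-token rates."""
    return (input_tokens / 1_000_000 * INPUT_PER_MTOK
            + output_tokens / 1_000_000 * OUTPUT_PER_MTOK)

# One long debugging session: 200k tokens in, 40k tokens out
print(f"${session_cost(200_000, 40_000):.2f}")  # ≈ $6.00
```

A couple of sessions like that per day and you clear the old $40/month subscription bill within the first week.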

Claude Code's ANTHROPIC_BASE_URL environment variable lets you point it at any compatible endpoint:

```bash
export ANTHROPIC_BASE_URL=https://api.simplylouie.com
export ANTHROPIC_API_KEY=your_simplylouie_key

# Now Claude Code uses the proxy
claude
```

Same API. Same models. Flat rate.

What works

The proxy supports the full Anthropic Messages API, so the official SDKs work with just a base URL override. Python:

```python
import anthropic

client = anthropic.Anthropic(
    api_key="your_simplylouie_key",
    base_url="https://api.simplylouie.com"
)

message = client.messages.create(
    model="claude-opus-4-5",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Explain async/await in Python"}
    ]
)

print(message.content[0].text)
```
And the same call from the JavaScript SDK:

```javascript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic({
  apiKey: 'your_simplylouie_key',
  baseURL: 'https://api.simplylouie.com'
});

const message = await client.messages.create({
  model: 'claude-opus-4-5',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Explain async/await in JavaScript' }]
});

console.log(message.content[0].text);
```

Claude Code integration

This is the main use case. Claude Code reads ANTHROPIC_BASE_URL automatically:

```bash
# Add to ~/.zshrc or ~/.bashrc
export ANTHROPIC_BASE_URL=https://api.simplylouie.com
export ANTHROPIC_API_KEY=sk-sl-your-key-here
```

Now every claude session uses the proxy. No per-token billing. No surprise charges.

```bash
# Test it works
claude --version
claude "what model are you?"
```

The pricing math

For a developer running Claude Code daily:

| Usage pattern | Direct API cost | SimplyLouie cost |
| --- | --- | --- |
| Light (30 min/day) | ~$8/month | $2/month |
| Medium (2 hrs/day) | ~$25/month | $2/month |
| Heavy (4+ hrs/day) | ~$60/month | $2/month |
| Code review automation | ~$40/month | $2/month |

The more you use it, the bigger the saving.
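
You can sketch the break-even point from the table's medium row (~$25 of direct API spend at 2 hrs/day, roughly 60 hours/month); the per-hour figure below is an assumption derived from that estimate:

```python
# Break-even vs. the flat $2/month rate, using the table's
# medium estimate (~$25 of direct API spend over ~60 hrs/month).
direct_cost_per_hour = 25 / 60   # ≈ $0.42/hr (assumed average)
flat_rate = 2.00

break_even_hours = flat_rate / direct_cost_per_hour
print(f"break-even at {break_even_hours:.1f} hrs/month")  # 4.8
```

Under these assumptions, anything past about five hours a month and the flat rate wins.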

What about Codex?

OpenAI just announced Codex is moving to API pricing. If you were using Codex for flat-rate coding assistance, you're now looking at per-token billing.

The ANTHROPIC_BASE_URL approach with a proxy gives you the same flat-rate model, but with Claude's models instead.

Global pricing

The price is a flat $2/month everywhere. In local currency:

  • India: that's ₹165/month (vs ₹1,600+ for ChatGPT)
  • Nigeria: that's ₦3,200/month (vs ₦32,000+ for ChatGPT)
  • Philippines: that's ₱112/month (vs ₱1,120+ for ChatGPT)
  • Indonesia: that's Rp32,000/month (vs Rp320,000+ for ChatGPT)

Same API. Same models. Priced for where you actually live.

Try it

7-day free trial, no commitment. If you're running Claude Code or building with the Anthropic API, it's worth 5 minutes to test.

simplylouie.com/developers

The curl command above works with your trial key immediately after signup.


Built this because I was tired of the per-token math. 50% of revenue goes to animal rescue. Yes, really.
