I built a $2/month Claude API proxy — here's the curl command
I got tired of paying $20/month for Claude Pro just to use Claude Code.
So I built a flat-rate Claude API proxy. Here's everything you need to use it.
The one-line setup
export ANTHROPIC_BASE_URL=https://api.simplylouie.com
export ANTHROPIC_API_KEY=sl_your_key_here
That's it. Every tool that uses ANTHROPIC_BASE_URL now routes through the proxy.
Test it with curl
curl https://api.simplylouie.com/v1/messages \
-H "x-api-key: $ANTHROPIC_API_KEY" \
-H "anthropic-version: 2023-06-01" \
-H "content-type: application/json" \
-d '{
"model": "claude-opus-4-5",
"max_tokens": 1024,
"messages": [{"role": "user", "content": "Hello"}]
}'
You get a real Anthropic-format response. Your existing tools don't know the difference.
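If you're scripting against the raw endpoint, the JSON comes back in Anthropic's Messages API shape. Here's a minimal sketch of pulling the reply text out in Python — the payload below is illustrative (made-up field values), only the structure matters:

```python
# Illustrative response payload in the Anthropic Messages API shape.
# Values are made up; the structure is what your code should rely on.
response = {
    "id": "msg_example",
    "type": "message",
    "role": "assistant",
    "model": "claude-opus-4-5",
    "content": [{"type": "text", "text": "Hello! How can I help?"}],
    "stop_reason": "end_turn",
}

# The assistant's reply is a list of content blocks; join the text ones.
reply = "".join(
    block["text"] for block in response["content"] if block["type"] == "text"
)
print(reply)
```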
Use it with Claude Code
# In your shell profile
export ANTHROPIC_BASE_URL=https://api.simplylouie.com
export ANTHROPIC_API_KEY=sl_your_key_here
# Now run Claude Code normally
claude
Claude Code reads ANTHROPIC_BASE_URL automatically. No config files needed.
Use it with Python
import anthropic
client = anthropic.Anthropic(
api_key="sl_your_key_here",
base_url="https://api.simplylouie.com"
)
message = client.messages.create(
model="claude-opus-4-5",
max_tokens=1024,
messages=[{"role": "user", "content": "Hello, Claude"}]
)
print(message.content)
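Since the proxy is an extra hop between you and Anthropic, transient errors are worth retrying. A minimal retry sketch you could wrap around `client.messages.create` — the attempt count and backoff numbers are arbitrary choices of mine, not anything the proxy requires:

```python
import time

def with_retries(call, attempts=3, base_delay=1.0):
    """Call `call()`, retrying on any exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; surface the error
            time.sleep(base_delay * 2 ** attempt)

# Usage with the client from above:
# message = with_retries(lambda: client.messages.create(
#     model="claude-opus-4-5",
#     max_tokens=1024,
#     messages=[{"role": "user", "content": "Hello, Claude"}],
# ))
```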
Use it with the JS/TS SDK
import Anthropic from '@anthropic-ai/sdk';
const client = new Anthropic({
apiKey: 'sl_your_key_here',
baseURL: 'https://api.simplylouie.com',
});
const message = await client.messages.create({
model: 'claude-opus-4-5',
max_tokens: 1024,
messages: [{ role: 'user', content: 'Hello' }],
});
console.log(message.content);
Use it with LangChain
from langchain_anthropic import ChatAnthropic
llm = ChatAnthropic(
model="claude-opus-4-5",
anthropic_api_key="sl_your_key_here",
anthropic_api_url="https://api.simplylouie.com"
)
response = llm.invoke("Explain async/await in one paragraph")
print(response.content)
Use it with LlamaIndex
from llama_index.llms.anthropic import Anthropic
llm = Anthropic(
model="claude-opus-4-5",
api_key="sl_your_key_here",
base_url="https://api.simplylouie.com"
)
response = llm.complete("Write a haiku about programming")
print(response.text)
What models are available?
All current Claude models:
- claude-opus-4-5 — most capable
- claude-sonnet-4-5 — balanced speed/quality
- claude-haiku-3-5 — fastest, cheapest
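If you switch models per task, a tiny helper keeps the tradeoff in one place. The mapping below is my own guess at sensible defaults using the model IDs listed above — nothing the proxy enforces:

```python
# Hypothetical task -> model mapping; adjust to taste.
MODELS = {
    "complex": "claude-opus-4-5",    # most capable
    "default": "claude-sonnet-4-5",  # balanced speed/quality
    "bulk":    "claude-haiku-3-5",   # fastest, cheapest
}

def pick_model(task: str = "default") -> str:
    """Return the model ID for a task, falling back to the default."""
    return MODELS.get(task, MODELS["default"])

print(pick_model("bulk"))  # claude-haiku-3-5
```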
How much does it cost?
$2/month flat rate.
No per-token billing. No usage meters. No surprise invoices.
For comparison:
- Anthropic direct API: ~$15/million input tokens (Opus)
- Claude Pro subscription: $20/month (only for claude.ai, not API)
- SimplyLouie: $2/month for API access
If you're building something small, running experiments, or using Claude Code daily — flat rate saves you money.
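Rough break-even math, if you want to sanity-check that comparison yourself. This uses the input-token price quoted above and ignores output tokens (which cost more), so treat it as a lower bound:

```python
# At ~$15 per million input tokens (Opus, per the comparison above),
# how many input tokens does $2 of per-token billing buy?
price_per_million = 15.0   # USD per 1M input tokens
flat_rate = 2.0            # USD per month

breakeven_tokens = flat_rate / price_per_million * 1_000_000
print(f"{breakeven_tokens:,.0f} input tokens")  # ~133,333
```

Past that many input tokens in a month, the flat rate comes out ahead.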
The 7-day free trial
Get your API key at simplylouie.com/developers
A card is required to start (standard SaaS practice); you won't be charged for 7 days.
Why I built this
I was spending $20/month on Claude Pro just to get API access for Claude Code. Most of that quota sat unused.
A flat rate makes more sense for developers who use Claude in bursts — heavy one week, light the next.
50% of revenue goes to animal rescue. That part wasn't required, it just felt right.
Drop your questions below — happy to help you get set up.