brian austin

I built a $2/month Claude API — here's the curl command (and the code behind it)

Every developer I know has the same conversation at some point:

"I want to add AI to my app, but Anthropic's API pricing is... complicated."

Token counts. Context windows. Prompt caching. Input vs output pricing. The math is genuinely hard to predict in production.

So I built something different: a fixed-price Claude API at $2/month, no token counting, no surprise bills.

Here's how it works — including the actual curl command.

The API

curl -X POST https://simplylouie.com/api/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{"message": "Explain async/await in JavaScript"}'
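The endpoint returns a small JSON object. Here's a sketch of the shape (it matches the handler code later in this post; the actual text field will of course vary):

```json
{
  "response": "async/await lets you write promise-based code that reads top to bottom...",
  "model": "claude-opus-4-7"
}
```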

That's it. One endpoint. One response. $2/month flat, regardless of how many times you call it.

Why I built this

I've been using Claude directly via Anthropic's API. The quality is incredible — Claude Opus 4.7 is genuinely the best reasoning model I've used.

But the pricing model creates a specific problem for side projects and indie apps:

  • You don't know what a month will cost until it's over
  • A viral moment (good news!) can become a $500 API bill (bad news)
  • Emerging market users can't predict their costs in local currency

Fixed pricing solves all three.

The implementation

Under the hood, SimplyLouie's API is a Node.js proxy:

// Simplified version of what's running in production
const express = require('express');
const Anthropic = require('@anthropic-ai/sdk');

const app = express();
app.use(express.json()); // required so req.body is parsed as JSON

const client = new Anthropic();

// authenticateUser validates the Bearer token against the user database (elided here)
app.post('/api/chat', authenticateUser, async (req, res) => {
  const { message } = req.body;

  try {
    const response = await client.messages.create({
      model: 'claude-opus-4-7',
      max_tokens: 1024,
      messages: [{ role: 'user', content: message }]
    });

    res.json({
      response: response.content[0].text,
      model: 'claude-opus-4-7'
    });
  } catch (err) {
    // Don't leak upstream details to the caller
    res.status(502).json({ error: 'Upstream model error' });
  }
});

app.listen(3000);

The key insight: I pay Anthropic for a fixed pool of capacity, and pass that through to users at a flat rate. The economics work because most users don't use AI at the same peak moment.
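My actual capacity logic isn't shown here, but the "fixed pool of capacity" idea is commonly implemented with something like a token bucket: requests draw from a shared pool that refills at a steady rate. A minimal sketch in Python (the numbers are made up for illustration; this isn't SimplyLouie's production code):

```python
import time


class TokenBucket:
    """Shared capacity pool: refills at `rate` requests/sec, caps at `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Spend one token if available; otherwise reject the request."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, never exceeding capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


# A pool sized for, say, 10 requests/sec across all subscribers
pool = TokenBucket(rate=10, capacity=100)
```

Because most users are idle at any given moment, a pool far smaller than (subscribers × peak rate) absorbs normal traffic, which is what makes the flat price workable.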

Real-world usage examples

Content summarization

curl -X POST https://simplylouie.com/api/chat \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"message": "Summarize this article in 3 bullet points: [paste article here]"}'

Code review

curl -X POST https://simplylouie.com/api/chat \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"message": "Review this Python function for bugs and suggest improvements: def calculate_total(items): return sum(item.price for item in items)"}'

Translation (great for emerging market apps)

curl -X POST https://simplylouie.com/api/chat \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"message": "Translate to Hindi: Your payment of ₹165 has been processed successfully."}'

Python client

import requests

def ask_louie(message, api_key):
    response = requests.post(
        'https://simplylouie.com/api/chat',
        headers={
            'Authorization': f'Bearer {api_key}',
            'Content-Type': 'application/json'
        },
        json={'message': message}
    )
    return response.json()['response']

# Usage
result = ask_louie(
    'Explain the PIX payment system in Brazil',
    'your-api-key-here'
)
print(result)

JavaScript / Node.js client

async function askLouie(message, apiKey) {
  const response = await fetch('https://simplylouie.com/api/chat', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${apiKey}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ message })
  });

  const data = await response.json();
  return data.response;
}

// Usage
const result = await askLouie(
  'Generate a SPEI payment reference number format explanation',
  'your-api-key-here'
);
console.log(result);

The pricing math

If you're building something that makes ~50 AI calls per month (a side project, a small tool, a demo):

Provider           Cost/month   Billing
Anthropic direct   $5–$50+      Per token
OpenAI direct      $5–$50+      Per token
SimplyLouie        $2           Flat rate
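To sanity-check the table, here's the back-of-envelope math for ~50 calls a month at Opus-class per-token rates. The per-million-token prices below are illustrative assumptions; check Anthropic's current pricing page for real numbers:

```python
# Illustrative only: token prices vary by model and change over time.
PRICE_PER_1M_INPUT = 15.00   # assumed $/1M input tokens
PRICE_PER_1M_OUTPUT = 75.00  # assumed $/1M output tokens


def monthly_cost(calls: int, in_tokens_per_call: int, out_tokens_per_call: int) -> float:
    """Estimated per-token monthly bill for a given call volume."""
    input_cost = calls * in_tokens_per_call / 1_000_000 * PRICE_PER_1M_INPUT
    output_cost = calls * out_tokens_per_call / 1_000_000 * PRICE_PER_1M_OUTPUT
    return input_cost + output_cost
```

Even at modest volume (50 calls, ~500 tokens each way), per-token billing already lands in the same ballpark as the flat rate, and it scales linearly from there while the flat rate doesn't.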

For developers in emerging markets, this matters even more:

  • India: ₹165/month (vs ₹1,600+ for ChatGPT)
  • Nigeria: ₦3,200/month (vs ₦32,000+ for ChatGPT)
  • Philippines: ₱112/month (vs ₱1,120+ for ChatGPT)
  • Indonesia: Rp32,000/month (vs Rp320,000+ for ChatGPT)

Getting your API key

  1. Sign up at simplylouie.com/developers
  2. 7-day free trial, no charge until day 8
  3. $2/month after that — cancel anytime

The API key is in your dashboard after signup.
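Wherever you call the API from, keep the key out of source control. A small helper I'd use (the `LOUIE_API_KEY` variable name is my own convention, not an official one):

```python
import os


def load_api_key(env_var: str = "LOUIE_API_KEY") -> str:
    """Read the API key from the environment; fail loudly if it's missing."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} before calling the API")
    return key
```

Then `ask_louie(message, load_api_key())` works locally and in CI without ever hard-coding the key.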

One more thing

50% of every $2 goes to animal rescue. The same AI that helps you ship faster also feeds dogs waiting for homes.

That's the whole product. One endpoint, flat pricing, good cause.


Questions? Drop them in the comments. I read every one.
