brian austin

I built a $2/month Claude API wrapper — here's the curl command
Most developers I know are paying $20/month for ChatGPT Plus or Claude Pro. They use maybe 10% of what they're paying for.

I built SimplyLouie — a flat-rate Claude API wrapper at $2/month. No token counting. No surprise bills. The same claude-3-5-sonnet model underneath.

Here's everything you need to integrate it.

The curl command

curl -X POST https://simplylouie.com/api/chat \
  -H "Content-Type: application/json" \
  -H "X-API-Key: YOUR_API_KEY" \
  -d '{
    "message": "Explain async/await in JavaScript in 2 sentences"
  }'

Response:

{
  "response": "Async/await is syntactic sugar over Promises that lets you write asynchronous code that reads like synchronous code. The `await` keyword pauses execution until a Promise resolves, while `async` marks a function as returning a Promise.",
  "model": "claude-3-5-sonnet"
}

That's it. No SDK. No authentication flow. No token accounting.

Node.js integration

const fetch = require('node-fetch'); // or use native fetch in Node 18+

async function ask(message) {
  const response = await fetch('https://simplylouie.com/api/chat', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-Key': process.env.SIMPLYLOUIE_API_KEY
    },
    body: JSON.stringify({ message })
  });

  const data = await response.json();
  return data.response;
}

// Usage — top-level await needs an ES module; in CommonJS, wrap it in an async IIFE
(async () => {
  const answer = await ask('What is the time complexity of quicksort?');
  console.log(answer);
})();

With conversation history

const fetch = require('node-fetch');

class SimpleChat {
  constructor(apiKey) {
    this.apiKey = apiKey;
    this.history = [];
  }

  async send(message) {
    this.history.push({ role: 'user', content: message });

    const response = await fetch('https://simplylouie.com/api/chat', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'X-API-Key': this.apiKey
      },
      body: JSON.stringify({
        message,
        history: this.history.slice(-10) // last 10 turns
      })
    });

    const data = await response.json();
    this.history.push({ role: 'assistant', content: data.response });
    return data.response;
  }

  reset() {
    this.history = [];
  }
}

// Usage — wrapped in an async IIFE for CommonJS
(async () => {
  const chat = new SimpleChat(process.env.SIMPLYLOUIE_API_KEY);
  await chat.send('My name is Alex');
  const reply = await chat.send('What is my name?'); // Remembers context
  console.log(reply); // 'Your name is Alex'
})();

Python integration

import requests
import os

def ask(message: str) -> str:
    response = requests.post(
        'https://simplylouie.com/api/chat',
        headers={
            'Content-Type': 'application/json',
            'X-API-Key': os.environ['SIMPLYLOUIE_API_KEY']
        },
        json={'message': message},
        timeout=30  # don't hang forever on a stalled connection
    )
    response.raise_for_status()  # surface HTTP errors instead of a cryptic KeyError
    return response.json()['response']

# Usage
print(ask('Write a Python function to reverse a linked list'))

Streaming responses (SSE)

For real-time streaming output:

const EventSource = require('eventsource'); // npm package — the browser's EventSource can't set custom headers

function streamAsk(message, apiKey) {
  const url = `https://simplylouie.com/api/chat/stream?message=${encodeURIComponent(message)}`;

  const es = new EventSource(url, {
    headers: { 'X-API-Key': apiKey }
  });

  es.onmessage = (event) => {
    if (event.data === '[DONE]') {
      es.close();
      process.stdout.write('\n');
      return;
    }
    process.stdout.write(event.data);
  };

  es.onerror = () => es.close();
}

streamAsk('Write a haiku about JavaScript promises', process.env.SIMPLYLOUIE_API_KEY);

Error handling

async function askSafe(message, apiKey) {
  try {
    const response = await fetch('https://simplylouie.com/api/chat', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'X-API-Key': apiKey
      },
      body: JSON.stringify({ message }),
      signal: AbortSignal.timeout(30000) // 30s timeout
    });

    if (!response.ok) {
      const error = await response.json().catch(() => ({})); // body may not be JSON
      throw new Error(`API error ${response.status}: ${error.message || response.statusText}`);
    }

    return await response.json();
  } catch (err) {
    if (err.name === 'TimeoutError') {
      throw new Error('Request timed out after 30s');
    }
    throw err;
  }
}

Why flat-rate instead of pay-per-token?

Anthropic charges ~$3 per million input tokens for claude-3-5-sonnet. A typical developer workflow with 50 queries/day at ~500 tokens each = 25,000 tokens/day = 750,000 tokens/month = $2.25 just in input tokens. Add output tokens and you're at $5-8/month before you even notice.

Flat-rate removes the cognitive overhead of token budgeting. You stop asking "is this query worth making?" and start actually building.
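The back-of-envelope math above is easy to verify. A quick sketch (the per-token rate and usage figures are illustrative assumptions from this post, not quoted prices):

```javascript
// Rough monthly input-token cost at pay-per-token pricing.
// All figures are assumptions: ~$3 per million input tokens,
// 50 queries/day at ~500 tokens each, 30-day month.
const RATE_PER_MILLION = 3.0;
const queriesPerDay = 50;
const tokensPerQuery = 500;

const tokensPerMonth = queriesPerDay * tokensPerQuery * 30; // 750,000
const inputCost = (tokensPerMonth / 1_000_000) * RATE_PER_MILLION;

console.log(`${tokensPerMonth} input tokens/month ≈ $${inputCost.toFixed(2)}`);
// 750000 input tokens/month ≈ $2.25
```

Output tokens are billed at a higher rate, so the real monthly figure lands higher — which is the point of the $5-8 estimate above.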

Pricing comparison

Service         Price                        Token limits
ChatGPT Plus    $20/month                    Yes
Claude Pro      $20/month                    Yes
Anthropic API   ~$5-10/month (typical dev)   Usage-based
SimplyLouie     $2/month                     No

Get your API key

  1. Go to simplylouie.com/developers
  2. Sign up (7-day free trial, then $2/month)
  3. Find your API key in your dashboard

No SDK to install. Works with any language that can make HTTP requests.


Have you tried building with flat-rate APIs? What would you build if token costs weren't a concern? Drop a comment below.
