dtzp555-max
Use your Claude Pro/Max subscription to power OpenClaw, OpenCode, Cline, and any OpenAI-compatible IDE — $0 API cost

You're already paying for Claude. Why pay again for the API?

If you have a Claude Pro ($20/mo) or Max ($100-200/mo) subscription, you already have access to Opus, Sonnet, and Haiku. But most AI-powered IDEs and agent frameworks want an API key — which means paying a second time at $15/MTok for Opus.

OCP (OpenClaw Claude Proxy) solves this. It exposes your Claude subscription as a standard OpenAI-compatible API on localhost. Any tool that speaks the OpenAI protocol can use it:

  • OpenClaw — multi-agent orchestration
  • OpenCode — AI coding assistant
  • Cline — VS Code AI extension
  • Continue.dev — open-source AI assistant
  • Aider — AI pair programming
  • Any tool that accepts an OpenAI base URL

One subscription. Multiple instances. All IDEs. $0 extra.

How It Works

Your IDE / Agent → OCP (localhost:3456) → claude CLI → Anthropic (via your subscription)

OCP translates OpenAI-compatible /v1/chat/completions requests into claude -p CLI calls. Anthropic sees normal Claude Code usage under your existing subscription. No separate API key needed.

# Point any OpenAI-compatible tool at OCP
export OPENAI_BASE_URL=http://127.0.0.1:3456/v1
export OPENAI_API_KEY=dummy  # OCP doesn't need a real key

Setup in 60 Seconds

# Prerequisites: Node.js 18+, Claude CLI installed & authenticated
git clone https://github.com/dtzp555-max/openclaw-claude-proxy
cd openclaw-claude-proxy
node setup.mjs    # auto-configures + installs as system service

That's it. The proxy runs as a daemon (launchd on macOS, systemd on Linux), auto-starts on boot, and auto-recovers from crashes. Your IDE doesn't know the difference — it just sees an OpenAI API.

Monitor Your Usage in Real Time

With a Max subscription shared across multiple IDEs and agents, you need to know where you stand. The built-in ocp CLI gives you instant visibility:

$ ocp usage
Plan Usage Limits
─────────────────────────────────────
  Current session       3% used
                      Resets in 4h 32m  (Tue, Mar 24, 10:00 PM)

  Weekly (all models)   3% used
                      Resets in 6d 6h  (Tue, Mar 31, 12:00 AM)

  Extra usage         off

Model Stats
Model          Req   OK  Er  AvgT  MaxT  AvgP  MaxP
──────────────────────────────────────────────────────
haiku            1    1   0    6s    6s     0K    0K
opus             2    2   0   20s   26s   42K   43K
sonnet           2    2   0   24s   24s   41K   41K
Total            5

Proxy: up 0h 37m | 5 reqs | 0 err | 0 timeout

See your session utilization, weekly limits, per-model request stats, response times, and prompt sizes — all at a glance.

If you use OpenClaw with Telegram or Discord, these commands also work as native slash commands: /ocp usage, /ocp status, /ocp settings.

Tune Settings Without Restart

Running Opus with large prompts? Bump the timeout. Too many concurrent agents? Cap the limit. All at runtime:

$ ocp settings maxPromptChars 200000
✓ maxPromptChars = 200000

$ ocp settings tiers.opus.base 180000
✓ tiers.opus.base = 180000

Tunable parameters: timeout, firstByteTimeout, maxConcurrent, sessionTTL, maxPromptChars, and per-model timeout tiers.
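Dotted keys like `tiers.opus.base` suggest a nested settings object that the CLI updates in place. A minimal sketch of how such a setter might resolve a dotted path (illustrative only, not OCP's implementation):

```javascript
// Illustrative sketch: apply a dotted-path setting such as
// `ocp settings tiers.opus.base 180000` to a nested config object.
// Not OCP's actual code.
function setByPath(config, path, value) {
  const keys = path.split(".");
  const last = keys.pop();
  let node = config;
  for (const k of keys) {
    // Create intermediate objects as needed while walking the path.
    if (typeof node[k] !== "object" || node[k] === null) node[k] = {};
    node = node[k];
  }
  node[last] = value;
  return config;
}

// Example: bump the Opus base timeout at runtime.
const settings = { maxPromptChars: 150000, tiers: { opus: { base: 150000 } } };
setByPath(settings, "tiers.opus.base", 180000);
```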

Available Models

Model ID            Maps to             Notes
──────────────────────────────────────────────────────────
claude-opus-4-6     Claude Opus 4.6     Most capable
claude-sonnet-4-6   Claude Sonnet 4.6   Best speed/quality balance
claude-haiku-4      Claude Haiku 4.5    Fastest, lightweight tasks
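If you select models programmatically, a small lookup table mirroring the mapping above keeps ids honest. The mapping comes from the table; the resolver helper itself is just an illustration:

```javascript
// Model ids exposed by OCP, mapped to the underlying Claude models
// (mirrors the table above; the resolver function is illustrative).
const MODEL_MAP = {
  "claude-opus-4-6": "Claude Opus 4.6",
  "claude-sonnet-4-6": "Claude Sonnet 4.6",
  "claude-haiku-4": "Claude Haiku 4.5",
};

// Fail fast on ids the proxy doesn't advertise.
function resolveModel(id) {
  const name = MODEL_MAP[id];
  if (!name) throw new Error(`Unknown model id: ${id}`);
  return name;
}
```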

All OCP Commands

ocp usage              Session/weekly limits & model stats
ocp status             Quick health overview
ocp health             Full proxy diagnostics
ocp settings           View all tunable parameters
ocp settings <k> <v>   Change a setting at runtime
ocp logs [N] [level]   Recent proxy logs
ocp models             List available models
ocp sessions           Active CLI sessions
ocp clear              Clear all sessions
ocp restart            Restart proxy or gateway

Multi-Instance Sharing

The real power: one proxy, multiple consumers. I run 6 OpenClaw agents + Claude Code interactive sessions, all hitting the same OCP instance on localhost:3456. The proxy handles concurrency (default: 8 parallel requests), session management, and adaptive timeouts per model.

OpenClaw Agent 1 (Sonnet) ──┐
OpenClaw Agent 2 (Opus)  ───┤
OpenCode (Sonnet)        ───┼──→ OCP :3456 ──→ claude CLI ──→ Your subscription
Cline (Haiku)            ───┤
Aider (Sonnet)           ───┘
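The concurrency cap can be pictured as a counting semaphore sitting in front of the CLI. This is a sketch of the idea behind a default-8 limit, not OCP's actual implementation:

```javascript
// Sketch: a counting semaphore that caps concurrent CLI calls,
// as a default limit of 8 parallel requests would.
// Not OCP's actual implementation.
class Semaphore {
  constructor(limit) {
    this.limit = limit;
    this.active = 0;
    this.queue = [];
  }
  async acquire() {
    if (this.active < this.limit) {
      this.active++;
      return;
    }
    // At capacity: park this caller until someone releases.
    await new Promise((resolve) => this.queue.push(resolve));
    this.active++;
  }
  release() {
    this.active--;
    const next = this.queue.shift();
    if (next) next();
  }
}

const gate = new Semaphore(8); // default: 8 parallel requests

// Wrap any async handler so excess requests wait their turn.
async function guarded(fn) {
  await gate.acquire();
  try {
    return await fn();
  } finally {
    gate.release();
  }
}
```

Consumers never see the queueing; a ninth simultaneous request simply waits until one of the eight in flight finishes.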

All sharing the same Max subscription. Check ocp usage anytime to see how much headroom you have.

Built-in Safety

  • Prompt truncation — auto-caps prompts at 150K chars to prevent runaway context from eating your quota
  • Adaptive timeouts — Opus gets 150s, Sonnet 120s, Haiku 45s base first-byte timeout, scaling with prompt size
  • Localhost only — binds to 127.0.0.1, not exposed to the network
  • Optional Bearer auth — set PROXY_API_KEY if you want token protection
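Putting the first two safeguards together, the effective first-byte timeout might be computed along these lines. The base numbers and the 150K cap come from the list above; the specific scaling rule (here, +1s per 10K prompt characters) is an assumption for illustration:

```javascript
// Sketch of the truncation + adaptive-timeout safeguards described above.
// Base timeouts and the 150K cap come from the list; the scaling rule
// (+1s per 10K prompt chars) is assumed, not OCP's actual formula.
const MAX_PROMPT_CHARS = 150_000;

const BASE_FIRST_BYTE_MS = {
  opus: 150_000,   // 150s
  sonnet: 120_000, // 120s
  haiku: 45_000,   // 45s
};

// Cap runaway prompts before they reach the CLI.
function truncatePrompt(prompt) {
  return prompt.length > MAX_PROMPT_CHARS
    ? prompt.slice(0, MAX_PROMPT_CHARS)
    : prompt;
}

// Scale the base timeout with prompt size.
function firstByteTimeout(model, promptChars) {
  const base = BASE_FIRST_BYTE_MS[model] ?? BASE_FIRST_BYTE_MS.sonnet;
  return base + Math.floor(promptChars / 10_000) * 1_000;
}
```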

Try It

git clone https://github.com/dtzp555-max/openclaw-claude-proxy
cd openclaw-claude-proxy
node setup.mjs
ln -sf $(pwd)/ocp /usr/local/bin/ocp
ocp usage   # see your subscription status

GitHub: dtzp555-max/openclaw-claude-proxy

Requirements: Node.js 18+, Claude CLI authenticated (claude login)


How much are you spending on AI API keys across different tools? I'm curious if others are looking for ways to consolidate.
