## TL;DR
If you're already paying $20-100/month for Claude Pro or Max, openclaw-claude-proxy (ocp) lets you use that subscription through any OpenAI-compatible app — no per-token billing, no API keys.
- 👉 GitHub: dtzp555-max/openclaw-claude-proxy
- ⭐ Supports: Claude Opus 4.6, Sonnet 4.6, Haiku 4.5
- 🔌 Compatible with: LangChain, Open WebUI, LlamaIndex, Cursor, Continue.dev, and anything speaking OpenAI
## The Problem
You have Claude Max ($100/mo). You want to use Opus 4.6 in your local agent setup.
Your options used to be:
- Use the official API → pay $15/M tokens on top of your subscription
- Manually copy-paste into claude.ai → not practical
## What ocp Does

```
Your app (OpenAI SDK / LangChain / curl)
        ↓  POST /v1/chat/completions
ocp running at localhost:3456
        ↓  CLI worker pool
claude CLI (your OAuth session / subscription)
        ↓
Anthropic API (billed to your subscription, not per-token)
```
## Cost Comparison
| Usage | Direct API | ocp (Pro $20/mo) | ocp (Max $100/mo) |
|---|---|---|---|
| 100 Opus requests (2K tokens each) | ~$3/run | $0 extra | $0 extra |
| Monthly heavy usage (500 requests) | $15+ | included | included |
| Multiple agents sharing | costs multiply per agent | one subscription | one subscription |
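The first table row is easy to sanity-check with back-of-envelope arithmetic. The per-token rate below is an assumed blended Opus price for illustration, not an official figure:

```python
# Rough cost check for "100 Opus requests (2K tokens each)".
requests = 100
tokens_per_request = 2_000               # prompt + completion combined
price_per_million = 15.0                 # $/M tokens (assumed Opus rate)

cost = requests * tokens_per_request / 1_000_000 * price_per_million
print(f"~${cost:.2f} per run")           # ~$3.00 per run
```

Through ocp, the same 200K tokens are absorbed by the flat subscription instead.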
## Setup (2 commands)

```shell
git clone https://github.com/dtzp555-max/openclaw-claude-proxy
cd openclaw-claude-proxy
node setup.mjs
```
That's it. The proxy:
- Starts on port 3456
- Auto-installs as a launchd service (macOS) or systemd service (Linux)
- Restarts automatically if it crashes
## Works with Any OpenAI Client

```python
# LangChain
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:3456/v1",
    api_key="not-needed",
    model="claude-opus-4-6",
)
```

```shell
# curl
curl http://localhost:3456/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{"model":"claude-opus-4-6","messages":[{"role":"user","content":"Hello"}]}'
```
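If you don't want an SDK dependency at all, the request body is plain OpenAI chat-completions JSON, so the Python standard library is enough. A stdlib-only sketch (the endpoint path follows the OpenAI convention shown in the curl example above):

```python
import json
import urllib.request

def build_request(prompt, model="claude-opus-4-6",
                  base_url="http://localhost:3456/v1"):
    """Build an OpenAI-style chat-completions request aimed at ocp."""
    payload = {"model": model,
               "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Hello")
# urllib.request.urlopen(req) sends it once the proxy is running
```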
## Honest Trade-offs
⚠️ Serial execution: The CLI worker pool processes one request at a time. This means higher latency than the direct API.
✅ Pre-warming: ocp pre-warms workers so the first real request doesn't pay cold-start cost.
✅ Best for: Personal tools, local agents, dev workflows where you're the only user.
❌ Not ideal for: High-concurrency production servers.
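Because the worker processes one request at a time, several local agents firing at once will just queue inside the proxy. One way to keep that queue visible on the client side is to serialize calls locally with a lock; this is a client-side sketch, not part of ocp:

```python
import threading

class SerialClient:
    """Serialize concurrent agents locally so they take turns,
    mirroring ocp's one-request-at-a-time worker."""

    def __init__(self, send):
        self._send = send             # any callable that talks to ocp
        self._lock = threading.Lock()

    def request(self, prompt):
        with self._lock:              # one in-flight request at a time
            return self._send(prompt)

client = SerialClient(lambda p: p.upper())  # stand-in for a real HTTP call
print(client.request("hello"))              # prints "HELLO"
```

Swap the lambda for a function that actually POSTs to `localhost:3456` and every thread sharing the client will wait its turn instead of piling up on the proxy.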
## v1.7.x Changelog

- v1.7.1: Auto-recovery after network drops; sanitizes the environment to prevent auth conflicts
- v1.7.0: Optional Bearer token auth via `PROXY_API_KEY`
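With v1.7.0's optional auth enabled, clients need to send the key as a Bearer token. A small sketch, assuming the client reads the same `PROXY_API_KEY` environment variable the server was started with:

```python
import os

def auth_headers():
    """Request headers for an ocp instance started with PROXY_API_KEY.
    Without the variable, no Authorization header is sent (auth stays optional)."""
    headers = {"Content-Type": "application/json"}
    key = os.environ.get("PROXY_API_KEY")
    if key:
        headers["Authorization"] = f"Bearer {key}"
    return headers

os.environ["PROXY_API_KEY"] = "my-secret"   # illustration only
print(auth_headers()["Authorization"])      # Bearer my-secret
```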
## Bonus: Share One Subscription Across Agents
If you run multiple OpenClaw agents (different bots, different personas), they can all hit the same ocp instance and share your single Claude subscription.
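Concretely, each agent differs only in its model and persona while pointing at the same base URL. The model ids and persona strings below are illustrative:

```python
# One proxy URL, many personas: every agent shares the single ocp instance
BASE_URL = "http://localhost:3456/v1"

agents = {
    "coder":  {"base_url": BASE_URL, "model": "claude-opus-4-6",
               "system": "You are a terse coding assistant."},
    "writer": {"base_url": BASE_URL, "model": "claude-sonnet-4-6",
               "system": "You are a friendly copywriter."},
}
```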
GitHub: https://github.com/dtzp555-max/openclaw-claude-proxy
Feedback and PRs welcome!