Aider is one of the best AI pair programming tools, but by default it connects directly to OpenAI or Anthropic. What if you want to route through a custom API endpoint — for cost savings, failover, or unified billing?
Here's how to set it up.
## The Problem
You're using Aider and want to:
- Access multiple model providers (Claude, GPT, Gemini) without juggling API keys
- Route through a gateway for reliability or cost optimization
- Use a self-hosted or third-party API endpoint
## Solution: Environment Variables
Aider uses the OpenAI SDK under the hood, which means it respects `OPENAI_API_BASE` for custom endpoints.
### Step 1: Set Environment Variables

```shell
export OPENAI_API_BASE=https://futurmix.ai/v1
export OPENAI_API_KEY=your-gateway-key
```
### Step 2: Run Aider with Any Model

```shell
# Use Claude Sonnet 4.5
aider --model claude-sonnet-4-5-20250929

# Use GPT-5.4
aider --model gpt-5.4

# Use Gemini 2.5 Pro
aider --model gemini-2.5-pro
```
That's it. One key, one endpoint, any model.
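You can sanity-check the endpoint before pointing Aider at it. A minimal sketch, assuming the gateway exposes the standard OpenAI-compatible `/models` route and that the variables from Step 1 are exported:

```shell
# Optional sanity check: ask the gateway which models it exposes.
# Assumes the OpenAI-compatible /models route and the Step 1 variables.
curl -s -H "Authorization: Bearer $OPENAI_API_KEY" \
     "$OPENAI_API_BASE/models" \
  || echo "could not reach $OPENAI_API_BASE"
```

If the response is a JSON list of model IDs, any of them should work with `aider --model`.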
### Step 3: Make It Permanent

Add to your `.bashrc` or `.zshrc`:

```shell
# AI Gateway config
export OPENAI_API_BASE=https://futurmix.ai/v1
export OPENAI_API_KEY=your-gateway-key
```
Or create a `.env` file in your project:

```
OPENAI_API_BASE=https://futurmix.ai/v1
OPENAI_API_KEY=your-gateway-key
```
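If other tools in the same shell session need these variables too, the `.env` file can be loaded with the shell's auto-export mode. A minimal sketch (Aider itself can also pick up a project `.env` without this step):

```shell
# Minimal sketch: create the .env, then load it into the current shell.
cat > .env <<'EOF'
OPENAI_API_BASE=https://futurmix.ai/v1
OPENAI_API_KEY=your-gateway-key
EOF

set -a      # auto-export every variable defined while this mode is on
. ./.env    # defines OPENAI_API_BASE and OPENAI_API_KEY
set +a      # back to normal

echo "$OPENAI_API_BASE"
```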
## Using .aider.conf.yml
You can also configure this in Aider's config file:
```yaml
# ~/.aider.conf.yml
openai-api-base: https://futurmix.ai/v1
openai-api-key: your-gateway-key
model: claude-sonnet-4-5-20250929
```
## Why Use a Gateway?
| Benefit | Direct API | Through Gateway |
|---|---|---|
| API Keys needed | One per provider | One total |
| Model switching | Change key + SDK | Change model param |
| Failover | Manual | Automatic |
| Billing | Per-provider | Unified |
## Troubleshooting
**"Model not found" error:** make sure you're using the full model ID (e.g., `claude-sonnet-4-5-20250929`, not `claude-sonnet`).

**Authentication errors:** the gateway should accept Bearer tokens in the standard `Authorization` header. Check that `OPENAI_API_KEY` is set correctly.

**Slow responses:** gateway routing adds minimal latency (typically under 50 ms). If you're seeing significant delays, check your network connection to the gateway endpoint.
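The last two issues can be narrowed down with a single direct request to the gateway. A sketch, assuming the OpenAI-compatible `/models` route and the variables from Step 1:

```shell
# One request, two diagnostics: HTTP status (auth) and total time (latency).
# Assumes OPENAI_API_BASE and OPENAI_API_KEY from Step 1 are exported.
curl -s -o /dev/null --max-time 10 \
     -w 'status: %{http_code}  total: %{time_total}s\n' \
     -H "Authorization: Bearer $OPENAI_API_KEY" \
     "$OPENAI_API_BASE/models" \
  || echo "could not reach $OPENAI_API_BASE"
# 200: key accepted. 401/403: key problem. 404: wrong base URL.
```

A healthy status with a slow `total` time points at the network or the gateway; a connection failure points at the base URL.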
## Other Tools That Work the Same Way
This same `OPENAI_API_BASE` pattern works with:

- Cursor — Settings → Models → OpenAI API Base
- Continue — `config.json` → provider base URL
- Roo Code — OpenAI Compatible provider settings
- Claude Code — `ANTHROPIC_BASE_URL` environment variable
Any tool built on the OpenAI SDK supports custom base URLs.
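Claude Code is the one exception in the list above, since it speaks the Anthropic API rather than the OpenAI one. A sketch, assuming your gateway also exposes an Anthropic-compatible route (whether the URL needs a `/v1` suffix is gateway-specific):

```shell
# Claude Code reads ANTHROPIC_BASE_URL / ANTHROPIC_API_KEY instead of the
# OPENAI_* pair. Check your gateway's docs for the exact base path.
export ANTHROPIC_BASE_URL=https://futurmix.ai
export ANTHROPIC_API_KEY=your-gateway-key
```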
Using a custom API endpoint with Aider? Let me know your setup in the comments.