I built a CLI that lets you chat with multiple LLMs (GPT, Claude, Gemini) in one terminal.
What if you could host a roundtable meeting with multiple AI models — all in your terminal?
That's exactly what krew does. It lets you run GPT, Claude, Gemini (and any OpenAI-compatible provider) in a single terminal session, with shared context so agents can see and build on each other's answers.
## Quick Start

Get started in 3 commands:

```bash
npm install -g @zhing2026/krew
krew config init
krew
```

The config wizard walks you through setting up providers and agents — no manual config file editing needed.
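If you're curious what the wizard produces, the result is a TOML config along these lines. This is an illustrative sketch only — the field names and file location here are my assumptions, not krew's exact schema; run `krew config list` to see the real thing.

```toml
# Sketch of a generated config — illustrative field names, not the exact schema.
[[providers]]
name = "anthropic"
api_key_env = "ANTHROPIC_API_KEY"   # key is read from this environment variable

[[agents]]
name = "opus"
provider = "anthropic"
model = "claude-opus-4-6"
```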
## @ Addressing — Talk to Any Agent

Use `@` to control who you're talking to:

```text
› @all What's the best way to handle errors in Rust?
  (all agents respond in order)

› @opus Can you elaborate on the Result type?
  (only Claude responds)

› Tell me more
  (continues with the last respondent)
```
## #Whisper — Private Messages

Want to ask one agent to privately evaluate another's answer?

```text
› @all Propose an architecture for a chat app
  (all agents answer publicly)

› #opus What are the weaknesses in GPT's proposal?
  (only opus sees this — other agents see a placeholder)
```

You can even create private whisper groups:

```text
› #opus #gemini Discuss the tradeoffs between these approaches
  (only opus and gemini see each other's replies)
```
## AI-to-AI Routing

When an agent's reply @mentions another agent, that agent is automatically dispatched. You can sit back and watch them collaborate (or argue).
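In practice a routed exchange looks something like this. The replies below are mocked up for illustration — only the @mention-triggers-dispatch behavior comes from the tool itself:

```text
› @opus Design a rate limiter, then ask gemini to sanity-check it
opus   ▸ Here's a token-bucket design: ... @gemini, do you see any failure modes?
gemini ▸ (auto-dispatched) One concern: the refill interval can drift under load ...
```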
## Built-in Tools

Agents aren't just chatbots — they can take action:

| Tool | Description |
|---|---|
| `read_file` | Read file content |
| `write_file` | Create or overwrite files |
| `edit_file` | Search-and-replace editing |
| `shell` | Execute shell commands |
| `glob` | File pattern matching |
| `grep` | Content search with regex |
| `fetch_url` | Fetch and parse web pages |
| `activate_skill` | Load specialized skill instructions |

All file operations are sandboxed to your project directory.
## MCP Integration

Extend agent capabilities via Model Context Protocol servers — both stdio and HTTP transports are supported:

```toml
[[mcp_servers]]
name = "filesystem"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "."]
```
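For an HTTP-transport server, the entry would point at a URL instead of a command. I'm assuming a `url` field here, which I haven't verified against krew's actual config schema:

```toml
# Assumed shape for an HTTP MCP server — check the krew docs for the real key names.
[[mcp_servers]]
name = "remote-docs"
url = "https://example.com/mcp"
```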
## Skill System

Define reusable skill packages with a SKILL.md file. Agents automatically discover available skills and activate them when needed:

```text
my-skill/
├── SKILL.md      # Skill definition (name, description, instructions)
├── scripts/      # Helper scripts
└── references/   # Reference materials
```
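A SKILL.md might look roughly like this. The frontmatter fields are modeled on the name/description/instructions trio mentioned above — the exact format is an assumption, not krew's documented schema:

```markdown
---
name: release-notes
description: Draft release notes from recent commits
---

When this skill is active:
1. Collect changes since the last tag with `git log`.
2. Group them into Features / Fixes / Chores.
3. Output the result as Markdown release notes.
```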
## Custom Slash Commands

Create your own commands as Markdown files with argument substitution and bash preprocessing:

```markdown
---
description: Review code for issues
argument-hint: <file_path>
---

Please review the following file: $ARGUMENTS

Here are the recent changes:

!`git diff --cached`
```

Save as `.krew/commands/review.md`, then use `/review src/main.rs` in your session.
## Session Management

- Persistence — Every message is saved in real time. Crash? No problem.
- Resume — Pick up any previous conversation with `/resume`
- Rewind — Fork from any point in history with `/rewind`
- Auto-compact — Automatic context compression as the token limit approaches
## Prompt Mode for CI/CD

Run one-shot prompts from scripts:

```bash
# Code review in CI
git diff HEAD~1 | krew -p "@opus review these changes for bugs"

# JSON output for parsing
krew -p "@all hello" --format json
```
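Wired into a pipeline, that could look like the following hypothetical GitHub Actions step. The step name and the `ANTHROPIC_API_KEY` variable are my assumptions — use whatever secrets your providers require:

```yaml
# Hypothetical CI step — adapt names and secrets to your own pipeline.
- name: AI code review
  run: |
    npm install -g @zhing2026/krew
    git diff HEAD~1 | krew -p "@opus review these changes for bugs"
  env:
    ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
```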
## More Features

- Streaming output — Markdown rendering with syntax highlighting and per-agent color coding
- Thinking/reasoning — Display the model's thinking process (configurable: low/medium/high)
- Web search — Provider-native web search (OpenAI, Anthropic, Gemini)
- Per-agent sampling — Configure temperature, top_p, and max_tokens per agent
- Project instructions — `AGENTS.md` files auto-injected into system prompts
- Config management — `krew config init/add/del/list/doctor` for full config CRUD
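Per-agent sampling presumably lives alongside the agent definition, roughly like this. The key names below are illustrative guesses at the schema, not copied from krew's docs:

```toml
# Illustrative per-agent sampling config — verify key names with `krew config doctor`.
[[agents]]
name = "gpt"
provider = "openai"
model = "gpt-5.2"
temperature = 0.2    # lower = more deterministic
top_p = 0.9
max_tokens = 4096
```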
## Supported Providers

| Provider | Examples |
|---|---|
| OpenAI | GPT-5.2 |
| Anthropic | Claude Opus 4.6, Sonnet 4.6 |
| Google | Gemini 3.1 Pro (+ Vertex AI) |
| OpenAI-Compatible | OpenRouter, LiteLLM, any compatible API |
## Built with Rust

Single static binary. Zero runtime dependencies. Five platform targets (Windows, Linux x64/arm64, macOS x64/arm64).

GitHub: https://github.com/ZHing2006/krew-cli
Install: `npm install -g @zhing2026/krew`
Config: `krew config init`
Run: `krew`
Feedback and contributions welcome!
