Helix AI Studio v2.1.0 ships with gemma4 support, 118 tests, and refreshed docs.
## What is it?
An all-in-one AI chat studio connecting 7 providers through one UI:
- Ollama — gemma4:31b, qwen3.5, any local model
- Claude API / OpenAI API / vLLM
- Claude Code CLI / Codex CLI / Gemini CLI
100% local-capable. Docker Compose ready.
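One way to picture a single UI in front of seven providers is a thin routing layer over a common interface. The sketch below is a hypothetical illustration of that pattern, not Helix's actual code: `ChatProvider`, `EchoProvider`, and `route` are invented names, and the echo provider stands in for real Ollama/Claude/OpenAI clients.

```python
from dataclasses import dataclass
from typing import Protocol


class ChatProvider(Protocol):
    """Minimal interface every backend must satisfy."""
    def complete(self, prompt: str) -> str: ...


@dataclass
class EchoProvider:
    """Stand-in backend; a real one would call Ollama, Claude, etc."""
    name: str

    def complete(self, prompt: str) -> str:
        return f"[{self.name}] {prompt}"


# Registry mapping provider ids to backends.
providers: dict[str, ChatProvider] = {
    "ollama": EchoProvider("ollama"),
    "claude-api": EchoProvider("claude-api"),
}


def route(provider_id: str, prompt: str) -> str:
    """Dispatch a prompt to the selected provider."""
    return providers[provider_id].complete(prompt)


print(route("ollama", "hello"))  # → [ollama] hello
```

Because every backend satisfies the same protocol, adding an eighth provider is one registry entry, not a UI change.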
## Key Features
- WebSocket streaming chat
- RAG knowledge base (hybrid search + reranker)
- MCP tool integration
- Mem0 shared memory
- Pipeline (Plan → Execute → Verify)
- CrewAI multi-agent
- Dark theme, i18n (EN/JP)
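"Hybrid search" in a RAG pipeline typically fuses a keyword ranking with a vector-similarity ranking before reranking. As a minimal sketch of the general idea (reciprocal rank fusion, not Helix's actual implementation):

```python
def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Reciprocal rank fusion: merge several ranked doc-id lists.

    Each document scores 1 / (k + rank + 1) per list it appears in;
    documents ranked well by both retrievers float to the top.
    """
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=lambda d: scores[d], reverse=True)


keyword_hits = ["d1", "d3", "d2"]   # e.g. BM25 results
vector_hits = ["d2", "d1", "d4"]    # e.g. embedding-similarity results
print(rrf([keyword_hits, vector_hits]))  # → ['d1', 'd2', 'd3', 'd4']
```

Here `d1` wins because it ranks highly in both lists; a cross-encoder reranker would then rescore only this short fused list.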
## What Makes It Different
The only AI chat studio with CLI integration for Claude Code, Codex, and Gemini CLI. Most alternatives (Open WebUI, LobeChat) only support API models.
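Driving a terminal coding agent from a web UI usually comes down to spawning the CLI as a subprocess and capturing its output. The snippet below is one plausible pattern, not Helix's actual integration code; `cat` stands in for a real CLI like `claude`, `codex`, or `gemini`.

```python
import subprocess


def run_cli(cmd: list[str], prompt: str) -> str:
    """Spawn a CLI tool, feed the prompt on stdin, return its stdout."""
    result = subprocess.run(
        cmd,
        input=prompt,
        capture_output=True,
        text=True,
        timeout=60,  # don't let a hung CLI block the chat
    )
    result.check_returncode()
    return result.stdout.strip()


# `cat` simply echoes stdin, standing in for a real coding CLI:
print(run_cli(["cat"], "refactor this function"))  # → refactor this function
```

A production version would stream output line by line over the WebSocket instead of waiting for the process to exit.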
## v2.1.0 Changes
- gemma4:31b is now the default model (released April 2; 89.2% on AIME)
- 118 pytest tests added (up from zero)
- README rewritten with competitive positioning
## Quick Start
```bash
git clone https://github.com/tsunamayo7/helix-ai-studio.git
cd helix-ai-studio
uv sync && uv run python run.py
```