I’ve been building Xandai-CLI because I wanted AI help without leaving the terminal or wiring up cloud APIs.
The idea is simple: one CLI where I can run shell commands, edit files, and ask questions in the same place — using local models (Ollama / LM Studio).
Repo: https://github.com/XandAI-project/Xandai-CLI
What Xandai-CLI does
Xandai-CLI is an interactive terminal tool that mixes:
- normal shell commands
- natural language prompts
- file creation and editing
- optional multi-step “agent” behavior
You type at the xandai> prompt, and it figures out whether you meant a command, a question, or a task.
Examples:
xandai> git status
xandai> explain what this repo does
xandai> create a python script to parse this log file
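To give a feel for how this kind of input routing could work, here is a minimal sketch. This is a hypothetical heuristic for illustration, not Xandai-CLI's actual implementation: if the first word resolves to an executable on PATH, treat the line as a shell command; otherwise, hand it to the model as a prompt.

```python
import shutil

def route_input(line: str) -> str:
    """Classify a line typed at the prompt (illustrative heuristic only).

    If the first word is an executable found on PATH, treat the whole line
    as a shell command; otherwise treat it as a natural-language prompt.
    """
    words = line.strip().split()
    if words and shutil.which(words[0]):
        return "shell"
    return "prompt"
```

A real router would need more signals than this (quoting, pipes, explicit prefixes), but the basic split between "known command" and "everything else goes to the model" is the core idea.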
Everything runs locally if you want it to.
Why I built it
Most AI coding tools either:
- live in the browser, or
- require API keys and cloud calls, or
- feel bolted onto the terminal
I wanted something that felt closer to git or fzf: always there, fast, and unobtrusive.
Xandai-CLI is opinionated around developer workflows and local control.
Local models first
Xandai-CLI supports:
- Ollama
- LM Studio
You can auto-detect what’s running and start using it immediately. No accounts, no tokens.
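As a rough sketch of how local-backend detection could work, the snippet below probes the default local ports (Ollama commonly serves on 11434, LM Studio on 1234). This is an assumption-laden illustration, not necessarily how Xandai-CLI detects backends:

```python
import socket

def port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def detect_backends() -> list[str]:
    """Probe the default local ports for Ollama and LM Studio.

    Port numbers here are the commonly documented defaults; a user who
    changed them would need a different configuration.
    """
    candidates = {"ollama": 11434, "lmstudio": 1234}
    return [name for name, port in candidates.items()
            if port_open("127.0.0.1", port)]
```

A port probe alone can give false positives (anything could be listening there), so a more robust detector would also hit each server's HTTP API and check the response.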
Install & try it
pip install xandai-cli
xandai --auto-detect
From there, just use it like a normal shell — but with the option to ask for help or generate code when you need it.
Status
This is still evolving, and I’m actively iterating on it based on how I actually use it day-to-day.
If you try it and something feels off, or you have ideas, issues and PRs are welcome.