Xoul — An Open-Source AI Agent That Runs Locally
Introducing Xoul, a personal assistant agent powered by local LLMs and virtual machine isolation.
What Is Xoul?
Xoul is a personal AI agent. It's not a chatbot — it manages files, sends emails, browses the web, and runs code at the OS level. All actions run inside a QEMU virtual machine, keeping the host system untouched. When using a local LLM, personal data never leaves the machine.
Key Features
- 18 built-in tools — file management, email, web search, code execution, calendar, and more
- Personas & Code Snippets — switch agent roles or run Python snippets shared by the community
- Workflows — schedule repetitive tasks (news digests, server checks, email triage) as multi-step automation templates (see the sketch after this list)
- AI Arena — a playground where agents discuss topics and play social deduction games
- Host PC Control — limited host interaction including browser launch and file operations
- Multiple Clients — Desktop (PyQt6), Telegram, Discord, Slack, and CLI
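To make the Workflows feature concrete, here is a rough sketch of what a scheduled multi-step template could look like as data. The field names, step types, and cron syntax are assumptions for illustration, not Xoul's actual schema.

```python
# Hypothetical sketch of a multi-step workflow template, in the spirit of
# Xoul's Workflows feature. Every field name and step type below is an
# illustrative assumption, not Xoul's actual format.
news_digest = {
    "name": "morning_news_digest",
    "schedule": "0 7 * * *",  # cron-style: every day at 07:00
    "steps": [
        {"tool": "web_search", "args": {"query": "AI news today"}},
        {"tool": "summarize", "args": {"max_words": 200}},
        {"tool": "send_email", "args": {"to": "me@example.com",
                                        "subject": "Morning digest"}},
    ],
}
```

Each step feeds its output to the next, so a single template covers the whole search-summarize-deliver loop without manual prompting.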
Architecture
The Xoul agent runs inside a QEMU virtual machine. LLM inference is handled locally on the GPU via Ollama, while the desktop app serves as the host-side UI. VM isolation ensures the host system stays safe regardless of what the agent does.
Beyond local LLMs, Xoul also supports commercial APIs (Claude, GPT-5, Gemini, DeepSeek, Grok, Mistral) and external OpenAI-compatible servers (vLLM, LM Studio, etc.).
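Because all of these backends speak HTTP, switching between them is mostly a matter of changing a base URL and a model name. Below is a minimal Python sketch that queries a local Ollama server and then a generic OpenAI-compatible endpoint. The function names are mine and the model tag is only an example; this is not Xoul's internal client code, just the standard request shapes the two APIs accept.

```python
import requests

# Local inference via Ollama's REST API (default port 11434).
def ask_ollama(prompt: str, model: str = "qwen2.5:3b") -> str:
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    r.raise_for_status()
    return r.json()["response"]

# The same question against any OpenAI-compatible server (vLLM, LM Studio, ...).
def ask_openai_compatible(prompt: str, base_url: str, model: str,
                          api_key: str = "-") -> str:
    r = requests.post(
        f"{base_url}/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={"model": model,
              "messages": [{"role": "user", "content": prompt}]},
        timeout=120,
    )
    r.raise_for_status()
    return r.json()["choices"][0]["message"]["content"]
```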
Supported Models
For local execution, models are automatically recommended based on available VRAM:
| Model | VRAM |
|---|---|
| Nemotron-3-Nano 4B (Q8) | ~5 GB |
| Nemotron-3-Nano 4B (BF16) | ~8 GB |
| GPT-oss 20B | ~13 GB |
| Nemotron-Cascade-2 30B | ~20 GB |
BGE-M3 (embedding) and Qwen 2.5 3B (summarization, CPU-only) are also installed automatically.
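The recommendation rule the table implies is straightforward: pick the largest model whose footprint fits in the available VRAM. A minimal sketch of that logic, with thresholds copied from the table above (the helper itself is illustrative, not Xoul's code):

```python
# Pick the largest local model that fits the detected VRAM budget.
# Thresholds mirror the table above; the helper is an illustration only.
MODELS = [  # (name, approx. VRAM needed in GB), largest first
    ("Nemotron-Cascade-2 30B", 20),
    ("GPT-oss 20B", 13),
    ("Nemotron-3-Nano 4B (BF16)", 8),
    ("Nemotron-3-Nano 4B (Q8)", 5),
]

def recommend_model(vram_gb: float) -> str | None:
    for name, needed in MODELS:
        if vram_gb >= needed:
            return name
    return None  # below ~5 GB, no local model is recommended

print(recommend_model(16))  # -> "GPT-oss 20B"
```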
System Requirements
| Component | Minimum | Recommended |
|---|---|---|
| CPU | x86-64, 8 cores | — |
| RAM | 8 GB | 16 GB+ |
| GPU | NVIDIA 30-series, 8 GB VRAM | NVIDIA 40-series, 16 GB+ VRAM |
| OS | Windows 11 (10 experimental) | — |
| Disk | 20 GB free | — |
Installation
Quick Start
- Download the release file
- Extract `xoul_rel.zip`
- Run `install.bat` inside the extracted folder
`install.bat` handles file placement, dependency installation, and configuration automatically. Python 3.12, Ollama, and QEMU are installed as needed. An interactive setup walks through language selection, LLM model, VM configuration, user profile, and optional service integrations (Gmail, Tavily, Telegram, etc.).
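To make the setup flow concrete, the sketch below shows the kind of choices the wizard collects, expressed as a plain Python dict. Every key and value here is a hypothetical illustration; Xoul's actual configuration format is not documented in this post.

```python
# Hypothetical illustration of the choices the interactive setup walks
# through; none of these keys are Xoul's actual configuration format.
setup_choices = {
    "language": "en",
    "llm": {"backend": "ollama", "model": "Nemotron-3-Nano 4B (Q8)"},
    "vm": {"ram_gb": 4, "disk_gb": 20},
    "user_profile": {"name": "Alice", "timezone": "UTC"},
    "integrations": {"gmail": False, "tavily": False, "telegram": False},
}
```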
Install from Source
```powershell
git clone https://github.com/xoul-project/xoul.git
cd xoul
.\scripts\setup_env.ps1
```
Once setup completes, the Desktop App launches automatically. After that, you can start it with `c:\xoul\desktop\xoul.bat`.
Community
Through the Xoul Store, you can import workflows, personas, and code snippets created by other users with one click. You can also publish your own.
License
Released under the MIT License.
Links
- Website: https://www.xoulai.net/
- GitHub: https://github.com/xoul-project/xoul
- Discussions: https://github.com/xoul-project/xoul/discussions
