WELCOME Garudian!!!
Most AI agent frameworks share the same two problems: they're heavy to deploy, and they forget everything the moment a session ends.
Garudust takes a different approach. It's a self-hostable AI agent runtime written entirely in Rust — a single ~10 MB binary that starts in under 20 milliseconds, remembers what you teach it across sessions, and gets smarter with every conversation.
No Python runtime. No Docker required for local use. No cloud dependency.
Current release: v0.2.8 — released 12 May 2025. The project moved from v0.1.0 to v0.2.8 in under two weeks, so it's moving fast.
What Makes Garudust Different?
There's no shortage of AI agent frameworks. Here's why Garudust stands out:
| Garudust | Python-based agents | |
|---|---|---|
| Binary size | ~10 MB | 100–500 MB (with deps) |
| Cold start | < 20 ms | 2–10+ seconds |
| Memory | Persistent across sessions | Usually in-memory only |
| Self-improvement | Built-in skill learning loop | Manual, if at all |
| Multi-platform | Telegram, Discord, Slack, LINE, WhatsApp, Matrix, HTTP | Varies |
| LLM backends | Any OpenAI-compatible + Bedrock | Varies |
The self-improvement loop is what makes it genuinely different. When Garudust discovers a repeatable multi-step workflow during a session, it writes a skill file automatically. Next time you ask for the same thing, it already knows how to do it.
Architecture at a Glance
Garudust is a multi-crate Rust workspace. Every concern is its own crate:
crates/
garudust-core Shared traits & types — zero I/O
garudust-transport LLM adapters (Anthropic, OpenRouter, Bedrock, Ollama, vLLM, …)
garudust-tools Tool registry + built-in toolsets
garudust-memory FileMemoryStore (markdown) + SessionDb (SQLite + FTS5)
garudust-agent Agent run loop, context compressor, prompt builder
garudust-platforms Telegram, Discord, Slack, LINE, WhatsApp, Matrix, Webhook
garudust-cron Cron scheduler
garudust-gateway axum HTTP gateway — /chat, /chat/stream, /chat/ws, /metrics
bin/
garudust CLI: interactive TUI, one-shot, setup, doctor, config
garudust-server Headless: all platforms + HTTP + cron in one process
This composable design means you can add a new tool, platform adapter, or LLM transport without touching anything outside the relevant crate. Most contributions are under 100 lines.
Getting Started in 3 Minutes
Install
Download a pre-built binary from GitHub Releases — no Rust required:
tar -xzf garudust-*-x86_64-unknown-linux-musl.tar.gz
sudo mv garudust garudust-server /usr/local/bin/
Or install directly from crates.io:
cargo install garudust garudust-server
Or build from source (requires Rust 1.75+):
git clone https://github.com/garudust-org/garudust-agent
cd garudust-agent
cargo build --release
Configure
Run the setup wizard — it walks you through provider selection and saves your API key:
garudust setup
Garudust supports: Anthropic, OpenRouter, AWS Bedrock, Ollama, vLLM, or any OpenAI-compatible endpoint. Swap providers with a single env var — no code changes needed.
The wizard pre-fills existing values and masks secrets. Press Enter to keep a value unchanged — a small detail that makes re-configuration much less painful.
Chat
garudust # interactive TUI
Or run a one-shot task:
garudust "summarise the git log from the last 7 days into a changelog"
garudust --model anthropic/claude-opus-4-7 "review this PR for security issues"
Persistent Memory
This is the feature that changes how you work with the agent day to day.
Garudust automatically saves facts to ~/.garudust/memory/ — preferences, project conventions, corrections you make:
You: always format JSON with 2-space indent
Agent: Got it — I'll use 2-space indent for JSON from now on.
The next session starts with that preference already loaded. You never repeat yourself.
A built-in nudge fires every few iterations during long tasks, prompting the agent to persist new facts before the session ends:
# ~/.garudust/config.yaml
nudge_interval: 5 # inject memory reminder every 5 LLM iterations (0 = off)
Memory is also protected from prompt injection — recalled facts are wrapped in <untrusted_memory> tags so the model treats them as data, not instructions.
What gets saved:
- Preferences — output format, language, tone, tool choices
- Project details — paths, configs, conventions, known quirks
- Corrections — anything you tell the agent to stop doing
What does not get saved: session logs, task progress, one-off details. Only facts that will matter in future sessions.
The Skill System
Skills are reusable instruction sets stored as plain Markdown files in ~/.garudust/skills/. They're hot-reloaded on every call — edit a file and the next message picks up the change immediately.
~/.garudust/skills/
git-workflow/SKILL.md
daily-standup/SKILL.md
rust-code-review/SKILL.md
Skills are created automatically
When Garudust discovers a multi-step workflow, it writes the skill itself:
You: write a skill for reviewing Rust PRs
Agent: [calls write_skill]
Saved to ~/.garudust/skills/rust-code-review/SKILL.md
Skills stay up to date
The skill reflection pipeline (v0.2.0) automatically updates skills after complex tasks. If steps are outdated or wrong, the agent patches the file — no manual maintenance required.
Skill-level tool permissions (v0.2.1)
Each skill declares exactly which tools it's allowed to use via frontmatter:
---
name: git-workflow
description: Opinionated Git commit and PR workflow
version: 1.0.0
tools: [terminal, read_file, write_file]
---
Always write conventional commits. Run tests before pushing.
Open a draft PR first, then mark ready when CI is green.
This limits the blast radius of any skill — it can't call tools it hasn't declared.
Install skills from the hub (v0.2.5)
garudust skill install facebook-workflow
garudust skill list
garudust skill uninstall facebook-workflow
Skills pull from garudust-hub and install their required tools automatically. If the tool needs a runtime (python3, node, etc.) that isn't on PATH, you get a warning immediately at install time — not a silent failure at run time.
Multi-Platform Gateway
garudust-server runs HTTP, all platform adapters, and cron jobs in a single process. Set the relevant env vars and start the server — every adapter runs together.
| Platform | Required env var(s) |
|---|---|
| Telegram | TELEGRAM_TOKEN |
| Discord | DISCORD_TOKEN |
| Slack |
SLACK_BOT_TOKEN + SLACK_APP_TOKEN
|
| LINE |
LINE_CHANNEL_SECRET + LINE_CHANNEL_ACCESS_TOKEN
|
| Meta Cloud API credentials | |
| Matrix |
MATRIX_HOMESERVER + MATRIX_USER + MATRIX_PASSWORD
|
| HTTP | Always-on at POST /chat — no token needed |
TELEGRAM_TOKEN=123:ABC \
LINE_CHANNEL_SECRET=abc... \
garudust-server --anthropic-key sk-ant-...
One binary, every channel.
HTTP API
Three transport modes:
# Blocking
curl -X POST http://localhost:3000/chat \
-H "Content-Type: application/json" \
-d '{"message": "write a haiku about Rust"}'
# Streaming (Server-Sent Events)
curl -X POST http://localhost:3000/chat/stream \
-H "Content-Type: application/json" \
-d '{"message": "explain async/await in 3 sentences"}'
# WebSocket: ws://localhost:3000/chat/ws
# Send: {"message": "..."} → Receive: text chunks ... {"done": true}
Prometheus-compatible metrics at /metrics. Health check at /health.
Cron Scheduling
Run agent tasks on a schedule with a single env var:
GARUDUST_CRON_JOBS="0 9 * * *=Write a morning briefing and save to ~/briefing.md" \
garudust-server --anthropic-key sk-ant-...
Combined with the skill system and a platform adapter, this becomes a powerful automation layer — daily social media posts, standup summaries, report generation, or any workflow you can describe in plain text.
Built-in Tools
| Tool | What it does | Since |
|---|---|---|
web_fetch |
Fetch any URL (capped at 512 KB) | v0.1.0 |
web_search |
Brave Search API; falls back to DuckDuckGo | v0.1.0 |
browser |
Control Chrome via CDP — navigate, click, screenshot, JS | v0.1.0 |
read_file / write_file
|
Filesystem access | v0.1.0 |
terminal |
Run shell commands (sandboxed via Docker, opt-in) | v0.1.0 |
memory |
Persistent key-value store | v0.1.0 |
session_search |
Full-text search across past conversations (SQLite FTS5) | v0.1.0 |
delegate_task |
Spawn parallel sub-agents; results returned in order | v0.1.0 |
list_directory |
Browse filesystem | v0.2.0 |
http_request |
Make REST API calls directly without curl | v0.2.0 |
pdf_read |
Extract text from PDF files | v0.2.0 |
write_skill |
Create or update skills on the fly | v0.1.0 |
MCP Support
Connect any MCP server in ~/.garudust/config.yaml:
mcp_servers:
- name: filesystem
command: npx
args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
- name: postgres
command: npx
args: ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"]
Security
A few things worth highlighting for anyone self-hosting:
-
Terminal sandbox — shell commands run inside a hardened Docker container (
--cap-drop ALL,--security-opt no-new-privileges:true, tmpfs/tmp). Opt-in viaterminal_sandbox: dockerin config. - ConstitutionalApprover — LLM-based approval gate for commands, not regex that can be bypassed via shell obfuscation.
-
DNS TOCTOU fix — a custom
SafeResolvercloses the gap where a hostname could resolve to a private IP between the check and the actual request. -
Prompt injection protection — recalled memory is wrapped in
<untrusted_memory>tags so the model treats it as data, not instructions. - Structured audit log — every tool call is logged at INFO level with session ID, tool name, and arguments.
- LLM retry with exponential backoff — transient 429/5xx errors are retried automatically.
Usage Footer
Every completed task appends a summary (added in v0.2.2):
[5 iter | 24657in 689out tok | ~$0.003 @ Qwen3-14B-AWQ]
Iterations, token counts, estimated cost, and model — useful for understanding what tasks actually cost at scale.
Self-Hosted LLM Support
If you run your own LLM (vLLM, Ollama), Garudust connects with zero friction:
# Ollama
OLLAMA_BASE_URL=http://localhost:11434
GARUDUST_MODEL=llama3.2
# vLLM
VLLM_BASE_URL=http://localhost:8000/v1
VLLM_API_KEY=token-abc123
GARUDUST_MODEL=Qwen/Qwen3-14B-AWQ
No cloud required. Your data stays local.
Docker
echo "OPENROUTER_API_KEY=sk-or-..." > .env
docker compose up
The compose file is included in the repo.
Contributing
Good first issues:
-
New tool — wrap any CLI or API as a
Toolimpl ingarudust-tools -
New platform — implement
PlatformAdapter(Signal, etc.) - Improve TUI — multi-line input, syntax highlighting, mouse support
- Tests — integration, property, snapshot
git clone https://github.com/garudust-org/garudust-agent
cd garudust-agent
cargo build
cargo test --workspace
cargo clippy --workspace --all-targets -- -W clippy::all -W clippy::pedantic
Links
- Repo: github.com/garudust-org/garudust-agent
- Skill & Tool Hub: github.com/garudust-org/garudust-hub
- crates.io: crates.io/crates/garudust
- docs.rs: docs.rs/garudust-agent
- Latest release: v0.2.8
- License: MIT
Built by developers who got tired of agent frameworks that weigh a gigabyte, forget everything, and require three cloud services to run "locally."
Top comments (0)