Running your own AI chatbot shouldn't require a PhD or a cloud bill.
ChatterMate is an open-source (AGPL-licensed) AI chat agent built from the ground up to work with LLMs. Pair it with Ollama and you get a fully local AI assistant: no API keys, no third-party data processing.
Why Self-Host?
- Privacy: Customer conversations never leave your infrastructure
- Cost: No per-token API charges
- Control: Pick your model, tune your prompts, own your data
Quick Start
- Clone the repo: git clone https://github.com/chattermate/chattermate.chat
- Use Docker Compose to spin everything up
- Point it at your Ollama instance
- You're live
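The "point it at your Ollama instance" step amounts to telling the backend where Ollama's endpoint lives. A minimal sketch of that wiring, assuming the backend reads a base URL from the environment (the variable name here is hypothetical, not ChatterMate's actual setting; the port is Ollama's default):

```python
import os

# Hypothetical config lookup -- ChatterMate's real setting name may differ.
# http://localhost:11434/v1 is Ollama's default OpenAI-compatible base URL.
OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434/v1")

def resolve_llm_endpoint(path: str = "/chat/completions") -> str:
    """Join the configured base URL with an API path."""
    return OLLAMA_BASE_URL.rstrip("/") + path

print(resolve_llm_endpoint())
```

Swapping in a hosted OpenAI-compatible provider is then just a matter of changing the environment variable, no code changes required.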
The Stack
- Backend: Python
- Frontend: Vue.js
- LLM: Ollama (local) or any OpenAI-compatible API
- Database: PostgreSQL
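Because the LLM layer speaks the OpenAI-compatible API, the same request shape works against local Ollama or a hosted provider. A sketch of the minimal chat-completion request body such a call sends (the model name "llama3" is an illustrative assumption):

```python
import json

def build_chat_request(model: str, user_message: str, stream: bool = False) -> str:
    """Serialize a minimal OpenAI-compatible chat completion request,
    the JSON shape accepted at /v1/chat/completions."""
    payload = {
        "model": model,  # e.g. an Ollama model tag such as "llama3"
        "messages": [{"role": "user", "content": user_message}],
        "stream": stream,
    }
    return json.dumps(payload)

request_body = build_chat_request("llama3", "Hello, ChatterMate!")
print(request_body)
```

POSTing that body to the endpoint returns a standard chat-completion response, which is what makes the local and hosted backends interchangeable.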
Current Stats
58 GitHub stars, 26 forks, 4 contributors, v1.0.9 released.
We're actively looking for contributors — the issue tracker is clean and we're happy to mentor first-time OSS contributors.
Check it out: github.com/chattermate/chattermate.chat