David Simões

How I Built a Privacy-First AI Assistant That Runs 100% Locally with Node.js and Ollama

Most AI assistants today are cloud-dependent: your data leaves your device, travels to external servers, and you rely on third-party APIs.

I wanted to explore a different approach.

So I built CrustAI — a self-hosted AI assistant that runs entirely on your machine, powered by local LLMs through Ollama, and integrated in real time with Telegram, WhatsApp, Discord and Slack.

Key ideas behind the project:

  • Total privacy (no data leaves your machine)
  • Real-time messaging integrations
  • Long-term memory between conversations
  • Offline speech-to-text and text-to-speech
  • Extensible Node.js architecture with REST API

This is not theoretical. You can clone the repo and run it today.
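To give a feel for how a local-first assistant talks to the model, here is a minimal sketch of calling Ollama's REST API from Node.js (18+, which ships a global `fetch`). The endpoint and payload shape follow Ollama's public `/api/generate` API; the model name `llama3` and the helper names are illustrative, not CrustAI's actual code.

```javascript
// Ollama listens on localhost by default, so no data leaves the machine.
const OLLAMA_URL = "http://localhost:11434/api/generate";

// Build the JSON body Ollama's /api/generate expects.
// stream: false asks for a single JSON response instead of a stream.
function buildGenerateRequest(model, prompt) {
  return { model, prompt, stream: false };
}

// Send a prompt to the local model and return the generated text.
async function askLocalModel(prompt, model = "llama3") {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, prompt)),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.response; // Ollama puts the generated text in "response"
}
```

A messaging integration (Telegram, Discord, etc.) then just forwards incoming messages to `askLocalModel` and replies with the result, keeping the whole loop on your machine.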

GitHub repository:
https://github.com/DaveSimoes/CrustAI

I’d love to hear thoughts from the community.
