I kept missing things. Not because I wasn't paying attention, but because the signal was spread across too many tools. A Slack message here, a GitHub issue there, an email I skimmed and forgot about. By the time I connected the dots it was too late.
So I built OWL. It watches your data sources, builds a knowledge graph locally, and runs LLM analysis on a schedule to find things you wouldn't have caught yourself.
## How it works
You connect your sources. Gmail, Google Calendar, GitHub, Slack, Shopify, local files. OWL pulls events from all of them into a SQLite database and maps out entities and relationships.
Then, on a schedule, it runs discovery passes through your LLM. Not summaries: it looks for cross-source correlations, anomalies, patterns, and connections.
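As a sketch, the ingestion side can be pictured as normalizing every source's events into one shape and indexing them by entity (the field names here are my assumptions, not OWL's actual SQLite schema):

```typescript
// Hypothetical normalized event shape; field names are assumptions,
// not OWL's actual schema.
interface RawEvent {
  source: string;      // e.g. "gmail", "github", "shopify"
  externalId: string;  // id in the source system
  occurredAt: string;  // ISO-8601 timestamp
  entities: string[];  // entities mentioned in the event
  summary: string;
}

// Index events by entity so a discovery pass can spot one entity
// showing up across several sources in a short window.
function groupByEntity(events: RawEvent[]): Map<string, RawEvent[]> {
  const byEntity = new Map<string, RawEvent[]>();
  for (const event of events) {
    for (const entity of event.entities) {
      const bucket = byEntity.get(entity) ?? [];
      bucket.push(event);
      byEntity.set(entity, bucket);
    }
  }
  return byEntity;
}
```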
For example: one company showed up in a GitHub issue, three Shopify orders, and an urgent onboarding email, all in the same day. Normally they generate maybe one event a week. No single app would have flagged that. OWL did.
## What it looks like
The web dashboard has a D3 force-directed knowledge graph showing all your entities and how they connect. Discoveries show up in a feed with urgency levels and confidence scores traced back to source signals.
There's also an Electron desktop app with a system tray, native notifications when new discoveries come in, and a global hotkey (Ctrl+Shift+O) to pull it up from anywhere.
## The 5 ways to run it
- CLI: `npm install -g owl-ai && owl setup && owl start`
- Desktop app: Electron with the knowledge graph, native notifications, and system tray
- Web dashboard: `owl dashboard` opens localhost with the full UI
- Docker: `docker compose up -d`, done
- MCP server: add it to Claude Desktop, Cursor, or Windsurf and they can query your world model directly
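For Claude Desktop specifically, MCP servers are registered in `claude_desktop_config.json`; a hypothetical entry might look like this (the `owl mcp` invocation is my assumption, so check the docs for the actual command):

```json
{
  "mcpServers": {
    "owl": {
      "command": "owl",
      "args": ["mcp"]
    }
  }
}
```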
## It's fully local
This was non-negotiable for me. Everything runs on your machine. The SQLite database, the daemon, the dashboard. The only external calls are to your data source APIs and whatever LLM you configure.
Works with Ollama if you want zero cloud. Works with OpenAI or Anthropic if you prefer that. No accounts, no telemetry, no data leaving your machine.
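As an illustration, switching providers might be a one-line config change (the key names and config shape are my assumptions; only Ollama's default local endpoint is a known value):

```typescript
// Illustrative LLM provider config; key names are assumptions,
// not OWL's actual config format.
type Provider = "ollama" | "openai" | "anthropic";

const llmConfig: { provider: Provider; model: string; baseUrl?: string } = {
  provider: "ollama",
  model: "llama3.1",
  baseUrl: "http://localhost:11434", // Ollama's default local endpoint
};

// Zero-cloud check: with Ollama, inference never leaves the machine.
const fullyLocal = llmConfig.provider === "ollama";
```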
## The discovery engine
OWL doesn't just store data. It has three scan types that run on cron:
- Quick scans every 30 minutes for urgent stuff
- Deep scans every 6 hours for cross-source analysis
- Daily reviews every morning for patterns and situations
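Rendered as cron expressions, that cadence looks roughly like this (the expressions are my own rendering of the schedule described above, not OWL's config, and the 07:00 daily time is an assumption):

```typescript
// Scan cadence as standard five-field cron expressions (illustrative;
// the 07:00 time for the daily review is an assumption).
const scanSchedule: Record<string, string> = {
  quick: "*/30 * * * *", // every 30 minutes: urgent signals
  deep:  "0 */6 * * *",  // every 6 hours: cross-source analysis
  daily: "0 7 * * *",    // every morning: patterns and situations
};
```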
It also learns from your feedback. If you dismiss a discovery it dampens that type. If you engage with one it boosts similar signals. Over time the noise drops and the signal gets sharper.
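A minimal sketch of that feedback loop, assuming a per-discovery-type weight with illustrative boost/dampen factors and a clamp range (none of these constants are OWL's actual values):

```typescript
// Per-type weight, boosted on engagement and dampened on dismissal.
// The 1.2/0.8 factors and the [0.1, 2.0] clamp are illustrative.
const weights = new Map<string, number>();

function applyFeedback(type: string, engaged: boolean): number {
  const current = weights.get(type) ?? 1.0;
  const next = engaged ? current * 1.2 : current * 0.8;
  const clamped = Math.min(2.0, Math.max(0.1, next));
  weights.set(type, clamped);
  return clamped;
}
```

Scoring a candidate discovery would then multiply its raw confidence by the weight for its type, so repeatedly dismissed types sink below the surfacing threshold over time.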
## Delivery channels
Discoveries go wherever you already are:
- Terminal (CLI)
- Telegram (with conversational follow-up by replying)
- Slack (Block Kit formatting, thread-based follow-up)
- Discord (rich embeds with urgency colors)
- Email digest (batched HTML, grouped by entity/theme)
- Webhooks (POST JSON to n8n, Zapier, whatever)
- RSS/Atom feed
- WhatsApp
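For the webhook channel, the JSON body might look something like this (the field names are my assumptions based on the urgency levels and confidence scores mentioned above, not OWL's actual schema):

```typescript
// Hypothetical discovery payload POSTed to a webhook; field names
// are assumptions, not OWL's actual schema.
interface DiscoveryPayload {
  title: string;
  urgency: "low" | "medium" | "high";
  confidence: number; // 0..1, traced back to source signals
  sources: string[];
}

const example: DiscoveryPayload = {
  title: "Acme Corp: activity spike across 3 sources in one day",
  urgency: "high",
  confidence: 0.82,
  sources: ["github", "shopify", "gmail"],
};

// An n8n or Zapier workflow would receive this as the request body.
const body = JSON.stringify(example);
```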
## The plugin system
Each data source is a plugin with a simple contract: `setup`, `watch`, and `query`, plus a metadata file. There are seven built-in plugins, but the goal is that someone could write a new one in a day. If you have a data source you want connected, I'm happy to help or take a PR.
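That contract might look roughly like this in TypeScript (the method signatures and metadata shape are my guesses from the description, not OWL's real plugin API):

```typescript
interface OwlEvent {
  externalId: string;
  occurredAt: string; // ISO-8601
  summary: string;
}

// The setup/watch/query contract described above; signatures are
// assumptions, not OWL's actual plugin API.
interface SourcePlugin {
  metadata: { name: string; version: string };
  setup(config: Record<string, string>): Promise<void>;  // auth, first sync
  watch(onEvent: (e: OwlEvent) => void): Promise<void>;  // emit new events
  query(q: string): Promise<OwlEvent[]>;                 // on-demand lookup
}

// A stub plugin showing the shape of an implementation.
const stubPlugin: SourcePlugin = {
  metadata: { name: "local-files", version: "0.1.0" },
  async setup() { /* nothing to configure in the stub */ },
  async watch(onEvent) {
    onEvent({ externalId: "1", occurredAt: new Date().toISOString(), summary: "file changed" });
  },
  async query() { return []; },
};
```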
## Some numbers
- 98 passing tests
- 7 data source plugins
- 8 delivery channels
- 5 deployment modes
- ~8,000 lines of code
- MIT license
## Try it
```sh
npm install -g owl-ai
owl demo
owl dashboard
```
The demo seeds fake data so you can see the full dashboard and knowledge graph without connecting anything real.
Repo: github.com/msaule/owl
Docs: msaule.github.io/owl
If you have questions about the architecture or want to contribute a plugin, I'm around!