Every "invisible AI assistant" I tried was secretly running a full Chromium browser. 200MB of RAM before you type a single character.
So I built Veil — a ~5MB native Swift macOS app that does the same thing. Invisible to Zoom, Teams, OBS, QuickTime. Connects to Ollama, OpenAI, Claude, and more.
The problem with browser-based alternatives
The "invisible AI" space exploded after Cluely went viral. But look at what everyone built:
Pluely → Tauri + WebView
Natively → Electron (~150MB)
Vysper → Electron (~200MB)
All of them run a chat window inside a browser engine (Electron bundles all of Chromium; Tauri at least reuses the system WebView). On macOS, that's unnecessary — AppKit can do everything natively, faster, with a fraction of the resources.
How the invisibility actually works
One line of AppKit:
```swift
window.sharingType = .none
```
That's it. NSWindow.sharingType is a public, documented AppKit API that controls whether a window's contents are available to macOS's capture pipeline.
Setting it to .none tells the window server to exclude the window from all capture operations, so Zoom, OBS, CGWindowListCreateImage, and ScreenCaptureKit never see it.
The window renders normally on your physical display. It simply does not exist to screen recorders.
No hacks. No injection. No overlay tricks. A documented public API that's been in AppKit for years.
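In context, the one-liner sits in ordinary window setup. A minimal sketch of a capture-excluded overlay window (the function name and geometry are illustrative, not Veil's actual code):

```swift
import AppKit

// Illustrative sketch: a borderless overlay window that screen
// recorders cannot capture. Names and sizes here are hypothetical.
func makeOverlayWindow() -> NSWindow {
    let window = NSWindow(
        contentRect: NSRect(x: 100, y: 100, width: 480, height: 320),
        styleMask: [.borderless],
        backing: .buffered,
        defer: false
    )
    // The one line that matters: exclude this window from all
    // display-capture operations (Zoom, OBS, screenshots, ...).
    window.sharingType = .none
    // Float above normal windows so it stays visible to the user.
    window.level = .floating
    return window
}
```

The window still draws on the physical display exactly as before; only the capture pipeline treats it as nonexistent.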
Why native Swift matters
When you build with AppKit instead of Electron:
App size: ~5MB vs 150–500MB
Startup time: instant vs 2–5 seconds
RAM at idle: ~30MB vs 200MB+
macOS integration: native — no Dock icon, no Mission Control entry, proper menu bar citizen
If you're building a macOS-only tool, there's no reason to ship Chromium.
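The "no Dock icon, no Mission Control entry" behavior is also standard AppKit: an app can declare itself an accessory. A sketch, assuming nothing about Veil's internals (the function name is mine):

```swift
import AppKit

// Sketch: hide the app from the Dock and the app switcher by running
// it as an "accessory" app. This is the programmatic equivalent of
// setting LSUIElement in Info.plist; the function name is illustrative.
func runAsMenuBarApp(_ app: NSApplication = .shared) {
    // .accessory = no Dock icon, no Mission Control entry, but the app
    // can still present windows and a status-bar item.
    app.setActivationPolicy(.accessory)
}
```

Combined with an NSStatusItem, this is all it takes to be a "proper menu bar citizen" without shipping a browser.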
What Veil supports
Backends: Ollama, OpenAI, Anthropic, OpenRouter, NVIDIA NIM (free tier — hundreds of models, no credit card), LM Studio, llama.cpp. Any OpenAI-compatible endpoint works.
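Because every backend above speaks the OpenAI chat-completions wire format, one request builder can cover all of them. A rough Foundation sketch (the types, function name, and the Ollama base URL are illustrative, not Veil's code):

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking  // URLRequest on Linux
#endif

// Hypothetical minimal types for the OpenAI-compatible chat format.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

// Build a POST to any OpenAI-compatible /v1/chat/completions endpoint.
// Local backends (Ollama, LM Studio, llama.cpp) need no API key.
func makeChatRequest(baseURL: URL, model: String, prompt: String,
                     apiKey: String? = nil) throws -> URLRequest {
    var request = URLRequest(
        url: baseURL.appendingPathComponent("v1/chat/completions"))
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    if let key = apiKey {
        request.setValue("Bearer \(key)", forHTTPHeaderField: "Authorization")
    }
    let body = ChatRequest(
        model: model,
        messages: [ChatMessage(role: "user", content: prompt)])
    request.httpBody = try JSONEncoder().encode(body)
    return request
}
```

Switching from, say, Ollama to OpenRouter then only changes the base URL, model name, and key.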
Voice input via whisper.cpp — transcription runs fully on-device, so audio never leaves your machine.
Screenshot analysis — capture your screen, attach to the message. The AI sees your screen. The screen recorder doesn't see Veil.
Use cases
The obvious one is technical interviews — LeetCode, HackerRank, system design rounds. Ask for hints, complexity analysis, architecture patterns while sharing your screen. But it's also useful for any situation where you want AI assistance without it being visible: live demos, client calls, presentations.
The project
GitHub: https://github.com/rbc33/Veil
MIT license, free forever, no telemetry, no account needed.
If you're on macOS and paying for Cluely — or using one of the Electron alternatives — give it a try. And if you're a Swift developer, PRs are very welcome.