The Problem: "Copy-Paste" and "AI Rephrase" Fatigue
How many times a day do you do this?
- Write a rough email or a Slack message.
- Realize it sounds too blunt or unprofessional.
- Copy the text.
- Alt-Tab to a browser.
- Paste it into ChatGPT/Claude with a "make this better" prompt.
- Copy the result.
- Alt-Tab back and paste.

It’s a workflow killer. I wanted a way to "fix" my writing exactly where I was typing, without the overhead of a heavy browser or a 500MB Electron app sitting in my RAM.
That’s why I built PhrasePoP.
What is PhrasePoP?
PhrasePoP is a minimalist, open-source desktop utility that "pops" up over any application via a global shortcut. It’s designed specifically to help you rephrase sentences, polish grammar, or turn quick bullet points into full email responses instantly.
✨ Key Features
- Global Overlay: Trigger it anywhere with a hotkey. It stays hidden until you need it.
- Rephrase & Reply: Turn "no time for this meeting" into a professional "I'm currently at capacity but would love to sync later."
- Privacy & Local AI: It supports Ollama and LocalAI. If you don't want your data leaving your machine, you can run everything locally.
- Cloud Support: Prefer speed? It also hooks into OpenAI, Anthropic, and other major providers.
- Lightweight AF: Built with Rust and Tauri, so it uses minimal resources.
Why I Chose the Rust + Tauri Stack 🦀
As developers, we care about our system resources. I didn't want another Chrome-instance-masked-as-an-app.
- Memory Footprint: By using Tauri, the frontend is rendered using the OS's native webview, and the backend logic is pure Rust. This results in idle RAM usage of about 50 MB, compared to the 400 MB+ typical of Electron apps.
- Security: Rust's memory safety makes handling clipboard data and API keys much more reliable.
- Speed: The "Pop" needs to be instant. The bridge between Rust and the webview ensures that the UI feels snappy and responsive.
Privacy First: Local LLMs
One of the biggest hurdles for AI tools in a professional setting is privacy. Many developers (and companies) aren't comfortable sending every internal email draft to a third-party API.
PhrasePoP allows you to point to a local endpoint. If you have Ollama running a model like llama3 or mistral, PhrasePoP can use it as the engine. Your drafts stay on your hardware.
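PhrasePoP itself is written in Rust, but the idea of "point at a local endpoint" is easy to sketch. Here's a minimal, illustrative Python version that talks to Ollama's default `/api/generate` endpoint on `localhost:11434` (the function names and prompt wording are my own for this example, not PhrasePoP's actual code):

```python
import json
import urllib.request

# Default local Ollama endpoint; no data leaves the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_rephrase_request(text: str, model: str = "llama3") -> dict:
    """Build the JSON payload for a one-shot (non-streaming) rephrase request."""
    prompt = f"Rephrase the following to sound professional:\n\n{text}"
    return {"model": model, "prompt": prompt, "stream": False}

def rephrase(text: str, model: str = "llama3") -> str:
    """Send the draft to the local model and return the rewritten text."""
    data = json.dumps(build_rephrase_request(text, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a local model, e.g. `ollama run llama3`):
# print(rephrase("no time for this meeting"))
```

Because Ollama and LocalAI both speak simple HTTP on localhost, swapping providers is just a matter of changing the endpoint URL and model name in your settings.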
Open Source & Future
PhrasePoP is 100% open source. I built it to solve my own frustration, but I’d love to see how the community uses it.
How you can help:
- Give it a star on GitHub if you find the concept cool!
- Contribute: I'm looking for help with better Linux window management and more "Writing Mode" templates.
- Feedback: What’s one writing task that drains your energy every day? Let’s automate it.
Check out the repo here: 👉 https://github.com/vinzify/PhrasePoP
I'd love to hear your thoughts in the comments! Do you prefer local LLMs for your workflow, or are you all-in on cloud APIs?