I’ve always loved small, fast workflows that make the computer feel like an extension of thought.
A few weeks ago, while juggling docs, terminals, and editors, I realized: why not bring AI directly to the clipboard?
With Gemma 3 running locally through Ollama, and a macOS Shortcut bound to a hotkey, we can get instant grammar fixes, translations, and command synthesis in seconds — entirely offline.
No web UI, no IDE plugins, just:
copy → press → paste → done
Why Build This Instead of Using Built-In Grammar Tools?
Most apps already try to help you write “better”: Google Docs, Slack, Notion, even IDEs highlight typos.
But none of them feel as consistent or fast as a hotkey you can call anywhere - terminal, browser, or email.
Here’s what this Shortcut + Gemma 3 combo gives you:
- Consistency across apps - One shortcut works everywhere, no context switching.
- Custom behavior - A one-line prompt defines your tone, brevity, formatting, or dialect.
- Privacy & offline use - Everything runs locally via Ollama. Nothing leaves your machine.
- Beyond grammar - The same setup can do translation, summarization, regex generation, or log cleanup.
Why Gemma 3 for Hotkeys?
Hotkey workflows live or die by latency.
Gemma 3 offers parameter sizes like 270M, 1B, and 4B - small enough to run smoothly on Apple Silicon and most modern CPUs.
On my M2 Pro, the 4B model responds in under a second for short text.
That’s fast enough to keep flow intact, even mid-sentence.
Smaller quantized versions (4-bit) reduce memory footprint further, which helps when you’ve got your editor, browser, Docker, and Slack all open - because, let’s be honest, that’s how we all work.
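If you want to try one of those quantized builds, the Ollama library publishes quantization-aware-trained (QAT) tags for Gemma 3. The tag name below reflects the library listing at the time of writing and may change, so double-check on the Ollama model page before pulling:
ollama pull gemma3:4b-it-qat   # 4-bit QAT build of the 4B model
It behaves the same in the scripts below - just swap the model name.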
Privacy and Offline Capability
Once Gemma 3’s weights are downloaded, everything happens locally.
No data leaves your machine - perfect for working with code, logs, or internal docs.
You can even work offline on a plane or behind a firewall. The Shortcut still fires instantly.
Developer Integration and Simplicity
Gemma 3’s instruction-tuned variants respond to natural, concise prompts like:
“Fix grammar”
“Translate to French”
“Summarize this paragraph”
That aligns perfectly with small, predictable tasks you want on a hotkey.
Ollama exposes both a CLI and a local API (compatible with OpenAI’s Chat Completions format).
That means you can integrate through simple shell pipes like:
pbpaste | ollama run gemma3:4b | pbcopy
Or you can call a local HTTP endpoint from a Shortcut.
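If you prefer the HTTP route (for example from a Shortcut's "Get Contents of URL" action), a minimal sketch against Ollama's local endpoint looks like this - 11434 is Ollama's default port, and the example sentence is just an illustration:
curl -s http://localhost:11434/api/generate -d '{
  "model": "gemma3:4b",
  "prompt": "Fix grammar: Their going to the store tomorow.",
  "stream": false
}'
With stream set to false you get a single JSON object back instead of a token stream, and the corrected text sits in its response field.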
Setup: macOS + Ollama + Gemma 3
Install Ollama
brew install ollama
ollama --version
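One thing that trips people up: Homebrew installs the CLI, but the local server has to be running before ollama run or the API will answer. Assuming you installed via the Homebrew formula above, either of these works:
brew services start ollama   # keep the server running in the background
# or, for a one-off session in a spare terminal:
ollama serve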
Pull the Model
ollama pull gemma3:4b
Other sizes you can try:
| Variant | Description |
| --- | --- |
| gemma3:270m | Ultra-light, very fast |
| gemma3:1b | Balance of size and speed |
| gemma3:4b | Great for grammar & translation |
| gemma3:12b+ | Highest quality, slower |
Then verify the model:
echo "hello" | ollama run gemma3:4b
If you see a reply - you’re good to go.
Prompt Files
I keep my prompts in plain .txt files under ~/prompts.
That makes versioning and editing easy.
Here’s an example for grammar correction:
You are an expert copy editor. Given the text, return the same text with grammar, spelling, and punctuation corrected only.
Rules:
* Do not rephrase, shorten, or change meaning or tone.
* Preserve Markdown, emojis, URLs, and code blocks.
* Keep the author's dialect if evident; otherwise default to American English.
* Apply the minimal change necessary.
* Output only the corrected text - no explanations or code fences.
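To mirror that layout, create the folder once and drop the prompt in - here I'm saving it straight from the clipboard, but any editor works:
mkdir -p ~/prompts
# with the prompt above copied, write it to the file the script expects:
pbpaste > ~/prompts/grammar_correction_prompt.txt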
Clipboard → Model → Clipboard
This small shell script connects everything:
#!/bin/zsh
# Shortcuts runs scripts with a minimal environment, so set PATH and UTF-8 locale explicitly
export PATH="/opt/homebrew/bin:/usr/local/bin:/usr/bin:/bin"
export LANG=en_US.UTF-8
export LC_ALL=en_US.UTF-8

# Grab the current clipboard contents
TEXT="$(pbpaste)"

# Load the system prompt from its plain-text file
PROMPT_FILE="$HOME/prompts/grammar_correction_prompt.txt"
PROMPT=$(<"$PROMPT_FILE")

# Feed prompt + clipboard text to the model and capture the corrected output
OUTPUT="$(printf '%s\n\n%s\n' "$PROMPT" "$TEXT" | ollama run gemma3:4b)"

# Put the corrected text back on the clipboard
printf "%s\n" "$OUTPUT" | pbcopy
echo "✅ Done! Paste corrected text and enjoy."
You can already run it manually once it's executable:
chmod +x grammar_correct.sh
./grammar_correct.sh
Since the script reads from pbpaste and writes to pbcopy itself, there's nothing to pipe in or out - just run it with text on the clipboard.
Building the Shortcut
- Open Shortcuts on macOS.
- Create a new Shortcut → Run Shell Script.
- Set the shell to /bin/zsh.
- Paste the script above.
- Under Details, enable Use as Quick Action.
- Assign a keyboard shortcut — mine is ⌃⌘G.
Now the loop is complete:
Copy → Press Hotkey → Paste → Done.
Testing It
Copy any text - maybe an email draft or a snippet of documentation.
Press your shortcut, then paste.
You’ll see corrected text instantly, without ever leaving your current app.
I’ve been using this daily - for proofreading commits, cleaning up release notes, or rewording code comments.
It’s one of those automations that quietly changes how you work.
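The same pattern stretches past grammar, too. As a quick sketch (the prompt wording here is my own, not taken from any prompt library), a one-off translation straight from the shell looks like:
printf 'Translate the following text to French. Output only the translation.\n\n%s\n' "$(pbpaste)" | ollama run gemma3:4b | pbcopy
Save a copy of the script with a different prompt file, bind it to another hotkey, and you have a translation shortcut living next to the grammar one.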
When to Use (and When Not To)
Great for:
- Clipboard-sized text: short paragraphs, commit messages, doc comments.
- Low-latency tasks where speed matters more than reasoning.
- Privacy-sensitive material - logs, code, internal text.
- Offline or restricted environments.
- Repeatable micro-automations you want to trigger anywhere.
Not ideal for:
- Multi-file code refactors or big design reasoning.
- Anything that needs long context memory.
- Interactive IDE features like autocomplete or refactoring hints.
Think of it as a command-line Copilot - but local, minimal, predictable.
Final Thoughts
There’s something deeply satisfying about an AI workflow that feels native - not glued on.
Gemma 3 + macOS Shortcuts achieves exactly that: no login, no latency, no cloud.
If you love building small tools that make your daily flow smoother, this one’s worth 15 minutes of setup.
It quickly becomes muscle memory - copy, hotkey, paste, done.
For more detailed instructions, visit the full post.