Ben Racicot

Posted on • Originally published at modelpiper.com

Fix Ollama CORS Errors on Mac: One Environment Variable

You pointed a web app at localhost:11434 and got nothing back. The browser console shows a CORS policy error. Ollama blocked your request on purpose.

The fix is one environment variable.

The fix

launchctl setenv OLLAMA_ORIGINS "*" && pkill Ollama; open -a Ollama

Sets OLLAMA_ORIGINS and restarts Ollama. Works in about five seconds.

This doesn't persist across reboots. For a permanent fix, add to ~/.zshrc - note this only reaches Ollama when you start it from a shell (ollama serve); the Mac app launched from Finder doesn't read shell config, so rerun the launchctl command after a reboot if you use the app:

export OLLAMA_ORIGINS="*"

To scope it to specific origins instead of a wildcard:

launchctl setenv OLLAMA_ORIGINS "http://localhost:4200,http://localhost:3000"

Comma-separated, no spaces. Each origin is an exact match including port.
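You can check the scoping without opening a browser by replaying the preflight Ollama will see. A sketch, assuming Ollama is running on the default port - swap in your own origin:

```shell
# Replay the browser's preflight: an OPTIONS request with an Origin header.
# After the fix, the response should include an Access-Control-Allow-Origin line.
curl -si -X OPTIONS http://localhost:11434/api/tags \
  -H "Origin: http://localhost:4200" \
  -H "Access-Control-Request-Method: GET" | grep -i "access-control"
```

No output from grep means the origin wasn't matched - check for typos, a missing scheme, or a wrong port.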

Verify it worked

Open your browser's developer console:

fetch('http://localhost:11434/api/tags')
  .then(r => r.json())
  .then(console.log)

Your model list should print. If you still get a CORS error, Ollama hasn't picked up the new variable. Confirm it's set with launchctl getenv OLLAMA_ORIGINS, then kill and restart: pkill Ollama; open -a Ollama.

Why this happens

Ollama serves on localhost:11434 without CORS headers. When a browser makes a cross-origin request, it sends a preflight OPTIONS request. Ollama responds without Access-Control-Allow-Origin, and the browser kills the actual request before it fires.

This is a security decision, not a bug. Ollama's API can pull, delete, load, and unload models - operations you don't want arbitrary webpages triggering. Shipping without permissive CORS headers means every browser-based client has to opt in explicitly.

The tradeoff is real. Security by default is the right call for an API server. But it means every new user hits the same wall on their first day.

Common gotchas

Setting disappears after reboot. launchctl setenv doesn't persist across macOS restarts. The ~/.zshrc export persists, but only applies when you start Ollama from a shell; for the Mac app, rerun launchctl setenv after each reboot.

Homebrew Ollama ignores shell variables. If you installed via brew services start ollama, it runs as a launchd service that doesn't inherit your shell environment. Use launchctl setenv at the system level, or edit the Homebrew plist directly.
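If you take the plist route, the key to add is an EnvironmentVariables dict inside the service definition's top-level dict. A minimal sketch, assuming the standard brew services filename (homebrew.mxcl.ollama.plist under ~/Library/LaunchAgents - the path may differ on your install):

```xml
<!-- inside the top-level <dict> of homebrew.mxcl.ollama.plist -->
<key>EnvironmentVariables</key>
<dict>
  <key>OLLAMA_ORIGINS</key>
  <string>*</string>
</dict>
```

Then restart the service so launchd rereads the plist: brew services restart ollama.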

CORS is fixed but requests hang. If the browser error is gone but responses take forever, Ollama might be loading a model on first request. Large models (7B+) take 3-5 seconds to load from disk. Wait for the first response - subsequent requests will be fast.

Is the wildcard safe? For local development, yes. Ollama only listens on localhost by default - remote machines can't reach it regardless of the CORS setting. The wildcard allows any webpage on your machine to make requests to the API, which is fine for development. Only scope to specific origins if you've bound Ollama to 0.0.0.0.
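If you do bind Ollama to your network, the same launchctl pattern applies; OLLAMA_HOST is Ollama's listen-address variable. A sketch, assuming the default port and a single known frontend origin:

```shell
# expose the API on all interfaces (reachable from other machines on the LAN)
launchctl setenv OLLAMA_HOST "0.0.0.0:11434"
# scope CORS to a known origin instead of the wildcard
launchctl setenv OLLAMA_ORIGINS "http://localhost:4200"
# restart so Ollama picks both up
pkill Ollama; open -a Ollama
```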

The alternative

If managing environment variables per-tool isn't your idea of a good time, ToolPiper bundles llama.cpp directly with CORS headers built in. Same GGUF models, same Metal GPU acceleration, zero configuration. It also connects to your existing Ollama instance, so you can use both without choosing.

Full article with more edge cases
