Cheikh Seck
Don't pay for an AI sandbox — build one locally with Docker

Don't pay for an AI sandbox. Build a secure one locally with Docker.

Here's a practical guide that spins up a locked-down environment with a TUI, a persistent config volume, and a proxy to your local Ollama:

https://github.com/cheikh2shift/godex/tree/main#running-securely-with-docker

Why this setup?

  • Runs locally (no vendor lock-in)
  • Sandboxed tools (bash, filesystem, webscraper)
  • Config persisted in a Docker volume
  • Works from any project directory via WORKSPACE_DIR
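To make those pieces concrete, a compose file for this kind of setup might look roughly like the following. This is a hypothetical sketch, not the repo's actual file (which lives at the link above) — the image name, mount paths, and proxy hostname here are all assumptions:

```yaml
# Hypothetical sketch of a compose file matching the setup described above.
services:
  godex:
    image: godex:latest                # assumed image name
    container_name: godex
    user: "${UID}:${GID}"              # run as the host user
    stdin_open: true                   # required for `docker attach`
    tty: true                          # the TUI needs a terminal
    volumes:
      - godex-config:/home/godex/.config    # config persisted in a volume
      - "${WORKSPACE_DIR}:/workspace"        # mount the current project dir
    environment:
      - OLLAMA_HOST=http://ollama-proxy:11434   # assumed proxy service name

volumes:
  godex-config:
```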

Quick start

UID=$(id -u) GID=$(id -g) WORKSPACE_DIR="$PWD" \
  docker compose -f "$HOME/godex/docker-compose.yml" up -d \
  && docker attach godex \
  && docker compose -f "$HOME/godex/docker-compose.yml" down
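That one-liner is a mouthful, so you might wrap it in a shell function in your `~/.bashrc`. A sketch (the `godex` function name is my own; `env` is used because `UID` is a read-only variable in bash, so a plain `UID=...` prefix would fail there):

```shell
# Sketch: run the godex sandbox against the current directory,
# then tear it down when the TUI session ends.
godex() {
  local compose="$HOME/godex/docker-compose.yml"
  # `env` sets UID/GID in the child's environment without
  # tripping over bash's read-only UID variable.
  env UID="$(id -u)" GID="$(id -g)" WORKSPACE_DIR="$PWD" \
    docker compose -f "$compose" up -d \
    && docker attach godex
  docker compose -f "$compose" down
}
```

Then `cd` into any project and run `godex` to open the sandbox on that directory.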

Ollama note
If you're using a host Ollama instance via the nginx proxy, make sure Ollama listens on 0.0.0.0:11434 (not just 127.0.0.1).
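Two ways to change the bind address, depending on how Ollama is installed (service configuration, so adapt to your setup):

```shell
# Option 1: run Ollama in the foreground with the bind address overridden.
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Option 2 (Linux systemd installs): persist the override for the service.
sudo systemctl edit ollama
# In the editor, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"
sudo systemctl restart ollama
```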

#security #docker #opensource
