Open WebUI is an extensible, self-hosted AI interface that supports Ollama, OpenAI-compatible APIs, and more. It's the ChatGPT-like UI you can run on your own hardware.
## Why Self-Host Your AI Chat
A company needed an internal AI assistant but couldn't send proprietary data to OpenAI. Open WebUI gave them a ChatGPT-like experience running entirely on their servers with local LLMs.
**Key Features:**
- Multi-Model — Ollama, OpenAI, Anthropic, and any OpenAI-compatible API
- RAG Built-In — Upload documents and chat with them
- Multi-User — User management with role-based access
- Customizable — Custom models, system prompts, and tools
- Plugin System — Extend with custom tools and functions
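The plugin system works by loading Python files that define a `Tools` class: each public method (with type hints and a docstring) becomes a tool the model can invoke. A minimal sketch, assuming the `Tools`-class convention described in the Open WebUI docs (the method name and behavior here are illustrative):

```python
from datetime import datetime, timezone


class Tools:
    """A minimal Open WebUI tool file: each public method becomes a callable tool."""

    def get_current_utc_time(self) -> str:
        """
        Get the current UTC time as an ISO-8601 string.
        The docstring doubles as the description shown to the model,
        so it knows when this tool is worth calling.
        """
        return datetime.now(timezone.utc).isoformat()
```

You would upload a file like this through the Workspace's Tools section and enable it for a model; the model can then call it during a chat.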
## Quick Start
```shell
docker run -d -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```
### With Ollama
```shell
# Pull a model with Ollama (assumes Ollama is already installed and serving)
ollama pull llama3

# Start Open WebUI connected to the host's Ollama instance
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```
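For a longer-lived setup, the same two services can run together under Docker Compose. A sketch using the images and ports from the commands above (the service names, volume names, and the official `ollama/ollama` image tag are assumptions to verify against your setup):

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # Containers on the same Compose network reach Ollama by service name
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama

volumes:
  ollama:
  open-webui:
```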
## Why Choose Open WebUI
- Data privacy — everything runs on your infrastructure
- Multi-model — switch between models seamlessly
- RAG built-in — document Q&A without extra setup
- Team ready — multi-user with access controls
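Because the backend exposes an OpenAI-compatible API, scripts and other services can talk to it directly. A sketch that builds such a request with only the standard library (the `/api/chat/completions` path follows the OpenAI convention Open WebUI adopts; the base URL and API key below are placeholders for your own instance and key):

```python
import json
import urllib.request

BASE_URL = "http://localhost:3000"  # where Open WebUI is listening (placeholder)
API_KEY = "your-api-key"            # an API key created in your Open WebUI account


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request against Open WebUI."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/api/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# To actually send it (requires a running instance):
# urllib.request.urlopen(build_chat_request("llama3", "Hello!"))
```

Any model available in the UI can be addressed this way, so the same gateway serves both human chat and programmatic use.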
Check out Open WebUI docs to get started.
Need AI infrastructure? Check out my Apify actors or email spinov001@gmail.com for custom solutions.