Open WebUI
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various Large Language Model (LLM) runners, including Ollama and OpenAI-compatible APIs, and ships with a built-in inference engine for Retrieval-Augmented Generation (RAG). It provides a ChatGPT-style interface for interacting with local or cloud-based models, and the project is actively developed with continuous updates.
Warnings
- breaking When using Docker, the `:main` tag always points to the latest build and can include breaking changes without warning. For production or stability, it is recommended to pin a specific version tag (e.g., `:v0.8.6`).
- gotcha A persistent `WEBUI_SECRET_KEY` environment variable is crucial. Without it, Open WebUI generates a random key on each restart, leading to user logouts and 'Error decrypting tokens' for essential secrets like OAuth tokens or API keys.
- gotcha Ensure you are using Python 3.11 for installation to avoid compatibility issues. The library requires Python <3.13.0a1,>=3.11.
- gotcha For Docker deployments with Ollama, connection issues often arise if the WebUI container cannot reach the Ollama server. Using `--network=host` and setting `OLLAMA_BASE_URL` (e.g., `http://127.0.0.1:11434`) in your Docker run command can resolve this.
- gotcha Enabling `ENABLE_REALTIME_CHAT_SAVE` in production is strongly discouraged. It causes massive database write pressure and severe performance degradation under concurrent load.
- gotcha After updating Open WebUI, the UI may appear broken because the browser is still serving stale frontend assets. A hard refresh (Ctrl+Shift+R, or Cmd+Shift+R on macOS) or clearing the browser cache typically resolves this.
- gotcha Workspace Tools and Functions execute arbitrary Python code on your server. Restrict Workspace access to trusted administrators only, as granting tool creation/import access is equivalent to giving shell access.
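Several of the Docker-related warnings above can be addressed in a single run command. A minimal sketch, assuming you want the Ollama-on-host setup; the version tag, secret value, and volume name are illustrative placeholders:

```shell
# Pin a specific version tag instead of :main to avoid surprise breaking changes.
# WEBUI_SECRET_KEY is set explicitly so sessions and encrypted tokens survive restarts.
# --network=host lets the container reach an Ollama server on the host at 127.0.0.1:11434.
docker run -d \
  --name open-webui \
  --network=host \
  -v open-webui:/app/backend/data \
  -e WEBUI_SECRET_KEY="change-me-to-a-long-random-string" \
  -e OLLAMA_BASE_URL="http://127.0.0.1:11434" \
  ghcr.io/open-webui/open-webui:v0.8.6
```

Note that with `--network=host` the port mapping flags are ignored; the UI listens directly on the host's port 8080.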
Install
pip install open-webui
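Because the package requires Python >=3.11,<3.13, installing into a dedicated virtual environment avoids interpreter-version conflicts. A sketch, assuming `python3.11` is available on your PATH (the environment name is a placeholder):

```shell
# Create and activate an isolated environment with a supported interpreter.
python3.11 -m venv openwebui-env
source openwebui-env/bin/activate
pip install open-webui
```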
Quickstart
pip install open-webui
open-webui serve
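When running the pip-installed server, the secret-key warning above applies as well. A sketch of a more explicit launch; the key value is a placeholder, and the `--host`/`--port` flags are assumed from the CLI:

```shell
# Keep the secret key stable across restarts to avoid logouts and token-decryption errors.
export WEBUI_SECRET_KEY="change-me-to-a-long-random-string"
open-webui serve --host 0.0.0.0 --port 8080
```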