Open WebUI

0.8.12 · active · verified Sat Apr 11

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various Large Language Model (LLM) runners, including Ollama and OpenAI-compatible APIs, and ships with a built-in inference engine for Retrieval-Augmented Generation (RAG). It provides a ChatGPT-style interface for interacting with local or cloud-based models, and the project is actively developed with frequent updates.

Install

Quickstart

Install Open WebUI via pip, then start the web server. The UI is typically accessible at http://localhost:8080 (or http://localhost:3000 when running the Docker image with its default port mapping rather than --network=host).

pip install open-webui
open-webui serve
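Once the server is up, Open WebUI also exposes an OpenAI-compatible chat-completions endpoint alongside the web UI. The sketch below shows one way to call it from Python, assuming the default port 8080, an API key generated in the UI's account settings, and a placeholder model name ("llama3.2") that you would replace with a model available in your instance:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"  # default for `open-webui serve`; 3000 for the Docker mapping

def build_chat_request(model, prompt, api_key):
    """Build an OpenAI-style chat-completions request for an Open WebUI server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        f"{BASE_URL}/api/chat/completions",  # OpenAI-compatible route
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # key generated in the UI's settings
        },
    )
    return req, payload

if __name__ == "__main__":
    # "llama3.2" and the key are placeholders; substitute your own values.
    req, _ = build_chat_request("llama3.2", "Hello!", api_key="sk-...")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

The request/response shape follows the standard OpenAI chat format, so existing OpenAI client libraries can usually be pointed at the same base URL instead of hand-rolling requests.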
