
Open WebUI Deployment#

What Was Established#

Open WebUI is deployed via Docker to provide a ChatGPT-like interface for interacting with local Ollama instances. It is configured to connect to the host’s Ollama API.

Key Decisions#

Because the WebUI runs inside a Docker container, localhost:11434 resolves to the container itself, not the host machine, so the host's Ollama API is unreachable at that address. The OLLAMA_BASE_URL must instead point to the host's actual LAN IP or use the host.docker.internal gateway.
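Before deploying, it is worth confirming that Ollama actually answers on the host's LAN address, since by default Ollama binds only to 127.0.0.1 (setting OLLAMA_HOST=0.0.0.0 makes it listen on all interfaces). A quick check from the host, with <YOUR_HOST_IP> standing in for the machine's LAN address:

```shell
# Verify Ollama answers on the LAN address the container will use.
# /api/version is a lightweight endpoint that returns the server version.
curl -s http://<YOUR_HOST_IP>:11434/api/version

# If the above fails but the loopback address works:
curl -s http://127.0.0.1:11434/api/version
# ...then Ollama is bound to loopback only; restart it with
# OLLAMA_HOST=0.0.0.0 so containers can reach it.
```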

Current Configuration#

Docker Deployment#

docker run -d \
  --name open-webui \
  --restart always \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://<YOUR_HOST_IP>:11434 \
  ghcr.io/open-webui/open-webui:main

Note: Replace <YOUR_HOST_IP> with the actual IP of the machine (e.g., 192.168.172.168) to ensure the container can route to the Ollama service.
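Once the container is running, connectivity can be verified from inside it. A minimal sketch, assuming the container name from the command above and that curl is available in the image:

```shell
# Ask Ollama for its installed models from inside the open-webui container.
# host.docker.internal resolves to the host via the --add-host mapping.
docker exec open-webui curl -s http://host.docker.internal:11434/api/tags
# A JSON list of models indicates the WebUI can reach Ollama;
# a connection-refused error points back at the binding issue above.
```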

Related#

Ollama Configuration · Homelab Dashboard · Jellyfin LXC GPU Passthrough & Hardware Acceleration

Sources#

Homelab AI - 2026-04-13 · ingested/chats/Homelab AI - 2026-04-13