@JBGruber
Last active March 10, 2025 01:26
My compose file to run ollama and ollama-webui
services:
  # ollama and API
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    pull_policy: missing
    tty: true
    restart: unless-stopped
    # Expose the Ollama API outside the container stack (but only on the same computer;
    # remove 127.0.0.1: to make Ollama available on your network)
    ports:
      - 127.0.0.1:11434:11434
    volumes:
      - ollama:/root/.ollama
    # GPU support (turn off by commenting out with # if you don't have an NVIDIA GPU)
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities:
                - gpu
  # webui, navigate to http://localhost:3000/ to use
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    pull_policy: missing
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      - "OLLAMA_API_BASE_URL=http://ollama:11434/api"
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}
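To try the stack, save the file above as docker-compose.yml and bring it up. A minimal sketch of a first run follows; the model name is just an example, and the commands assume a Compose v2 installation (`docker compose`) with Docker running:

```shell
# Start both containers in the background
docker compose up -d

# Confirm the Ollama API answers on the host port mapped above
curl http://127.0.0.1:11434/api/version

# Pull a model inside the ollama container (llama3.2 is an example name)
docker compose exec ollama ollama pull llama3.2

# The web UI should then be reachable at http://localhost:3000/
```

Because the compose file binds the API to 127.0.0.1, the curl check only works from the same machine unless you remove that prefix from the port mapping.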
DrJohan commented Mar 10, 2025

Sure! This is how you should do it:

docker-compose down
docker-compose pull --policy "always"
docker-compose up -d

I'm not sure why, but without setting --policy "always", it updates only Ollama, not open-webui.
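(A plausible explanation, going by the compose file above: both services set `pull_policy: missing`, so a plain pull can skip images that already exist locally, while `--policy "always"` forces a fresh pull of every image. If your Compose v2 installation supports the `--pull` flag on `up`, the three commands can likely be collapsed into one; this is a sketch, not tested on every Compose version:)

```shell
# Pull fresh images and recreate the containers in one step (Compose v2)
docker compose up -d --pull always
```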

Thank you so much, @JBGruber. This really helps.
