project-nomad/admin/app
chriscrosstalk 6646b3480b fix(AI): stop local nomad_ollama container when remote Ollama is configured (#744)
When users set a remote Ollama URL via AI Settings, the local nomad_ollama
container kept running and competed with the remote host for port 11434
and GPU access. configureRemote now stops the local container when a remote
URL is set and restarts it when the URL is cleared (if the container is still
present). The container and its models volume are preserved so the local
install can be re-enabled later.

Closes #662

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-21 14:26:28 -07:00
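
A minimal sketch of the behavior the commit above describes, assuming a Node/TypeScript service that shells out to the docker CLI; apart from configureRemote and nomad_ollama, which appear in the commit message, all names here are hypothetical and the real service layer may use a Docker SDK instead.

```typescript
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const exec = promisify(execFile);
const LOCAL_OLLAMA_CONTAINER = "nomad_ollama";

// Returns true if the container exists at all (running or stopped).
async function containerExists(name: string): Promise<boolean> {
  try {
    await exec("docker", ["inspect", "--type", "container", name]);
    return true;
  } catch {
    return false;
  }
}

// Called when the user saves or clears a remote Ollama URL in AI Settings.
// `docker stop` leaves the container and its models volume intact, so the
// local install can be re-enabled later.
export async function configureRemote(remoteUrl: string | null): Promise<void> {
  if (remoteUrl) {
    // Remote host takes over port 11434 and the GPU; stop the local container.
    if (await containerExists(LOCAL_OLLAMA_CONTAINER)) {
      await exec("docker", ["stop", LOCAL_OLLAMA_CONTAINER]);
    }
    // ...persist remoteUrl in settings...
  } else {
    // Remote URL cleared: restore the local container if it is still present.
    if (await containerExists(LOCAL_OLLAMA_CONTAINER)) {
      await exec("docker", ["start", LOCAL_OLLAMA_CONTAINER]);
    }
    // ...clear remoteUrl from settings...
  }
}
```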
controllers fix(AI): stop local nomad_ollama container when remote Ollama is configured (#744) 2026-04-21 14:26:28 -07:00
exceptions fix(Docs): documentation renderer fixes 2025-12-23 16:00:33 -08:00
jobs fix(AI): allow cancelling in-progress model downloads and ensure consistent progress UI (#701) 2026-04-21 14:26:28 -07:00
middleware feat: gzip compression by default for all registered routes 2026-04-03 14:26:50 -07:00
models feat(maps): add scale bar and location markers (#636) 2026-04-03 14:26:50 -07:00
services docs: add Community Add-Ons page with field manuals + W3Schools packs (#753) 2026-04-21 14:26:28 -07:00
utils fix(Downloads): remove duplicate error listener and improve Range request stability 2026-04-21 14:26:28 -07:00
validators fix: block IPv4-mapped IPv6 and IPv6 all-zeros in SSRF check (#520) 2026-04-03 14:26:50 -07:00
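
The validators entry above references blocking IPv4-mapped IPv6 and all-zeros IPv6 in the SSRF check. A simplified sketch of that kind of check, assuming the ipaddr.js library (not confirmed as the project's actual dependency) and a hypothetical function name:

```typescript
import ipaddr from "ipaddr.js";

// Reject IPv6 literals that can smuggle blocked targets past an
// IPv4-oriented SSRF blocklist.
export function isBlockedIPv6(addr: string): boolean {
  if (!ipaddr.isValid(addr)) return false;
  const parsed = ipaddr.parse(addr);
  if (parsed.kind() !== "ipv6") return false;
  const range = parsed.range();
  // "unspecified" covers :: (all zeros, behaves like 0.0.0.0);
  // "ipv4Mapped" covers ::ffff:a.b.c.d, which resolves to an IPv4 host
  // that IPv6-unaware checks would miss.
  return range === "unspecified" || range === "ipv4Mapped";
}
```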