project-nomad/admin/docs
Chris Sherwood c16cfc3a93 fix(GPU): detect NVIDIA GPUs via Docker API instead of lspci
The previous lspci-based GPU detection fails inside Docker containers
because lspci isn't available, causing Ollama to always run CPU-only
even when a GPU + NVIDIA Container Toolkit are present on the host.

Replace with Docker API runtime check (docker.info() -> Runtimes) as
primary detection method. This works from inside any container via the
mounted Docker socket and confirms both GPU presence and toolkit
installation. Keep lspci as fallback for host-based installs and AMD.
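
The primary check described above can be sketched roughly as follows. This is an illustrative sketch, not the actual implementation: the helper name `has_nvidia_runtime` is ours, and the SDK call shown in the comment assumes the `docker` Python SDK; the `Runtimes` mapping is what the Docker daemon itself reports via `docker info` / the API's `/info` endpoint.

```python
def has_nvidia_runtime(docker_info: dict) -> bool:
    """True if the Docker daemon reports the NVIDIA container runtime.

    `docker_info` is the dict returned by the daemon's info endpoint
    (e.g. the docker SDK's `client.info()`); its "Runtimes" key maps
    runtime names to their binary paths.
    """
    runtimes = docker_info.get("Runtimes") or {}
    return "nvidia" in runtimes

# Trimmed example of the relevant slice of daemon info when the
# NVIDIA Container Toolkit is installed on the host:
sample_info = {
    "Runtimes": {
        "nvidia": {"path": "nvidia-container-runtime"},
        "runc": {"path": "runc"},
    }
}

# With the docker SDK over the mounted socket this would be driven as:
#   import docker
#   has_nvidia_runtime(docker.from_env().info())
```

Because the check reads daemon metadata rather than PCI devices, it works identically from inside a container and on the host, which is the point of the change.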

Also add Docker-based GPU detection to benchmark hardware info — exec
nvidia-smi inside the Ollama container to get the actual GPU model name
instead of showing "Not detected".

Tested on nomad3 (Intel Core Ultra 9 285HX + RTX 5060): AI performance
went from 12.7 tok/s (CPU) to 281.4 tok/s (GPU) — a 22x improvement.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-08 15:18:52 -08:00
about.md fix(docs): remove double period after LLC on about page 2026-02-06 14:41:30 -08:00
faq.md fix(GPU): detect NVIDIA GPUs via Docker API instead of lspci 2026-02-08 15:18:52 -08:00
getting-started.md fix(GPU): detect NVIDIA GPUs via Docker API instead of lspci 2026-02-08 15:18:52 -08:00
home.md feat(docs): polish docs rendering with desert-themed components 2026-02-06 14:41:30 -08:00
release-notes.md docs: overhaul in-app documentation and add sidebar ordering 2026-02-06 14:41:30 -08:00
use-cases.md fix(docs): point Wikipedia Selector refs to /settings/zim/remote-explorer 2026-02-06 14:41:30 -08:00