{aiAssistantName}
Manage {aiAssistantName} settings and installed models. We recommend starting with smaller models to see how they perform on your system before moving on to larger ones.
{!isInstalled && (
Connect to any OpenAI-compatible API server — Ollama, LM Studio, llama.cpp, and others are all supported.
For remote Ollama instances, the server must be started with `OLLAMA_HOST=0.0.0.0` so it accepts connections from other machines.
Currently configured: {props.models.settings.remoteOllamaUrl}
)}{remoteOllamaError}
)}Model downloads are only supported when using an Ollama backend. If you are connected to an OpenAI-compatible API host (e.g. LM Studio), download models directly in that application instead.
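As a sketch, exposing a remote Ollama instance and pulling a starter model from the command line might look like the following (the model name is illustrative; pick any model from the Ollama library):

```shell
# On the remote machine: allow connections from other hosts
# (by default Ollama only listens on 127.0.0.1)
OLLAMA_HOST=0.0.0.0 ollama serve

# Then set the remote Ollama URL in this app to the host's address,
# e.g. http://<ollama-host>:11434 (11434 is Ollama's default port)

# On the Ollama host: pull a small model to start with
ollama pull llama3.2:1b
```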
{record.name}
{record.description}
| Tag | Input Type | Context Size | Model Size | Action |
|---|---|---|---|---|
| {tag.name} | {tag.input || 'N/A'} | {tag.context || 'N/A'} | {tag.size || 'N/A'} | |
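To confirm that an OpenAI-compatible server is reachable before entering its URL, a quick check against the standard `/v1/models` endpoint works with Ollama, LM Studio, and llama.cpp alike (the port shown is LM Studio's default; adjust host and port for your setup):

```shell
# A JSON list of models in the response confirms the server speaks the OpenAI API
curl http://localhost:1234/v1/models
```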