Compare commits
No commits in common. "main" and "v1.30.3" have entirely different histories.
`.github/workflows/build-disk-collector.yml` (vendored)

```diff
@@ -35,13 +35,13 @@ jobs:
       - name: Checkout code
         uses: actions/checkout@v6
       - name: Log in to GitHub Container Registry
-        uses: docker/login-action@v4
+        uses: docker/login-action@v2
         with:
           registry: ghcr.io
           username: ${{ github.actor }}
           password: ${{ secrets.GITHUB_TOKEN }}
       - name: Build and push
-        uses: docker/build-push-action@v7
+        uses: docker/build-push-action@v5
         with:
           context: install/sidecar-disk-collector
           push: true
```
`.github/workflows/build-primary-image.yml` (vendored)

```diff
@@ -35,13 +35,13 @@ jobs:
       - name: Checkout code
         uses: actions/checkout@v6
       - name: Log in to GitHub Container Registry
-        uses: docker/login-action@v4
+        uses: docker/login-action@v2
         with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
       - name: Build and push
-        uses: docker/build-push-action@v7
+        uses: docker/build-push-action@v5
         with:
           push: true
           tags: |
```
`.github/workflows/release.yml` (vendored)

```diff
@@ -27,7 +27,7 @@ jobs:
           fetch-depth: 0
           persist-credentials: false
       - name: semantic-release
-        uses: cycjimmy/semantic-release-action@v6
+        uses: cycjimmy/semantic-release-action@v3
        id: semver
        env:
          GITHUB_TOKEN: ${{ secrets.COSMISTACKBOT_ACCESS_TOKEN }}
```
````diff
@@ -74,11 +74,11 @@ Because Nomad relies heavily on Docker, we actually recommend against installing
 1. **Sync with upstream** before starting any new work. We prefer rebasing over merge commits to keep a clean, linear git history as much as possible (this also makes it easier for maintainers to review and merge your changes). To sync with upstream:
    ```bash
    git fetch upstream
    git checkout dev
    git rebase upstream/dev
    git checkout main
    git rebase upstream/main
    ```
 
-2. **Create a feature branch** off `dev` with a descriptive name:
+2. **Create a feature branch** off `main` with a descriptive name:
    ```bash
    git checkout -b fix/issue-123
    # or
````
````diff
@@ -130,7 +130,26 @@ chore(deps): bump docker-compose to v2.24
 
 Human-readable release notes live in [`admin/docs/release-notes.md`](admin/docs/release-notes.md) and are displayed directly in the Command Center UI.
 
-If your PR is merged in, the maintainers will update the release notes with a summary of your contribution and credit you as the author. You do not need to add this yourself in the PR (please don't, as it may cause merge conflicts), but you can include a suggested note in the PR description if you like.
+When your changes include anything user-facing, **add a summary to the `## Unreleased` section** at the top of that file under the appropriate heading:
+
+- **Features** — new user-facing capabilities
+- **Bug Fixes** — corrections to existing behavior
+- **Improvements** — enhancements, refactors, docs, or dependency updates
+
+Use the format `- **Area**: Description` to stay consistent with existing entries.
+
+**Example:**
+```markdown
+## Unreleased
+
+### Features
+- **Maps**: Added support for downloading South America regional maps
+
+### Bug Fixes
+- **AI Chat**: Fixed document upload failing on filenames with special characters
+```
+
+> When a release is triggered, CI automatically stamps the version and date, commits the update, and publishes the content to the GitHub release. You do not need to do this manually.
 
 ---
````
````diff
@@ -146,7 +165,7 @@ This project uses [Semantic Versioning](https://semver.org/). Versions are manag
    ```bash
    git push origin your-branch-name
    ```
-2. Open a pull request against the `dev` branch of this repository
+2. Open a pull request against the `main` branch of this repository
 3. In the PR description:
    - Summarize what your changes do and why
    - Reference the related issue (e.g., `Closes #123`)
````
`FAQ.md`

```diff
@@ -12,12 +12,6 @@ Note: As of 3/24/2026, only the core services defined in the `docker-compose.yml
 
 Yes, you can customize the storage location for NOMAD's content by modifying the `docker-compose.yml` file to adjust the appropriate bind mounts to point to your desired storage location on your host machine. Please refer to the [Advanced Installation](README.md#advanced-installation) section of the README for more details on how to do this.
 
-## Can I store NOMAD's data on an external drive or network storage?
-
-Short answer: yes, but we can't do it for you (and we recommend a local drive for best performance).
-
-Long answer: custom storage paths, mount points, and external drives (like iSCSI or SMB/NFS volumes) **are possible**, but setting them up is part of your individual host configuration before NOMAD starts; the paths are then passed in via the compose.yml, as this is a *host-level concern*, not a NOMAD-level concern (see above for details). NOMAD itself can't configure this for you, nor could we support all possible configurations in the install script.
-
 ## Can I run NOMAD on Mac, WSL2, or a non-Debian-based distro?
 
 See [Why does NOMAD require a Debian-based OS?](#why-does-nomad-require-a-debian-based-os)
@@ -65,10 +59,6 @@ All of NOMAD's containers are prefixed with `nomad_` in their names, so they can
 
 See [What technologies is NOMAD built with?](#what-technologies-is-nomad-built-with)
 
-## Can I use any AI models?
-By default, NOMAD uses Ollama inside a Docker container to run LLM models for the AI Assistant, so a model you find on HuggingFace, for example, won't be usable in NOMAD. The list of available models in the AI Assistant settings (/settings/models) may not show all of the models you are looking for. If you found a model on https://ollama.com/search that you'd like to try and it's not in the settings page, you can download it with a curl command:
-`curl -X POST -H "Content-Type: application/json" -d '{"model":"MODEL_NAME_HERE"}' http://localhost:8080/api/ollama/models` replacing MODEL_NAME_HERE with the model name from the Ollama website.
-
 ## Do I have to install the AI features in NOMAD?
 
 No, the AI features in NOMAD (Ollama, Qdrant, custom RAG pipeline, etc.) are all optional and not required to use the core functionality of NOMAD.
```
`README.md`

```diff
@@ -1,5 +1,5 @@
 <div align="center">
-  <img src="admin/public/project_nomad_logo.webp" width="200" height="200"/>
+  <img src="https://raw.githubusercontent.com/Crosstalk-Solutions/project-nomad/refs/heads/main/admin/public/project_nomad_logo.png" width="200" height="200"/>
 
 # Project N.O.M.A.D.
 ### Node for Offline Media, Archives, and Data
```
````diff
@@ -23,11 +23,7 @@ Project N.O.M.A.D. can be installed on any Debian-based operating system (we rec
 
 ### Quick Install (Debian-based OS Only)
 ```bash
-sudo apt-get update && \
-  sudo apt-get install -y curl && \
-  curl -fsSL https://raw.githubusercontent.com/Crosstalk-Solutions/project-nomad/refs/heads/main/install/install_nomad.sh \
-    -o install_nomad.sh && \
-  sudo bash install_nomad.sh
+sudo apt-get update && sudo apt-get install -y curl && curl -fsSL https://raw.githubusercontent.com/Crosstalk-Solutions/project-nomad/refs/heads/main/install/install_nomad.sh -o install_nomad.sh && sudo bash install_nomad.sh
 ```
 
 Project N.O.M.A.D. is now installed on your device! Open a browser and navigate to `http://localhost:8080` (or `http://DEVICE_IP:8080`) to start exploring!
````
```diff
@@ -41,7 +37,7 @@ For more control over the installation process, copy and paste the [Docker Compo
 N.O.M.A.D. is a management UI ("Command Center") and API that orchestrates a collection of containerized tools and resources via [Docker](https://www.docker.com/). It handles installation, configuration, and updates for everything — so you don't have to.
 
 **Built-in capabilities include:**
-- **AI Chat with Knowledge Base** — local AI chat powered by [Ollama](https://ollama.com/) or any OpenAI-API-compatible server such as LM Studio or llama.cpp, with document upload and semantic search (RAG via [Qdrant](https://qdrant.tech/))
+- **AI Chat with Knowledge Base** — local AI chat powered by [Ollama](https://ollama.com/), with document upload and semantic search (RAG via [Qdrant](https://qdrant.tech/))
 - **Information Library** — offline Wikipedia, medical references, ebooks, and more via [Kiwix](https://kiwix.org/)
 - **Education Platform** — Khan Academy courses with progress tracking via [Kolibri](https://learningequality.org/kolibri/)
 - **Offline Maps** — downloadable regional maps via [ProtoMaps](https://protomaps.com)
```
```diff
@@ -93,12 +89,6 @@ To run LLM's and other included AI tools:
 
 Again, Project N.O.M.A.D. itself is quite lightweight - it's the tools and resources you choose to install with N.O.M.A.D. that will determine the specs required for your unique deployment.
 
-#### Running AI models on a different host
-By default, N.O.M.A.D.'s installer will attempt to set up Ollama on the host when the AI Assistant is installed. However, if you would like to run the AI model on a different host, you can go to the AI assistant's settings and input a URL for either an Ollama or OpenAI-compatible API server (such as LM Studio).
-Note that if you use Ollama on a different host, you must start the server with OLLAMA_HOST=0.0.0.0.
-Ollama is the preferred backend for the AI assistant, as it supports features such as model downloads that the OpenAI API does not; when using LM Studio, for example, you will have to download models through LM Studio itself.
-You are responsible for the setup of the Ollama/OpenAI server on the other host.
-
 ## Frequently Asked Questions (FAQ)
 For answers to common questions about Project N.O.M.A.D., please see our [FAQ](FAQ.md) page.
```
```diff
@@ -53,8 +53,7 @@ export default defineConfig({
     () => import('@adonisjs/lucid/database_provider'),
     () => import('@adonisjs/inertia/inertia_provider'),
-    () => import('@adonisjs/transmit/transmit_provider'),
-    () => import('#providers/map_static_provider'),
-    () => import('#providers/kiwix_migration_provider'),
+    () => import('#providers/map_static_provider')
   ],
 
  /*
```
```diff
@@ -20,8 +20,4 @@ export default class DownloadsController {
     await this.downloadService.removeFailedJob(params.jobId)
     return { success: true }
   }
-
-  async cancelJob({ params }: HttpContext) {
-    return this.downloadService.cancelJob(params.jobId)
-  }
 }
```
```diff
@@ -1,7 +1,6 @@
 import { SystemService } from '#services/system_service'
 import { ZimService } from '#services/zim_service'
 import { CollectionManifestService } from '#services/collection_manifest_service'
-import KVStore from '#models/kv_store'
 import { inject } from '@adonisjs/core'
 import type { HttpContext } from '@adonisjs/core/http'
 
@@ -13,14 +12,10 @@ export default class EasySetupController {
   ) {}
 
   async index({ inertia }: HttpContext) {
-    const [services, remoteOllamaUrl] = await Promise.all([
-      this.systemService.getServices({ installedOnly: false }),
-      KVStore.getValue('ai.remoteOllamaUrl'),
-    ])
+    const services = await this.systemService.getServices({ installedOnly: false })
     return inertia.render('easy-setup/index', {
       system: {
         services: services,
-        remoteOllamaUrl: remoteOllamaUrl ?? '',
       },
     })
   }
```
```diff
@@ -1,5 +1,4 @@
 import { MapService } from '#services/map_service'
-import MapMarker from '#models/map_marker'
 import {
   assertNotPrivateUrl,
   downloadCollectionValidator,
```
```diff
@@ -9,7 +8,6 @@ import {
 } from '#validators/common'
 import { inject } from '@adonisjs/core'
 import type { HttpContext } from '@adonisjs/core/http'
-import vine from '@vinejs/vine'
 
 @inject()
 export default class MapsController {
```
```diff
@@ -75,18 +73,6 @@ export default class MapsController {
     return await this.mapService.listRegions()
   }
 
-  async globalMapInfo({}: HttpContext) {
-    return await this.mapService.getGlobalMapInfo()
-  }
-
-  async downloadGlobalMap({}: HttpContext) {
-    const result = await this.mapService.downloadGlobalMap()
-    return {
-      message: 'Download started successfully',
-      ...result,
-    }
-  }
-
   async styles({ request, response }: HttpContext) {
     // Automatically ensure base assets are present before generating styles
     const baseAssetsExist = await this.mapService.ensureBaseAssets()
```
```diff
@@ -97,13 +83,7 @@ export default class MapsController {
       })
     }
 
-    const forwardedProto = request.headers()['x-forwarded-proto'];
-
-    const protocol: string = forwardedProto
-      ? (typeof forwardedProto === 'string' ? forwardedProto : request.protocol())
-      : request.protocol();
-
-    const styles = await this.mapService.generateStylesJSON(request.host(), protocol)
+    const styles = await this.mapService.generateStylesJSON(request.host(), request.protocol())
     return response.json(styles)
   }
```
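The protocol-resolution logic removed in this hunk reduces to a small pure helper that is easy to test in isolation. This is an illustrative sketch, not code from either branch; `resolveProtocol` is a hypothetical name:

```typescript
// Hypothetical helper mirroring the x-forwarded-proto handling above:
// use the forwarded protocol when the header is present as a plain string
// (reverse proxies set it), otherwise fall back to the transport protocol.
function resolveProtocol(
  forwardedProto: string | string[] | undefined,
  transportProtocol: string
): string {
  return forwardedProto
    ? typeof forwardedProto === 'string'
      ? forwardedProto
      : transportProtocol
    : transportProtocol
}
```

Keeping this as a pure function (header value in, protocol out) avoids depending on the request object in tests.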
```diff
@@ -125,60 +105,4 @@ export default class MapsController {
       message: 'Map file deleted successfully',
     }
   }
-
-  // --- Map Markers ---
-
-  async listMarkers({}: HttpContext) {
-    return await MapMarker.query().orderBy('created_at', 'asc')
-  }
-
-  async createMarker({ request }: HttpContext) {
-    const payload = await request.validateUsing(
-      vine.compile(
-        vine.object({
-          name: vine.string().trim().minLength(1).maxLength(255),
-          longitude: vine.number(),
-          latitude: vine.number(),
-          color: vine.string().trim().maxLength(20).optional(),
-        })
-      )
-    )
-    const marker = await MapMarker.create({
-      name: payload.name,
-      longitude: payload.longitude,
-      latitude: payload.latitude,
-      color: payload.color ?? 'orange',
-    })
-    return marker
-  }
-
-  async updateMarker({ request, response }: HttpContext) {
-    const { id } = request.params()
-    const marker = await MapMarker.find(id)
-    if (!marker) {
-      return response.status(404).send({ message: 'Marker not found' })
-    }
-    const payload = await request.validateUsing(
-      vine.compile(
-        vine.object({
-          name: vine.string().trim().minLength(1).maxLength(255).optional(),
-          color: vine.string().trim().maxLength(20).optional(),
-        })
-      )
-    )
-    if (payload.name !== undefined) marker.name = payload.name
-    if (payload.color !== undefined) marker.color = payload.color
-    await marker.save()
-    return marker
-  }
-
-  async deleteMarker({ request, response }: HttpContext) {
-    const { id } = request.params()
-    const marker = await MapMarker.find(id)
-    if (!marker) {
-      return response.status(404).send({ message: 'Marker not found' })
-    }
-    await marker.delete()
-    return { message: 'Marker deleted' }
-  }
 }
```
```diff
@@ -1,23 +1,18 @@
 import { ChatService } from '#services/chat_service'
-import { DockerService } from '#services/docker_service'
 import { OllamaService } from '#services/ollama_service'
 import { RagService } from '#services/rag_service'
-import Service from '#models/service'
-import KVStore from '#models/kv_store'
 import { modelNameSchema } from '#validators/download'
 import { chatSchema, getAvailableModelsSchema } from '#validators/ollama'
 import { inject } from '@adonisjs/core'
 import type { HttpContext } from '@adonisjs/core/http'
 import { DEFAULT_QUERY_REWRITE_MODEL, RAG_CONTEXT_LIMITS, SYSTEM_PROMPTS } from '../../constants/ollama.js'
-import { SERVICE_NAMES } from '../../constants/service_names.js'
 import logger from '@adonisjs/core/services/logger'
-type Message = { role: 'system' | 'user' | 'assistant'; content: string }
+import type { Message } from 'ollama'
 
 @inject()
 export default class OllamaController {
   constructor(
     private chatService: ChatService,
-    private dockerService: DockerService,
     private ollamaService: OllamaService,
     private ragService: RagService
   ) { }
```
```diff
@@ -77,10 +72,10 @@ export default class OllamaController {
     const { maxResults, maxTokens } = this.getContextLimitsForModel(reqData.model)
     let trimmedDocs = relevantDocs.slice(0, maxResults)
 
-    // Apply token cap if set (estimate ~3.5 chars per token)
+    // Apply token cap if set (estimate ~4 chars per token)
     // Always include the first (most relevant) result — the cap only gates subsequent results
     if (maxTokens > 0) {
-      const charCap = maxTokens * 3.5
+      const charCap = maxTokens * 4
       let totalChars = 0
       trimmedDocs = trimmedDocs.filter((doc, idx) => {
         totalChars += doc.text.length
```
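The character-based token cap being retuned in this hunk (3.5 vs. 4 chars per token on the two branches) can be sketched as a standalone function. Names and the `Doc` shape below are illustrative, not from either branch:

```typescript
// Illustrative sketch of the char-based token cap above. The first (most
// relevant) doc is always kept; later docs are dropped once the running
// character total exceeds maxTokens * charsPerToken.
type Doc = { text: string }

function capDocsByTokens(docs: Doc[], maxTokens: number, charsPerToken = 3.5): Doc[] {
  if (maxTokens <= 0) return docs // cap disabled
  const charCap = maxTokens * charsPerToken
  let totalChars = 0
  return docs.filter((doc, idx) => {
    totalChars += doc.text.length
    return idx === 0 || totalChars <= charCap
  })
}
```

Note that the running total includes the characters of docs that end up dropped, matching the filter shown in the diff; a stricter cap would only count kept docs.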
```diff
@@ -108,19 +103,6 @@ export default class OllamaController {
       }
     }
 
-    // If system messages are large (e.g. due to RAG context), request a context window big
-    // enough to fit them. Ollama respects num_ctx per-request; LM Studio ignores it gracefully.
-    const systemChars = reqData.messages
-      .filter((m) => m.role === 'system')
-      .reduce((sum, m) => sum + m.content.length, 0)
-    const estimatedSystemTokens = Math.ceil(systemChars / 3.5)
-    let numCtx: number | undefined
-    if (estimatedSystemTokens > 3000) {
-      const needed = estimatedSystemTokens + 2048 // leave room for conversation + response
-      numCtx = [8192, 16384, 32768, 65536].find((n) => n >= needed) ?? 65536
-      logger.debug(`[OllamaController] Large system prompt (~${estimatedSystemTokens} tokens), requesting num_ctx: ${numCtx}`)
-    }
-
     // Check if the model supports "thinking" capability for enhanced response generation
     // If gpt-oss model, it requires a text param for "think" https://docs.ollama.com/api/chat
     const thinkingCapability = await this.ollamaService.checkModelHasThinking(reqData.model)
```
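The `num_ctx` sizing block removed in this hunk follows a simple "estimate, then round up to a standard window" rule. A standalone sketch using the same constants (function and parameter names are hypothetical):

```typescript
// Illustrative sketch of the num_ctx selection above: estimate system-prompt
// tokens at ~3.5 chars/token; if the estimate exceeds the threshold, pick the
// smallest standard window that fits the prompt plus response headroom.
const CTX_SIZES = [8192, 16384, 32768, 65536]

function pickNumCtx(systemChars: number, threshold = 3000, headroom = 2048): number | undefined {
  const estimatedTokens = Math.ceil(systemChars / 3.5)
  if (estimatedTokens <= threshold) return undefined // default window is enough
  const needed = estimatedTokens + headroom
  return CTX_SIZES.find((n) => n >= needed) ?? CTX_SIZES[CTX_SIZES.length - 1]
}
```

Returning `undefined` when the prompt is small lets the caller omit `num_ctx` entirely and leave the server's default in place.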
```diff
@@ -142,7 +124,7 @@ export default class OllamaController {
     if (reqData.stream) {
       logger.debug(`[OllamaController] Initiating streaming response for model: "${reqData.model}" with think: ${think}`)
       // Headers already flushed above
-      const stream = await this.ollamaService.chatStream({ ...ollamaRequest, think, numCtx })
+      const stream = await this.ollamaService.chatStream({ ...ollamaRequest, think })
       let fullContent = ''
       for await (const chunk of stream) {
         if (chunk.message?.content) {
```
```diff
@@ -166,7 +148,7 @@ export default class OllamaController {
     }
 
     // Non-streaming (legacy) path
-    const result = await this.ollamaService.chat({ ...ollamaRequest, think, numCtx })
+    const result = await this.ollamaService.chat({ ...ollamaRequest, think })
 
     if (sessionId && result?.message?.content) {
       await this.chatService.addMessage(sessionId, 'assistant', result.message.content)
```
```diff
@@ -189,87 +171,6 @@ export default class OllamaController {
     }
   }
 
-  async remoteStatus() {
-    const remoteUrl = await KVStore.getValue('ai.remoteOllamaUrl')
-    if (!remoteUrl) {
-      return { configured: false, connected: false }
-    }
-    try {
-      const testResponse = await fetch(`${remoteUrl.replace(/\/$/, '')}/v1/models`, {
-        signal: AbortSignal.timeout(3000),
-      })
-      return { configured: true, connected: testResponse.ok }
-    } catch {
-      return { configured: true, connected: false }
-    }
-  }
-
-  async configureRemote({ request, response }: HttpContext) {
-    const remoteUrl: string | null = request.input('remoteUrl', null)
-
-    const ollamaService = await Service.query().where('service_name', SERVICE_NAMES.OLLAMA).first()
-    if (!ollamaService) {
-      return response.status(404).send({ success: false, message: 'Ollama service record not found.' })
-    }
-
-    // Clear path: null or empty URL removes remote config and marks service as not installed
-    if (!remoteUrl || remoteUrl.trim() === '') {
-      await KVStore.clearValue('ai.remoteOllamaUrl')
-      ollamaService.installed = false
-      ollamaService.installation_status = 'idle'
-      await ollamaService.save()
-      return { success: true, message: 'Remote Ollama configuration cleared.' }
-    }
-
-    // Validate URL format
-    if (!remoteUrl.startsWith('http')) {
-      return response.status(400).send({
-        success: false,
-        message: 'Invalid URL. Must start with http:// or https://',
-      })
-    }
-
-    // Test connectivity via OpenAI-compatible /v1/models endpoint (works with Ollama, LM Studio, llama.cpp, etc.)
-    try {
-      const testResponse = await fetch(`${remoteUrl.replace(/\/$/, '')}/v1/models`, {
-        signal: AbortSignal.timeout(5000),
-      })
-      if (!testResponse.ok) {
-        return response.status(400).send({
-          success: false,
-          message: `Could not connect to ${remoteUrl} (HTTP ${testResponse.status}). Make sure the server is running and accessible. For Ollama, start it with OLLAMA_HOST=0.0.0.0.`,
-        })
-      }
-    } catch (error) {
-      return response.status(400).send({
-        success: false,
-        message: `Could not connect to ${remoteUrl}. Make sure the server is running and reachable. For Ollama, start it with OLLAMA_HOST=0.0.0.0.`,
-      })
-    }
-
-    // Save remote URL and mark service as installed
-    await KVStore.setValue('ai.remoteOllamaUrl', remoteUrl.trim())
-    ollamaService.installed = true
-    ollamaService.installation_status = 'idle'
-    await ollamaService.save()
-
-    // Install Qdrant if not already installed (fire-and-forget)
-    const qdrantService = await Service.query().where('service_name', SERVICE_NAMES.QDRANT).first()
-    if (qdrantService && !qdrantService.installed) {
-      this.dockerService.createContainerPreflight(SERVICE_NAMES.QDRANT).catch((error) => {
-        logger.error('[OllamaController] Failed to start Qdrant preflight:', error)
-      })
-    }
-
-    // Mirror post-install side effects: disable suggestions, trigger docs discovery
-    await KVStore.setValue('chat.suggestionsEnabled', false)
-    this.ragService.discoverNomadDocs().catch((error) => {
-      logger.error('[OllamaController] Failed to discover Nomad docs:', error)
-    })
-
-    return { success: true, message: 'Remote Ollama configured.' }
-  }
-
   async deleteModel({ request }: HttpContext) {
     const reqData = await request.validateUsing(modelNameSchema)
     await this.ollamaService.deleteModel(reqData.model)
```
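The connectivity probe in the removed `remoteStatus`/`configureRemote` methods boils down to two small pure pieces that are easy to test in isolation. This is a hypothetical sketch (the network call itself is omitted, and both function names are illustrative):

```typescript
// Illustrative sketch of the URL handling in the removed methods above:
// normalize a trailing slash and target the OpenAI-compatible /v1/models
// endpoint (served by Ollama, LM Studio, llama.cpp, etc.).
function modelsEndpoint(remoteUrl: string): string {
  return `${remoteUrl.replace(/\/$/, '')}/v1/models`
}

// Mirrors the validation above: a usable remote URL is non-empty and http(s).
function isValidRemoteUrl(remoteUrl: string | null): boolean {
  return !!remoteUrl && remoteUrl.trim() !== '' && remoteUrl.startsWith('http')
}
```

In the controller these would feed a `fetch` guarded by `AbortSignal.timeout(...)`, as the removed code shows.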
```diff
@@ -74,19 +74,6 @@ export default class RagController {
     return response.status(200).json({ message: result.message })
   }
 
-  public async getFailedJobs({ response }: HttpContext) {
-    const jobs = await EmbedFileJob.listFailedJobs()
-    return response.status(200).json(jobs)
-  }
-
-  public async cleanupFailedJobs({ response }: HttpContext) {
-    const result = await EmbedFileJob.cleanupFailedJobs()
-    return response.status(200).json({
-      message: `Cleaned up ${result.cleaned} failed job${result.cleaned !== 1 ? 's' : ''}${result.filesDeleted > 0 ? `, deleted ${result.filesDeleted} file${result.filesDeleted !== 1 ? 's' : ''}` : ''}.`,
-      ...result,
-    })
-  }
 
   public async scanAndSync({ response }: HttpContext) {
     try {
       const syncResult = await this.ragService.scanAndSyncStorage()
```
```diff
@@ -1,11 +1,12 @@
-import KVStore from '#models/kv_store'
-import { BenchmarkService } from '#services/benchmark_service'
-import { MapService } from '#services/map_service'
-import { OllamaService } from '#services/ollama_service'
-import { SystemService } from '#services/system_service'
-import { getSettingSchema, updateSettingSchema } from '#validators/settings'
-import { inject } from '@adonisjs/core'
+import KVStore from '#models/kv_store';
+import { BenchmarkService } from '#services/benchmark_service';
+import { MapService } from '#services/map_service';
+import { OllamaService } from '#services/ollama_service';
+import { SystemService } from '#services/system_service';
+import { updateSettingSchema } from '#validators/settings';
+import { inject } from '@adonisjs/core';
 import type { HttpContext } from '@adonisjs/core/http'
+import type { KVStoreKey } from '../../types/kv_store.js';
 
 @inject()
 export default class SettingsController {
```
```diff
@@ -17,54 +18,47 @@ export default class SettingsController {
   ) { }
 
   async system({ inertia }: HttpContext) {
-    const systemInfo = await this.systemService.getSystemInfo()
+    const systemInfo = await this.systemService.getSystemInfo();
     return inertia.render('settings/system', {
       system: {
-        info: systemInfo,
-      },
-    })
+        info: systemInfo
+      }
+    });
   }
 
   async apps({ inertia }: HttpContext) {
-    const services = await this.systemService.getServices({ installedOnly: false })
+    const services = await this.systemService.getServices({ installedOnly: false });
     return inertia.render('settings/apps', {
       system: {
-        services,
-      },
-    })
+        services
+      }
+    });
   }
 
   async legal({ inertia }: HttpContext) {
-    return inertia.render('settings/legal')
+    return inertia.render('settings/legal');
   }
 
   async support({ inertia }: HttpContext) {
-    return inertia.render('settings/support')
+    return inertia.render('settings/support');
   }
 
   async maps({ inertia }: HttpContext) {
-    const baseAssetsCheck = await this.mapService.ensureBaseAssets()
-    const regionFiles = await this.mapService.listRegions()
+    const baseAssetsCheck = await this.mapService.ensureBaseAssets();
+    const regionFiles = await this.mapService.listRegions();
     return inertia.render('settings/maps', {
       maps: {
         baseAssetsExist: baseAssetsCheck,
-        regionFiles: regionFiles.files,
-      },
-    })
+        regionFiles: regionFiles.files
+      }
+    });
   }
 
   async models({ inertia }: HttpContext) {
-    const availableModels = await this.ollamaService.getAvailableModels({
-      sort: 'pulls',
-      recommendedOnly: false,
-      query: null,
-      limit: 15,
-    })
-    const installedModels = await this.ollamaService.getModels().catch(() => [])
+    const availableModels = await this.ollamaService.getAvailableModels({ sort: 'pulls', recommendedOnly: false, query: null, limit: 15 });
+    const installedModels = await this.ollamaService.getModels();
     const chatSuggestionsEnabled = await KVStore.getValue('chat.suggestionsEnabled')
     const aiAssistantCustomName = await KVStore.getValue('ai.assistantCustomName')
     const remoteOllamaUrl = await KVStore.getValue('ai.remoteOllamaUrl')
     const ollamaFlashAttention = await KVStore.getValue('ai.ollamaFlashAttention')
     return inertia.render('settings/models', {
       models: {
         availableModels: availableModels?.models || [],
```
```diff
@@ -72,22 +66,20 @@ export default class SettingsController {
       settings: {
         chatSuggestionsEnabled: chatSuggestionsEnabled ?? false,
         aiAssistantCustomName: aiAssistantCustomName ?? '',
         remoteOllamaUrl: remoteOllamaUrl ?? '',
         ollamaFlashAttention: ollamaFlashAttention ?? true,
-        },
-      },
-    })
+        }
+      }
+    });
   }
 
   async update({ inertia }: HttpContext) {
-    const updateInfo = await this.systemService.checkLatestVersion()
+    const updateInfo = await this.systemService.checkLatestVersion();
     return inertia.render('settings/update', {
       system: {
         updateAvailable: updateInfo.updateAvailable,
         latestVersion: updateInfo.latestVersion,
-        currentVersion: updateInfo.currentVersion,
-      },
-    })
+        currentVersion: updateInfo.currentVersion
+      }
+    });
   }
 
   async zim({ inertia }: HttpContext) {
```
```diff
@@ -95,30 +87,30 @@ export default class SettingsController {
   }
 
   async zimRemote({ inertia }: HttpContext) {
-    return inertia.render('settings/zim/remote-explorer')
+    return inertia.render('settings/zim/remote-explorer');
   }
 
   async benchmark({ inertia }: HttpContext) {
-    const latestResult = await this.benchmarkService.getLatestResult()
-    const status = this.benchmarkService.getStatus()
+    const latestResult = await this.benchmarkService.getLatestResult();
+    const status = this.benchmarkService.getStatus();
     return inertia.render('settings/benchmark', {
       benchmark: {
         latestResult,
         status: status.status,
-        currentBenchmarkId: status.benchmarkId,
-      },
-    })
+        currentBenchmarkId: status.benchmarkId
+      }
+    });
   }
 
   async getSetting({ request, response }: HttpContext) {
-    const { key } = await getSettingSchema.validate({ key: request.qs().key });
-    const value = await KVStore.getValue(key);
+    const key = request.qs().key;
+    const value = await KVStore.getValue(key as KVStoreKey);
     return response.status(200).send({ key, value });
   }
 
   async updateSetting({ request, response }: HttpContext) {
-    const reqData = await request.validateUsing(updateSettingSchema)
-    await this.systemService.updateSetting(reqData.key, reqData.value)
-    return response.status(200).send({ success: true, message: 'Setting updated successfully' })
+    const reqData = await request.validateUsing(updateSettingSchema);
+    await this.systemService.updateSetting(reqData.key, reqData.value);
+    return response.status(200).send({ success: true, message: 'Setting updated successfully' });
   }
 }
```
```diff
@@ -27,7 +27,7 @@ export default class ZimController {
   async downloadRemote({ request }: HttpContext) {
     const payload = await request.validateUsing(remoteDownloadWithMetadataValidator)
     assertNotPrivateUrl(payload.url)
-    const { filename, jobId } = await this.zimService.downloadRemote(payload.url, payload.metadata)
+    const { filename, jobId } = await this.zimService.downloadRemote(payload.url)
 
     return {
       message: 'Download started successfully',
```
```diff
@@ -44,9 +44,7 @@ export class DownloadModelJob {
     // Services are ready, initiate the download with progress tracking
     const result = await ollamaService.downloadModel(modelName, (progressPercent) => {
       if (progressPercent) {
-        job.updateProgress(Math.floor(progressPercent)).catch((err) => {
-          if (err?.code !== -1) throw err
-        })
+        job.updateProgress(Math.floor(progressPercent))
         logger.info(
           `[DownloadModelJob] Model ${modelName}: ${progressPercent}%`
         )
@@ -58,8 +56,6 @@ export class DownloadModelJob {
           status: 'downloading',
           progress: progressPercent,
           progress_timestamp: new Date().toISOString(),
-        }).catch((err) => {
-          if (err?.code !== -1) throw err
-        })
+        })
       })
```
@@ -6,7 +6,6 @@ import { DockerService } from '#services/docker_service'
import { OllamaService } from '#services/ollama_service'
import { createHash } from 'crypto'
import logger from '@adonisjs/core/services/logger'
import fs from 'node:fs/promises'

export interface EmbedFileJobParams {
  filePath: string
@@ -31,17 +30,6 @@ export class EmbedFileJob {
    return createHash('sha256').update(filePath).digest('hex').slice(0, 16)
  }

  /** Calls job.updateProgress but silently ignores "Missing key" errors (code -1),
   * which occur when the job has been removed from Redis (e.g. cancelled externally)
   * between the time the await was issued and the Redis write completed. */
  private async safeUpdateProgress(job: Job, progress: number): Promise<void> {
    try {
      await job.updateProgress(progress)
    } catch (err: any) {
      if (err?.code !== -1) throw err
    }
  }

  async handle(job: Job) {
    const { filePath, fileName, batchOffset, totalArticles } = job.data as EmbedFileJobParams

@@ -78,7 +66,7 @@ export class EmbedFileJob {
    logger.info(`[EmbedFileJob] Services ready. Processing file: ${fileName}`)

    // Update progress starting
    await this.safeUpdateProgress(job, 5)
    await job.updateProgress(5)
    await job.updateData({
      ...job.data,
      status: 'processing',

@@ -89,7 +77,7 @@ export class EmbedFileJob {

    // Progress callback: maps service-reported 0-100% into the 5-95% job range
    const onProgress = async (percent: number) => {
      await this.safeUpdateProgress(job, Math.min(95, Math.round(5 + percent * 0.9)))
      await job.updateProgress(Math.min(95, Math.round(5 + percent * 0.9)))
    }

    // Process and embed the file

@@ -128,7 +116,7 @@ export class EmbedFileJob {
        ? Math.round((nextOffset / totalArticles) * 100)
        : 50

      await this.safeUpdateProgress(job, progress)
      await job.updateProgress(progress)
      await job.updateData({
        ...job.data,
        status: 'batch_completed',

@@ -149,7 +137,7 @@ export class EmbedFileJob {

    // Final batch or non-batched file - mark as complete
    const totalChunks = (job.data.chunks || 0) + (result.chunks || 0)
    await this.safeUpdateProgress(job, 100)
    await job.updateProgress(100)
    await job.updateData({
      ...job.data,
      status: 'completed',
@@ -244,52 +232,6 @@ export class EmbedFileJob {
    }
  }

  static async listFailedJobs(): Promise<EmbedJobWithProgress[]> {
    const queueService = new QueueService()
    const queue = queueService.getQueue(this.queue)
    // Jobs that have failed at least once are in 'delayed' (retrying) or terminal 'failed' state.
    // We identify them by job.data.status === 'failed' set in the catch block of handle().
    const jobs = await queue.getJobs(['waiting', 'delayed', 'failed'])

    return jobs
      .filter((job) => (job.data as any).status === 'failed')
      .map((job) => ({
        jobId: job.id!.toString(),
        fileName: (job.data as EmbedFileJobParams).fileName,
        filePath: (job.data as EmbedFileJobParams).filePath,
        progress: 0,
        status: 'failed',
        error: (job.data as any).error,
      }))
  }

  static async cleanupFailedJobs(): Promise<{ cleaned: number; filesDeleted: number }> {
    const queueService = new QueueService()
    const queue = queueService.getQueue(this.queue)
    const allJobs = await queue.getJobs(['waiting', 'delayed', 'failed'])
    const failedJobs = allJobs.filter((job) => (job.data as any).status === 'failed')

    let cleaned = 0
    let filesDeleted = 0

    for (const job of failedJobs) {
      const filePath = (job.data as EmbedFileJobParams).filePath
      if (filePath && filePath.includes(RagService.UPLOADS_STORAGE_PATH)) {
        try {
          await fs.unlink(filePath)
          filesDeleted++
        } catch {
          // File may already be deleted — that's fine
        }
      }
      await job.remove()
      cleaned++
    }

    logger.info(`[EmbedFileJob] Cleaned up ${cleaned} failed jobs, deleted ${filesDeleted} files`)
    return { cleaned, filesDeleted }
  }

  static async getStatus(filePath: string): Promise<{
    exists: boolean
    status?: string
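The `safeUpdateProgress` helper removed in this release wraps BullMQ's `job.updateProgress` so that only the "job key missing" error (code `-1`) raised by a racing cancellation is swallowed, while every other failure still propagates. A minimal self-contained sketch of that pattern, assuming illustrative error shapes and a hypothetical `ProgressWriter` type rather than the project's real API:

```typescript
// A writer stands in for job.updateProgress (hypothetical type for illustration).
type ProgressWriter = (progress: number) => Promise<void>;

// Swallow only the specific error a racing cancellation produces (code -1);
// rethrow everything else so real failures are not hidden.
async function safeUpdate(write: ProgressWriter, progress: number): Promise<void> {
  try {
    await write(progress);
  } catch (err: any) {
    if (err?.code !== -1) throw err; // unexpected errors still propagate
  }
}

// A writer that fails like a cancelled job (code -1) is silently ignored…
const cancelled: ProgressWriter = async () => {
  throw Object.assign(new Error('Missing key for job'), { code: -1 });
};

// …while any other failure is rethrown to the caller.
const broken: ProgressWriter = async () => {
  throw Object.assign(new Error('connection reset'), { code: 'ECONNRESET' });
};
```

The same idea appears inline throughout the diff as `.catch((err) => { if (err?.code !== -1) throw err })` wherever `updateProgress` is fired without awaiting it.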
@@ -1,5 +1,5 @@
import { Job, UnrecoverableError } from 'bullmq'
import { RunDownloadJobParams, DownloadProgressData } from '../../types/downloads.js'
import { Job } from 'bullmq'
import { RunDownloadJobParams } from '../../types/downloads.js'
import { QueueService } from '#services/queue_service'
import { doResumableDownload } from '../utils/downloads.js'
import { createHash } from 'crypto'
@@ -17,85 +17,23 @@ export class RunDownloadJob {
    return 'run-download'
  }

  /** In-memory registry of abort controllers for active download jobs */
  static abortControllers: Map<string, AbortController> = new Map()

  static getJobId(url: string): string {
    return createHash('sha256').update(url).digest('hex').slice(0, 16)
  }

  /** Redis key used to signal cancellation across processes */
  static cancelKey(jobId: string): string {
    return `nomad:download:cancel:${jobId}`
  }

  /** Signal cancellation via Redis so the worker process can pick it up */
  static async signalCancel(jobId: string): Promise<void> {
    const queueService = new QueueService()
    const queue = queueService.getQueue(this.queue)
    const client = await queue.client
    await client.set(this.cancelKey(jobId), '1', 'EX', 300) // 5 min TTL
  }

  async handle(job: Job) {
    const { url, filepath, timeout, allowedMimeTypes, forceNew, filetype, resourceMetadata } =
      job.data as RunDownloadJobParams

    // Register abort controller for this job
    const abortController = new AbortController()
    RunDownloadJob.abortControllers.set(job.id!, abortController)

    // Get Redis client for checking cancel signals from the API process
    const queueService = new QueueService()
    const cancelRedis = await queueService.getQueue(RunDownloadJob.queue).client

    let lastKnownProgress: Pick<DownloadProgressData, 'downloadedBytes' | 'totalBytes'> = {
      downloadedBytes: 0,
      totalBytes: 0,
    }

    // Track whether cancellation was explicitly requested by the user (via Redis signal
    // or in-process AbortController). BullMQ lock mismatches can also abort the download
    // stream, but those should be retried — only user-initiated cancels are unrecoverable.
    let userCancelled = false

    // Poll Redis for cancel signal every 2s — independent of progress events so cancellation
    // works even when the stream is stalled and no onProgress ticks are firing.
    let cancelPollInterval: ReturnType<typeof setInterval> | null = setInterval(async () => {
      try {
        const val = await cancelRedis.get(RunDownloadJob.cancelKey(job.id!))
        if (val) {
          await cancelRedis.del(RunDownloadJob.cancelKey(job.id!))
          userCancelled = true
          abortController.abort('user-cancel')
        }
      } catch {
        // Redis errors are non-fatal; in-process AbortController covers same-process cancels
      }
    }, 2000)

    try {
      await doResumableDownload({
        url,
        filepath,
        timeout,
        allowedMimeTypes,
        forceNew,
        signal: abortController.signal,
        onProgress(progress) {
          const progressPercent = (progress.downloadedBytes / (progress.totalBytes || 1)) * 100
          const progressData: DownloadProgressData = {
            percent: Math.floor(progressPercent),
            downloadedBytes: progress.downloadedBytes,
            totalBytes: progress.totalBytes,
            lastProgressTime: Date.now(),
          }
          job.updateProgress(progressData).catch((err) => {
            // Job was removed from Redis (e.g. cancelled) between the callback firing
            // and the Redis write completing — this is expected and safe to ignore.
            if (err?.code !== -1) throw err
          })
          lastKnownProgress = { downloadedBytes: progress.downloadedBytes, totalBytes: progress.totalBytes }
          job.updateProgress(Math.floor(progressPercent))
        },
        async onComplete(url) {
          try {
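The cancellation scheme removed in this release combines a shared cancel key (set by the API process, polled by the worker) with an in-process `AbortController`, so a user cancel reaches the worker even across processes and even when the stream is stalled. A minimal sketch of that shape, with an in-memory `Map` standing in for Redis (an assumption for the sake of a self-contained example):

```typescript
// An in-memory Map stands in for the Redis client used in the real job.
const cancelStore = new Map<string, string>();

// Called from the API process: mark the job as cancelled.
function signalCancel(jobId: string): void {
  cancelStore.set(`nomad:download:cancel:${jobId}`, '1');
}

// Called in the worker: poll for the cancel key and abort the in-flight
// download when it appears. The abort reason distinguishes a user cancel
// from other abort sources (e.g. a lost queue lock).
function watchForCancel(jobId: string, controller: AbortController, intervalMs = 10) {
  const timer = setInterval(() => {
    const key = `nomad:download:cancel:${jobId}`;
    if (cancelStore.has(key)) {
      cancelStore.delete(key);
      controller.abort('user-cancel');
    }
  }, intervalMs);
  return () => clearInterval(timer); // caller clears the poller in `finally`
}
```

In the real job the controller's `signal` is handed to `doResumableDownload`, and the catch block inspects `signal.reason === 'user-cancel'` to decide whether the failure should be retried or thrown as `UnrecoverableError`.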
@@ -166,14 +104,7 @@ export class RunDownloadJob {
              error
            )
          }
          job.updateProgress({
            percent: 100,
            downloadedBytes: lastKnownProgress.downloadedBytes,
            totalBytes: lastKnownProgress.totalBytes,
            lastProgressTime: Date.now(),
          } as DownloadProgressData).catch((err) => {
            if (err?.code !== -1) throw err
          })
          job.updateProgress(100)
        },
      })

@@ -181,21 +112,6 @@ export class RunDownloadJob {
        url,
        filepath,
      }
    } catch (error: any) {
      // Only prevent retries for user-initiated cancellations. BullMQ lock mismatches
      // can also abort the stream, and those should be retried with backoff.
      // Check both the flag (Redis poll) and abort reason (in-process cancel).
      if (userCancelled || abortController.signal.reason === 'user-cancel') {
        throw new UnrecoverableError(`Download cancelled: ${error.message}`)
      }
      throw error
    } finally {
      if (cancelPollInterval !== null) {
        clearInterval(cancelPollInterval)
        cancelPollInterval = null
      }
      RunDownloadJob.abortControllers.delete(job.id!)
    }
  }

  static async getByUrl(url: string): Promise<Job | undefined> {
@@ -205,29 +121,6 @@ export class RunDownloadJob {
    return await queue.getJob(jobId)
  }

  /**
   * Check if a download is actively in progress for the given URL.
   * Returns the job only if it's in an active state (active, waiting, delayed).
   * If the job exists in a terminal state (failed, completed), removes it and returns undefined.
   */
  static async getActiveByUrl(url: string): Promise<Job | undefined> {
    const job = await this.getByUrl(url)
    if (!job) return undefined

    const state = await job.getState()
    if (state === 'active' || state === 'waiting' || state === 'delayed') {
      return job
    }

    // Terminal state -- clean up stale job so it doesn't block re-download
    try {
      await job.remove()
    } catch {
      // May already be gone
    }
    return undefined
  }

  static async dispatch(params: RunDownloadJobParams) {
    const queueService = new QueueService()
    const queue = queueService.getQueue(this.queue)

@@ -236,8 +129,8 @@ export class RunDownloadJob {
    try {
      const job = await queue.add(this.key, params, {
        jobId,
        attempts: 10,
        backoff: { type: 'exponential', delay: 30000 },
        attempts: 3,
        backoff: { type: 'exponential', delay: 2000 },
        removeOnComplete: true,
      })
@@ -1,21 +0,0 @@
import env from '#start/env'
import type { HttpContext } from '@adonisjs/core/http'
import type { NextFn } from '@adonisjs/core/types/http'
import compression from 'compression'

const compress = env.get('DISABLE_COMPRESSION') ? null : compression()

export default class CompressionMiddleware {
  async handle({ request, response }: HttpContext, next: NextFn) {
    if (!compress) return await next()

    await new Promise<void>((resolve, reject) => {
      compress(request.request as any, response.response as any, (err?: any) => {
        if (err) reject(err)
        else resolve()
      })
    })

    await next()
  }
}
@@ -1,43 +0,0 @@
import { DateTime } from 'luxon'
import { BaseModel, column, SnakeCaseNamingStrategy } from '@adonisjs/lucid/orm'

export default class MapMarker extends BaseModel {
  static namingStrategy = new SnakeCaseNamingStrategy()

  @column({ isPrimary: true })
  declare id: number

  @column()
  declare name: string

  @column()
  declare longitude: number

  @column()
  declare latitude: number

  @column()
  declare color: string

  // 'pin' for user-placed markers, 'waypoint' for route points (future)
  @column()
  declare marker_type: string

  // Groups markers into a route (future)
  @column()
  declare route_id: string | null

  // Order within a route (future)
  @column()
  declare route_order: number | null

  // Optional user notes for a location
  @column()
  declare notes: string | null

  @column.dateTime({ autoCreate: true })
  declare created_at: DateTime

  @column.dateTime({ autoCreate: true, autoUpdate: true })
  declare updated_at: DateTime
}
@@ -6,14 +6,12 @@ import transmit from '@adonisjs/transmit/services/main'
import { doResumableDownloadWithRetry } from '../utils/downloads.js'
import { join } from 'path'
import { ZIM_STORAGE_PATH } from '../utils/fs.js'
import { KiwixLibraryService } from './kiwix_library_service.js'
import { SERVICE_NAMES } from '../../constants/service_names.js'
import { exec } from 'child_process'
import { promisify } from 'util'
// import { readdir } from 'fs/promises'
import KVStore from '#models/kv_store'
import { BROADCAST_CHANNELS } from '../../constants/broadcast.js'
import { KIWIX_LIBRARY_CMD } from '../../constants/kiwix.js'

@inject()
export class DockerService {

@@ -21,9 +19,6 @@ export class DockerService {
  private activeInstallations: Set<string> = new Set()
  public static NOMAD_NETWORK = 'project-nomad_default'

  private _servicesStatusCache: { data: { service_name: string; status: string }[]; expiresAt: number } | null = null
  private _servicesStatusInflight: Promise<{ service_name: string; status: string }[]> | null = null

  constructor() {
    // Support both Linux (production) and Windows (development with Docker Desktop)
    const isWindows = process.platform === 'win32'
@@ -61,7 +56,6 @@ export class DockerService {
      const dockerContainer = this.docker.getContainer(container.Id)
      if (action === 'stop') {
        await dockerContainer.stop()
        this.invalidateServicesStatusCache()
        return {
          success: true,
          message: `Service ${serviceName} stopped successfully`,

@@ -69,18 +63,7 @@ export class DockerService {
      }

      if (action === 'restart') {
        if (serviceName === SERVICE_NAMES.KIWIX) {
          const isLegacy = await this.isKiwixOnLegacyConfig()
          if (isLegacy) {
            logger.info('[DockerService] Kiwix on legacy glob config — running migration instead of restart.')
            await this.migrateKiwixToLibraryMode()
            this.invalidateServicesStatusCache()
            return { success: true, message: 'Kiwix migrated to library mode successfully.' }
          }
        }

        await dockerContainer.restart()
        this.invalidateServicesStatusCache()

        return {
          success: true,

@@ -97,7 +80,6 @@ export class DockerService {
      }

      await dockerContainer.start()
      this.invalidateServicesStatusCache()

      return {
        success: true,

@@ -109,7 +91,7 @@ export class DockerService {
        success: false,
        message: `Invalid action: ${action}. Use 'start', 'stop', or 'restart'.`,
      }
    } catch (error: any) {
    } catch (error) {
      logger.error(`Error starting service ${serviceName}: ${error.message}`)
      return {
        success: false,
@@ -120,37 +102,13 @@ export class DockerService {

  /**
   * Fetches the status of all Docker containers related to Nomad services. (those prefixed with 'nomad_')
   * Results are cached for 5 seconds and concurrent callers share a single in-flight request,
   * preventing Docker socket congestion during rapid page navigation.
   */
  async getServicesStatus(): Promise<{ service_name: string; status: string }[]> {
    const now = Date.now()
    if (this._servicesStatusCache && now < this._servicesStatusCache.expiresAt) {
      return this._servicesStatusCache.data
    }
    if (this._servicesStatusInflight) return this._servicesStatusInflight

    this._servicesStatusInflight = this._fetchServicesStatus().then((data) => {
      this._servicesStatusCache = { data, expiresAt: Date.now() + 5000 }
      this._servicesStatusInflight = null
      return data
    }).catch((err) => {
      this._servicesStatusInflight = null
      throw err
    })
    return this._servicesStatusInflight
  }

  /**
   * Invalidates the services status cache. Call this after any container state change
   * (start, stop, restart, install, uninstall) so the next read reflects reality.
   */
  invalidateServicesStatusCache() {
    this._servicesStatusCache = null
    this._servicesStatusInflight = null
  }

  private async _fetchServicesStatus(): Promise<{ service_name: string; status: string }[]> {
  async getServicesStatus(): Promise<
    {
      service_name: string
      status: string
    }[]
  > {
    try {
      const containers = await this.docker.listContainers({ all: true })
      const containerMap = new Map<string, Docker.ContainerInfo>()

@@ -165,7 +123,7 @@ export class DockerService {
        service_name: name,
        status: container.State,
      }))
    } catch (error: any) {
    } catch (error) {
      logger.error(`Error fetching services status: ${error.message}`)
      return []
    }
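The `getServicesStatus` implementation removed here layers a short-lived cache over in-flight request coalescing: a 5-second TTL serves repeat readers, and concurrent callers during a miss all await the same promise instead of each hitting the Docker socket. A generic sketch of that combination (the `coalesce` helper is illustrative, not the project's API):

```typescript
// Wrap an async fetcher with a TTL cache plus single-flight coalescing:
// cache hits return immediately, and concurrent misses share one fetch.
function coalesce<T>(fetcher: () => Promise<T>, ttlMs: number): () => Promise<T> {
  let cache: { data: T; expiresAt: number } | null = null;
  let inflight: Promise<T> | null = null;

  return async (): Promise<T> => {
    if (cache && Date.now() < cache.expiresAt) return cache.data; // fresh hit
    if (inflight) return inflight; // join the in-flight fetch

    inflight = fetcher()
      .then((data) => {
        cache = { data, expiresAt: Date.now() + ttlMs };
        inflight = null;
        return data;
      })
      .catch((err) => {
        inflight = null; // never cache a failure
        throw err;
      });
    return inflight;
  };
}
```

The removed code also pairs this with an explicit invalidation hook (`invalidateServicesStatusCache`) called after every container state change, so a stale status is never served right after a start/stop/restart.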
@@ -182,11 +140,6 @@ export class DockerService {
      return null
    }

    if (serviceName === SERVICE_NAMES.OLLAMA) {
      const remoteUrl = await KVStore.getValue('ai.remoteOllamaUrl')
      if (remoteUrl) return remoteUrl
    }

    const service = await Service.query()
      .where('service_name', serviceName)
      .andWhere('installed', true)

@@ -354,7 +307,7 @@ export class DockerService {
          `No existing container found, proceeding with installation...`
        )
      }
    } catch (error: any) {
    } catch (error) {
      logger.warn(`Error during container cleanup: ${error.message}`)
      this._broadcast(serviceName, 'cleanup-warning', `Warning during cleanup: ${error.message}`)
    }

@@ -373,7 +326,7 @@ export class DockerService {
          const volume = this.docker.getVolume(vol.Name)
          await volume.remove({ force: true })
          this._broadcast(serviceName, 'volume-removed', `Removed volume: ${vol.Name}`)
        } catch (error: any) {
        } catch (error) {
          logger.warn(`Failed to remove volume ${vol.Name}: ${error.message}`)
        }
      }
@@ -381,7 +334,7 @@ export class DockerService {
      if (serviceVolumes.length === 0) {
        this._broadcast(serviceName, 'no-volumes', `No volumes found to clear`)
      }
    } catch (error: any) {
    } catch (error) {
      logger.warn(`Error during volume cleanup: ${error.message}`)
      this._broadcast(
        serviceName,

@@ -394,7 +347,6 @@ export class DockerService {
    service.installed = false
    service.installation_status = 'installing'
    await service.save()
    this.invalidateServicesStatusCache()

    // Step 5: Recreate the container
    this._broadcast(serviceName, 'recreating', `Recreating container...`)

@@ -410,7 +362,7 @@ export class DockerService {
      success: true,
      message: `Service ${serviceName} force reinstall initiated successfully. You can receive updates via server-sent events.`,
    }
  } catch (error: any) {
  } catch (error) {
    logger.error(`Force reinstall failed for ${serviceName}: ${error.message}`)
    await this._cleanupFailedInstallation(serviceName)
    return {

@@ -548,15 +500,6 @@ export class DockerService {
      }
    }

    const ollamaEnv: string[] = []
    if (service.service_name === SERVICE_NAMES.OLLAMA) {
      ollamaEnv.push('OLLAMA_NO_CLOUD=1')
      const flashAttentionEnabled = await KVStore.getValue('ai.ollamaFlashAttention')
      if (flashAttentionEnabled !== false) {
        ollamaEnv.push('OLLAMA_FLASH_ATTENTION=1')
      }
    }

    this._broadcast(
      service.service_name,
      'creating',
@@ -565,16 +508,11 @@ export class DockerService {
    const container = await this.docker.createContainer({
      Image: finalImage,
      name: service.service_name,
      Labels: {
        ...(containerConfig?.Labels ?? {}),
        'com.docker.compose.project': 'project-nomad-managed',
        'io.project-nomad.managed': 'true',
      },
      ...(containerConfig?.User && { User: containerConfig.User }),
      HostConfig: gpuHostConfig,
      ...(containerConfig?.WorkingDir && { WorkingDir: containerConfig.WorkingDir }),
      ...(containerConfig?.ExposedPorts && { ExposedPorts: containerConfig.ExposedPorts }),
      Env: [...(containerConfig?.Env ?? []), ...ollamaEnv],
      ...(containerConfig?.Env && { Env: containerConfig.Env }),
      ...(service.container_command ? { Cmd: service.container_command.split(' ') } : {}),
      // Ensure container is attached to the Nomad docker network in production
      ...(process.env.NODE_ENV === 'production' && {

@@ -601,7 +539,6 @@ export class DockerService {
    service.installed = true
    service.installation_status = 'idle'
    await service.save()
    this.invalidateServicesStatusCache()

    // Remove from active installs tracking
    this.activeInstallations.delete(service.service_name)

@@ -627,7 +564,7 @@ export class DockerService {
        'completed',
        `Service ${service.service_name} installation completed successfully.`
      )
    } catch (error: any) {
    } catch (error) {
      this._broadcast(
        service.service_name,
        'error',

@@ -643,7 +580,7 @@ export class DockerService {
    try {
      const containers = await this.docker.listContainers({ all: true })
      return containers.some((container) => container.Names.includes(`/${serviceName}`))
    } catch (error: any) {
    } catch (error) {
      logger.error(`Error checking if service container exists: ${error.message}`)
      return false
    }
@@ -663,7 +600,7 @@ export class DockerService {
      await dockerContainer.remove({ force: true })

      return { success: true, message: `Service ${serviceName} container removed successfully` }
    } catch (error: any) {
    } catch (error) {
      logger.error(`Error removing service container: ${error.message}`)
      return {
        success: false,

@@ -711,12 +648,7 @@ export class DockerService {
        'preinstall',
        `Downloaded Wikipedia ZIM file to ${filepath}`
      )

      // Generate the initial kiwix library XML before the container is created
      const kiwixLibraryService = new KiwixLibraryService()
      await kiwixLibraryService.rebuildFromDisk()
      this._broadcast(SERVICE_NAMES.KIWIX, 'preinstall', 'Generated kiwix library XML.')
    } catch (error: any) {
    } catch (error) {
      this._broadcast(
        SERVICE_NAMES.KIWIX,
        'preinstall-error',
@@ -739,121 +671,13 @@ export class DockerService {
      await this._removeServiceContainer(serviceName)

      logger.info(`[DockerService] Cleaned up failed installation for ${serviceName}`)
    } catch (error: any) {
    } catch (error) {
      logger.error(
        `[DockerService] Failed to cleanup installation for ${serviceName}: ${error.message}`
      )
    }
  }

  /**
   * Checks whether the running kiwix container is using the legacy glob-pattern command
   * (`*.zim --address=all`) rather than the library-file command. Used to detect containers
   * that need to be migrated to library mode.
   */
  async isKiwixOnLegacyConfig(): Promise<boolean> {
    try {
      const containers = await this.docker.listContainers({ all: true })
      const info = containers.find((c) => c.Names.includes(`/${SERVICE_NAMES.KIWIX}`))
      if (!info) return false

      const inspected = await this.docker.getContainer(info.Id).inspect()
      const cmd: string[] = inspected.Config?.Cmd ?? []
      return cmd.some((arg) => arg.includes('*.zim'))
    } catch (err: any) {
      logger.warn(`[DockerService] Could not inspect kiwix container: ${err.message}`)
      return false
    }
  }

  /**
   * Migrates the kiwix container from legacy glob mode (`*.zim`) to library mode
   * (`--library /data/kiwix-library.xml --monitorLibrary`).
   *
   * This is a non-destructive recreation: ZIM files and volumes are preserved.
   * The container is stopped, removed, and recreated with the correct library-mode command.
   * This function is authoritative: it writes the correct command to the DB itself rather than
   * trusting the DB to have been pre-updated by a separate migration.
   */
  async migrateKiwixToLibraryMode(): Promise<void> {
    if (this.activeInstallations.has(SERVICE_NAMES.KIWIX)) {
      logger.warn('[DockerService] Kiwix migration already in progress, skipping duplicate call.')
      return
    }

    this.activeInstallations.add(SERVICE_NAMES.KIWIX)

    try {
      // Step 1: Build/update the XML from current disk state
      this._broadcast(SERVICE_NAMES.KIWIX, 'migrating', 'Migrating kiwix to library mode...')
      const kiwixLibraryService = new KiwixLibraryService()
      await kiwixLibraryService.rebuildFromDisk()
      this._broadcast(SERVICE_NAMES.KIWIX, 'migrating', 'Built kiwix library XML from existing ZIM files.')

      // Step 2: Stop and remove old container (leave ZIM volumes intact)
      const containers = await this.docker.listContainers({ all: true })
      const containerInfo = containers.find((c) => c.Names.includes(`/${SERVICE_NAMES.KIWIX}`))
      if (containerInfo) {
        const oldContainer = this.docker.getContainer(containerInfo.Id)
        if (containerInfo.State === 'running') {
          await oldContainer.stop({ t: 10 }).catch((e: any) =>
            logger.warn(`[DockerService] Kiwix stop warning during migration: ${e.message}`)
          )
        }
        await oldContainer.remove({ force: true }).catch((e: any) =>
          logger.warn(`[DockerService] Kiwix remove warning during migration: ${e.message}`)
        )
      }

      // Step 3: Read the service record and authoritatively set the correct command.
      // Do NOT rely on prior DB state — we write container_command here so the record
      // stays consistent regardless of whether the DB migration ran.
      const service = await Service.query().where('service_name', SERVICE_NAMES.KIWIX).first()
      if (!service) {
        throw new Error('Kiwix service record not found in DB during migration')
      }

      service.container_command = KIWIX_LIBRARY_CMD
      service.installed = false
      service.installation_status = 'installing'
      await service.save()

      const containerConfig = this._parseContainerConfig(service.container_config)

      // Step 4: Recreate container directly (skipping _createContainer to avoid re-downloading
      // the bootstrap ZIM — ZIM files already exist on disk)
      this._broadcast(SERVICE_NAMES.KIWIX, 'migrating', 'Recreating kiwix container with library mode config...')
      const newContainer = await this.docker.createContainer({
        Image: service.container_image,
        name: service.service_name,
        HostConfig: containerConfig?.HostConfig ?? {},
        ...(containerConfig?.ExposedPorts && { ExposedPorts: containerConfig.ExposedPorts }),
        Cmd: KIWIX_LIBRARY_CMD.split(' '),
        ...(process.env.NODE_ENV === 'production' && {
          NetworkingConfig: {
            EndpointsConfig: {
              [DockerService.NOMAD_NETWORK]: {},
            },
          },
        }),
      })

      await newContainer.start()

      service.installed = true
      service.installation_status = 'idle'
      await service.save()
      this.activeInstallations.delete(SERVICE_NAMES.KIWIX)

      this._broadcast(SERVICE_NAMES.KIWIX, 'migrated', 'Kiwix successfully migrated to library mode.')
      logger.info('[DockerService] Kiwix migration to library mode complete.')
    } catch (error: any) {
      logger.error(`[DockerService] Kiwix migration failed: ${error.message}`)
      await this._cleanupFailedInstallation(SERVICE_NAMES.KIWIX)
      throw error
    }
  }

  /**
   * Detect GPU type and toolkit availability.
   * Primary: Check Docker runtimes via docker.info() (works from inside containers).
@@ -870,7 +694,7 @@ export class DockerService {
        await this._persistGPUType('nvidia')
        return { type: 'nvidia' }
      }
    } catch (error: any) {
    } catch (error) {
      logger.warn(`[DockerService] Could not query Docker info for GPU runtimes: ${error.message}`)
    }

@@ -887,7 +711,7 @@ export class DockerService {
        logger.warn('[DockerService] NVIDIA GPU detected via lspci but NVIDIA Container Toolkit is not installed')
        return { type: 'none', toolkitMissing: true }
      }
    } catch (error: any) {
    } catch (error) {
      // lspci not available (likely inside Docker container), continue
    }

@@ -902,7 +726,7 @@ export class DockerService {
        await this._persistGPUType('amd')
        return { type: 'amd' }
      }
    } catch (error: any) {
    } catch (error) {
      // lspci not available, continue
    }

@@ -921,7 +745,7 @@ export class DockerService {

      logger.info('[DockerService] No GPU detected')
      return { type: 'none' }
    } catch (error: any) {
    } catch (error) {
      logger.warn(`[DockerService] Error detecting GPU type: ${error.message}`)
      return { type: 'none' }
    }

@@ -931,7 +755,7 @@ export class DockerService {
    try {
      await KVStore.setValue('gpu.type', type)
      logger.info(`[DockerService] Persisted GPU type '${type}' to KV store`)
    } catch (error: any) {
    } catch (error) {
      logger.warn(`[DockerService] Failed to persist GPU type: ${error.message}`)
    }
  }
@@ -1126,7 +950,7 @@ export class DockerService {
    let newContainer: any
    try {
      newContainer = await this.docker.createContainer(newContainerConfig)
    } catch (createError: any) {
    } catch (createError) {
      // Rollback: rename old container back
      this._broadcast(serviceName, 'update-rollback', `Failed to create new container: ${createError.message}. Rolling back...`)
      const rollbackContainer = this.docker.getContainer((await this.docker.listContainers({ all: true })).find((c) => c.Names.includes(`/${oldName}`))!.Id)

@@ -1199,7 +1023,7 @@ export class DockerService {
        message: `Update failed: new container did not stay running. Rolled back to previous version.`,
      }
    }
  } catch (error: any) {
  } catch (error) {
    this.activeInstallations.delete(serviceName)
    this._broadcast(
      serviceName,

@@ -1234,7 +1058,7 @@ export class DockerService {
    }

    return JSON.parse(toParse)
  } catch (error: any) {
  } catch (error) {
    logger.error(`Failed to parse container configuration: ${error.message}`)
    throw new Error(`Invalid container configuration: ${error.message}`)
  }
@ -1251,7 +1075,7 @@ export class DockerService {
|
|||
|
||||
// Check if any image has a RepoTag that matches the requested image
|
||||
return images.some((image) => image.RepoTags && image.RepoTags.includes(imageName))
|
||||
} catch (error: any) {
|
||||
} catch (error) {
|
||||
logger.warn(`Error checking if image exists: ${error.message}`)
|
||||
// If run into an error, assume the image does not exist
|
||||
return false
|
||||
|
|
|
|||
|
|
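The GPU detection above first inspects the `Runtimes` map returned by Docker's `/info` endpoint, since `lspci` is usually unavailable inside a container. A minimal sketch of that classification step, under the assumption that the NVIDIA Container Toolkit registers itself as the `nvidia` runtime (the function name and shapes here are illustrative, not the project's exact API):

```typescript
// Classify GPU support from the `Runtimes` object that docker.info() exposes.
// Hypothetical helper — sketches the check, not the project's implementation.
type GpuInfo = { type: 'nvidia' | 'amd' | 'none'; toolkitMissing?: boolean }

function detectGpuFromDockerInfo(info: { Runtimes?: Record<string, unknown> }): GpuInfo {
  const runtimes = info.Runtimes ?? {}
  // The NVIDIA Container Toolkit installs a runtime named `nvidia`.
  if ('nvidia' in runtimes) return { type: 'nvidia' }
  // No GPU runtime registered — callers fall back to lspci-based probes.
  return { type: 'none' }
}
```

This keeps the fast, container-safe check first and leaves the host-only probes (`lspci`) as fallbacks, mirroring the control flow in the diff.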
@@ -2,64 +2,27 @@ import { inject } from '@adonisjs/core'
import { QueueService } from './queue_service.js'
import { RunDownloadJob } from '#jobs/run_download_job'
import { DownloadModelJob } from '#jobs/download_model_job'
import { DownloadJobWithProgress, DownloadProgressData } from '../../types/downloads.js'
import { DownloadJobWithProgress } from '../../types/downloads.js'
import { normalize } from 'path'
import { deleteFileIfExists } from '../utils/fs.js'

@inject()
export class DownloadService {
constructor(private queueService: QueueService) {}

private parseProgress(progress: any): { percent: number; downloadedBytes?: number; totalBytes?: number; lastProgressTime?: number } {
if (typeof progress === 'object' && progress !== null && 'percent' in progress) {
const p = progress as DownloadProgressData
return {
percent: p.percent,
downloadedBytes: p.downloadedBytes,
totalBytes: p.totalBytes,
lastProgressTime: p.lastProgressTime,
}
}
// Backward compat: plain integer from in-flight jobs during upgrade
return { percent: parseInt(String(progress), 10) || 0 }
}

async listDownloadJobs(filetype?: string): Promise<DownloadJobWithProgress[]> {
// Get regular file download jobs (zim, map, etc.) — query each state separately so we can
// tag each job with its actual BullMQ state rather than guessing from progress data.
// Get regular file download jobs (zim, map, etc.)
const queue = this.queueService.getQueue(RunDownloadJob.queue)
type FileJobState = 'waiting' | 'active' | 'delayed' | 'failed'
const fileJobs = await queue.getJobs(['waiting', 'active', 'delayed', 'failed'])

const [waitingJobs, activeJobs, delayedJobs, failedJobs] = await Promise.all([
queue.getJobs(['waiting']),
queue.getJobs(['active']),
queue.getJobs(['delayed']),
queue.getJobs(['failed']),
])

const taggedFileJobs: Array<{ job: (typeof waitingJobs)[0]; state: FileJobState }> = [
...waitingJobs.map((j) => ({ job: j, state: 'waiting' as const })),
...activeJobs.map((j) => ({ job: j, state: 'active' as const })),
...delayedJobs.map((j) => ({ job: j, state: 'delayed' as const })),
...failedJobs.map((j) => ({ job: j, state: 'failed' as const })),
]

const fileDownloads = taggedFileJobs.map(({ job, state }) => {
const parsed = this.parseProgress(job.progress)
return {
const fileDownloads = fileJobs.map((job) => ({
jobId: job.id!.toString(),
url: job.data.url,
progress: parsed.percent,
progress: parseInt(job.progress.toString(), 10),
filepath: normalize(job.data.filepath),
filetype: job.data.filetype,
title: job.data.title || undefined,
downloadedBytes: parsed.downloadedBytes,
totalBytes: parsed.totalBytes || job.data.totalBytes || undefined,
lastProgressTime: parsed.lastProgressTime,
status: state,
status: (job.failedReason ? 'failed' : 'active') as 'active' | 'failed',
failedReason: job.failedReason || undefined,
}
})
}))

// Get Ollama model download jobs
const modelQueue = this.queueService.getQueue(DownloadModelJob.queue)

@@ -93,106 +56,9 @@ export class DownloadService {
const queue = this.queueService.getQueue(queueName)
const job = await queue.getJob(jobId)
if (job) {
try {
await job.remove()
} catch {
// Job may be locked by the worker after cancel. Remove the stale lock and retry.
try {
const client = await queue.client
await client.del(`bull:${queueName}:${jobId}:lock`)
await job.remove()
} catch {
// Last resort: already removed or truly stuck
}
}
return
}
}
}

async cancelJob(jobId: string): Promise<{ success: boolean; message: string }> {
const queue = this.queueService.getQueue(RunDownloadJob.queue)
const job = await queue.getJob(jobId)

if (!job) {
// Job already completed (removeOnComplete: true) or doesn't exist
return { success: true, message: 'Job not found (may have already completed)' }
}

const filepath = job.data.filepath

// Signal the worker process to abort the download via Redis
await RunDownloadJob.signalCancel(jobId)

// Also try in-memory abort (works if worker is in same process)
RunDownloadJob.abortControllers.get(jobId)?.abort('user-cancel')
RunDownloadJob.abortControllers.delete(jobId)

// Poll for terminal state (up to 4s at 250ms intervals) — cooperates with BullMQ's lifecycle
// instead of force-removing an active job and losing the worker's failure/cleanup path.
const POLL_INTERVAL_MS = 250
const POLL_TIMEOUT_MS = 4000
const deadline = Date.now() + POLL_TIMEOUT_MS
let reachedTerminal = false

while (Date.now() < deadline) {
await new Promise((resolve) => setTimeout(resolve, POLL_INTERVAL_MS))
try {
const state = await job.getState()
if (state === 'failed' || state === 'completed' || state === 'unknown') {
reachedTerminal = true
break
}
} catch {
reachedTerminal = true // getState() throws if job is already gone
break
}
}

if (!reachedTerminal) {
console.warn(`[DownloadService] cancelJob: job ${jobId} did not reach terminal state within timeout, removing anyway`)
}

// Remove the BullMQ job
try {
await job.remove()
} catch {
// Lock contention fallback: clear lock and retry once
try {
const client = await queue.client
await client.del(`bull:${RunDownloadJob.queue}:${jobId}:lock`)
const updatedJob = await queue.getJob(jobId)
if (updatedJob) await updatedJob.remove()
} catch {
// Best effort - job will be cleaned up on next dismiss attempt
}
}

// Delete the partial file from disk
if (filepath) {
try {
await deleteFileIfExists(filepath)
// Also try .tmp in case PR #448 staging is merged
await deleteFileIfExists(filepath + '.tmp')
} catch {
// File may not exist yet (waiting job)
}
}

// If this was a Wikipedia download, update selection status to failed
// (the worker's failed event may not fire if we removed the job first)
if (job.data.filetype === 'zim' && job.data.url?.includes('wikipedia_en_')) {
try {
const { DockerService } = await import('#services/docker_service')
const { ZimService } = await import('#services/zim_service')
const dockerService = new DockerService()
const zimService = new ZimService(dockerService)
await zimService.onWikipediaDownloadComplete(job.data.url, false)
} catch {
// Best effort
}
}

return { success: true, message: 'Download cancelled and partial file deleted' }
}
}
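The `parseProgress` helper in DownloadService accepts both the structured progress object newer jobs report and the bare integer that in-flight jobs stored before the upgrade. A self-contained sketch of that backward-compatible shape (types simplified from the ones in the diff):

```typescript
// Backward-compatible progress parsing: structured object or legacy integer.
// Simplified sketch of the DownloadService.parseProgress logic shown above.
type DownloadProgress = { percent: number; downloadedBytes?: number; totalBytes?: number }

function parseProgress(progress: unknown): DownloadProgress {
  if (typeof progress === 'object' && progress !== null && 'percent' in progress) {
    // New shape: pass the structured fields through untouched.
    return progress as DownloadProgress
  }
  // Legacy shape: a plain integer percent; anything unparseable maps to 0.
  return { percent: parseInt(String(progress), 10) || 0 }
}
```

Keeping the fallback means a deploy can upgrade the progress format without failing jobs that were already mid-download under the old format.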
@@ -1,285 +0,0 @@
import { XMLBuilder, XMLParser } from 'fast-xml-parser'
import { readFile, writeFile, rename, readdir } from 'fs/promises'
import { join } from 'path'
import { Archive } from '@openzim/libzim'
import { KIWIX_LIBRARY_XML_PATH, ZIM_STORAGE_PATH, ensureDirectoryExists } from '../utils/fs.js'
import logger from '@adonisjs/core/services/logger'
import { randomUUID } from 'node:crypto'

const CONTAINER_DATA_PATH = '/data'
const XML_DECLARATION = '<?xml version="1.0" encoding="UTF-8"?>\n'

interface KiwixBook {
id: string
path: string
title: string
description?: string
language?: string
creator?: string
publisher?: string
name?: string
flavour?: string
tags?: string
faviconMimeType?: string
favicon?: string
date?: string
articleCount?: number
mediaCount?: number
size?: number
}

export class KiwixLibraryService {
getLibraryFilePath(): string {
return join(process.cwd(), KIWIX_LIBRARY_XML_PATH)
}

containerLibraryPath(): string {
return '/data/kiwix-library.xml'
}

private _filenameToTitle(filename: string): string {
const withoutExt = filename.endsWith('.zim') ? filename.slice(0, -4) : filename
const parts = withoutExt.split('_')
// Drop last segment if it looks like a date (YYYY-MM)
const lastPart = parts[parts.length - 1]
const isDate = /^\d{4}-\d{2}$/.test(lastPart)
const titleParts = isDate && parts.length > 1 ? parts.slice(0, -1) : parts
return titleParts.map((p) => p.charAt(0).toUpperCase() + p.slice(1)).join(' ')
}

/**
* Reads all kiwix-manage-compatible metadata from a ZIM file, including the internal UUID,
* rich text fields, and the base64-encoded favicon. Kiwix-serve uses the UUID for OPDS
* catalog entries and illustration URLs (/catalog/v2/illustration/{uuid}).
*
* Returns null on any error so callers can fall back gracefully.
*/
private _readZimMetadata(zimFilePath: string): Partial<KiwixBook> | null {
try {
const archive = new Archive(zimFilePath)

const getMeta = (key: string): string | undefined => {
try {
return archive.getMetadata(key) || undefined
} catch {
return undefined
}
}

let favicon: string | undefined
let faviconMimeType: string | undefined
try {
if (archive.illustrationSizes.size > 0) {
const size = archive.illustrationSizes.has(48)
? 48
: ([...archive.illustrationSizes][0] as number)
const item = archive.getIllustrationItem(size)
favicon = item.data.data.toString('base64')
faviconMimeType = item.mimetype || undefined
}
} catch {
// ZIM has no illustration — that's fine
}

const rawFilesize =
typeof archive.filesize === 'bigint' ? Number(archive.filesize) : archive.filesize

return {
id: archive.uuid || undefined,
title: getMeta('Title'),
description: getMeta('Description'),
language: getMeta('Language'),
creator: getMeta('Creator'),
publisher: getMeta('Publisher'),
name: getMeta('Name'),
flavour: getMeta('Flavour'),
tags: getMeta('Tags'),
date: getMeta('Date'),
articleCount: archive.articleCount,
mediaCount: archive.mediaCount,
size: Math.floor(rawFilesize / 1024),
favicon,
faviconMimeType,
}
} catch {
return null
}
}

private _buildXml(books: KiwixBook[]): string {
const builder = new XMLBuilder({
ignoreAttributes: false,
attributeNamePrefix: '@_',
format: true,
suppressEmptyNode: false,
})

const obj: Record<string, any> = {
library: {
'@_version': '20110515',
...(books.length > 0 && {
book: books.map((b) => ({
'@_id': b.id,
'@_path': b.path,
'@_title': b.title,
...(b.description !== undefined && { '@_description': b.description }),
...(b.language !== undefined && { '@_language': b.language }),
...(b.creator !== undefined && { '@_creator': b.creator }),
...(b.publisher !== undefined && { '@_publisher': b.publisher }),
...(b.name !== undefined && { '@_name': b.name }),
...(b.flavour !== undefined && { '@_flavour': b.flavour }),
...(b.tags !== undefined && { '@_tags': b.tags }),
...(b.faviconMimeType !== undefined && { '@_faviconMimeType': b.faviconMimeType }),
...(b.favicon !== undefined && { '@_favicon': b.favicon }),
...(b.date !== undefined && { '@_date': b.date }),
...(b.articleCount !== undefined && { '@_articleCount': b.articleCount }),
...(b.mediaCount !== undefined && { '@_mediaCount': b.mediaCount }),
...(b.size !== undefined && { '@_size': b.size }),
})),
}),
},
}

return XML_DECLARATION + builder.build(obj)
}

private async _atomicWrite(content: string): Promise<void> {
const filePath = this.getLibraryFilePath()
const tmpPath = `${filePath}.tmp.${randomUUID()}`
await writeFile(tmpPath, content, 'utf-8')
await rename(tmpPath, filePath)
}

private _parseExistingBooks(xmlContent: string): KiwixBook[] {
const parser = new XMLParser({
ignoreAttributes: false,
attributeNamePrefix: '@_',
isArray: (name) => name === 'book',
})

const parsed = parser.parse(xmlContent)
const books: any[] = parsed?.library?.book ?? []

return books
.map((b) => ({
id: b['@_id'] ?? '',
path: b['@_path'] ?? '',
title: b['@_title'] ?? '',
description: b['@_description'],
language: b['@_language'],
creator: b['@_creator'],
publisher: b['@_publisher'],
name: b['@_name'],
flavour: b['@_flavour'],
tags: b['@_tags'],
faviconMimeType: b['@_faviconMimeType'],
favicon: b['@_favicon'],
date: b['@_date'],
articleCount:
b['@_articleCount'] !== undefined ? Number(b['@_articleCount']) : undefined,
mediaCount: b['@_mediaCount'] !== undefined ? Number(b['@_mediaCount']) : undefined,
size: b['@_size'] !== undefined ? Number(b['@_size']) : undefined,
}))
.filter((b) => b.id && b.path)
}

async rebuildFromDisk(opts?: { excludeFilenames?: string[] }): Promise<void> {
const dirPath = join(process.cwd(), ZIM_STORAGE_PATH)
await ensureDirectoryExists(dirPath)

let entries: string[] = []
try {
entries = await readdir(dirPath)
} catch {
entries = []
}

const excludeSet = new Set(opts?.excludeFilenames ?? [])
const zimFiles = entries.filter((name) => name.endsWith('.zim') && !excludeSet.has(name))

const books: KiwixBook[] = zimFiles.map((filename) => {
const meta = this._readZimMetadata(join(dirPath, filename))
const containerPath = `${CONTAINER_DATA_PATH}/${filename}`
return {
...meta,
// Override fields that must be derived locally, not from ZIM metadata
id: meta?.id ?? filename.slice(0, -4),
path: containerPath,
title: meta?.title ?? this._filenameToTitle(filename),
}
})

const xml = this._buildXml(books)
await this._atomicWrite(xml)
logger.info(`[KiwixLibraryService] Rebuilt library XML with ${books.length} book(s).`)
}

async addBook(filename: string): Promise<void> {
const zimFilename = filename.endsWith('.zim') ? filename : `${filename}.zim`
const containerPath = `${CONTAINER_DATA_PATH}/${zimFilename}`

const filePath = this.getLibraryFilePath()
let existingBooks: KiwixBook[] = []

try {
const content = await readFile(filePath, 'utf-8')
existingBooks = this._parseExistingBooks(content)
} catch (err: any) {
if (err.code === 'ENOENT') {
// XML doesn't exist yet — rebuild from disk; the completed download is already there
await this.rebuildFromDisk()
return
}
throw err
}

if (existingBooks.some((b) => b.path === containerPath)) {
logger.info(`[KiwixLibraryService] ${zimFilename} already in library, skipping.`)
return
}

const fullPath = join(process.cwd(), ZIM_STORAGE_PATH, zimFilename)
const meta = this._readZimMetadata(fullPath)

existingBooks.push({
...meta,
id: meta?.id ?? zimFilename.slice(0, -4),
path: containerPath,
title: meta?.title ?? this._filenameToTitle(zimFilename),
})

const xml = this._buildXml(existingBooks)
await this._atomicWrite(xml)
logger.info(`[KiwixLibraryService] Added ${zimFilename} to library XML.`)
}

async removeBook(filename: string): Promise<void> {
const zimFilename = filename.endsWith('.zim') ? filename : `${filename}.zim`
const containerPath = `${CONTAINER_DATA_PATH}/${zimFilename}`

const filePath = this.getLibraryFilePath()
let existingBooks: KiwixBook[] = []

try {
const content = await readFile(filePath, 'utf-8')
existingBooks = this._parseExistingBooks(content)
} catch (err: any) {
if (err.code === 'ENOENT') {
logger.warn(`[KiwixLibraryService] Library XML not found, nothing to remove.`)
return
}
throw err
}

const filtered = existingBooks.filter((b) => b.path !== containerPath)

if (filtered.length === existingBooks.length) {
logger.info(`[KiwixLibraryService] ${zimFilename} not found in library, nothing to remove.`)
return
}

const xml = this._buildXml(filtered)
await this._atomicWrite(xml)
logger.info(`[KiwixLibraryService] Removed ${zimFilename} from library XML.`)
}
}
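KiwixLibraryService derives a human-readable title from a ZIM filename by stripping the extension, splitting on underscores, and dropping a trailing `YYYY-MM` date segment. That transformation is pure and easy to verify in isolation (standalone sketch of the `_filenameToTitle` logic above):

```typescript
// Standalone version of the filename-to-title derivation used for ZIM files
// whose metadata lacks a Title field.
function filenameToTitle(filename: string): string {
  const withoutExt = filename.endsWith('.zim') ? filename.slice(0, -4) : filename
  const parts = withoutExt.split('_')
  // Drop the last segment if it looks like a YYYY-MM release date.
  const lastPart = parts[parts.length - 1]
  const isDate = /^\d{4}-\d{2}$/.test(lastPart)
  const titleParts = isDate && parts.length > 1 ? parts.slice(0, -1) : parts
  // Capitalize each remaining segment and join with spaces.
  return titleParts.map((p) => p.charAt(0).toUpperCase() + p.slice(1)).join(' ')
}
```

For example, `wikipedia_en_all_2024-01.zim` becomes `Wikipedia En All`; a filename that is only a date keeps its single segment rather than producing an empty title.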
@@ -21,16 +21,6 @@ import InstalledResource from '#models/installed_resource'
import { CollectionManifestService } from './collection_manifest_service.js'
import type { CollectionWithStatus, MapsSpec } from '../../types/collections.js'

const PROTOMAPS_BUILDS_METADATA_URL = 'https://build-metadata.protomaps.dev/builds.json'
const PROTOMAPS_BUILD_BASE_URL = 'https://build.protomaps.com'

export interface ProtomapsBuildInfo {
url: string
date: string
size: number
key: string
}

const BASE_ASSETS_MIME_TYPES = [
'application/gzip',
'application/x-gzip',

@@ -119,7 +109,7 @@ export class MapService implements IMapService {
const downloadFilenames: string[] = []

for (const resource of toDownload) {
const existing = await RunDownloadJob.getActiveByUrl(resource.url)
const existing = await RunDownloadJob.getByUrl(resource.url)
if (existing) {
logger.warn(`[MapService] Download already in progress for URL ${resource.url}, skipping.`)
continue

@@ -141,7 +131,6 @@ export class MapService implements IMapService {
allowedMimeTypes: PMTILES_MIME_TYPES,
forceNew: true,
filetype: 'map',
title: (resource as any).title || undefined,
resourceMetadata: {
resource_id: resource.id,
version: resource.version,

@@ -190,7 +179,7 @@ export class MapService implements IMapService {
throw new Error(`Invalid PMTiles file URL: ${url}. URL must end with .pmtiles`)
}

const existing = await RunDownloadJob.getActiveByUrl(url)
const existing = await RunDownloadJob.getByUrl(url)
if (existing) {
throw new Error(`Download already in progress for URL ${url}`)
}

@@ -409,76 +398,6 @@ export class MapService implements IMapService {
return template
}

async getGlobalMapInfo(): Promise<ProtomapsBuildInfo> {
const { default: axios } = await import('axios')
const response = await axios.get(PROTOMAPS_BUILDS_METADATA_URL, { timeout: 15000 })
const builds = response.data as Array<{ key: string; size: number }>

if (!builds || builds.length === 0) {
throw new Error('No protomaps builds found')
}

// Latest build first
const sorted = builds.sort((a, b) => b.key.localeCompare(a.key))
const latest = sorted[0]

const dateStr = latest.key.replace('.pmtiles', '')
const date = `${dateStr.slice(0, 4)}-${dateStr.slice(4, 6)}-${dateStr.slice(6, 8)}`

return {
url: `${PROTOMAPS_BUILD_BASE_URL}/${latest.key}`,
date,
size: latest.size,
key: latest.key,
}
}

async downloadGlobalMap(): Promise<{ filename: string; jobId?: string }> {
const info = await this.getGlobalMapInfo()

const existing = await RunDownloadJob.getByUrl(info.url)
if (existing) {
throw new Error(`Download already in progress for URL ${info.url}`)
}

const basePath = resolve(join(this.baseDirPath, 'pmtiles'))
const filepath = resolve(join(basePath, info.key))

// Prevent path traversal — resolved path must stay within the storage directory
if (!filepath.startsWith(basePath + sep)) {
throw new Error('Invalid filename')
}

// First, ensure base assets are present - the global map depends on them
const baseAssetsExist = await this.ensureBaseAssets()
if (!baseAssetsExist) {
throw new Error(
'Base map assets are missing and could not be downloaded. Please check your connection and try again.'
)
}

// forceNew: false so retries resume partial downloads
const result = await RunDownloadJob.dispatch({
url: info.url,
filepath,
timeout: 30000,
allowedMimeTypes: PMTILES_MIME_TYPES,
forceNew: false,
filetype: 'map',
})

if (!result.job) {
throw new Error('Failed to dispatch download job')
}

logger.info(`[MapService] Dispatched global map download job ${result.job.id}`)

return {
filename: info.key,
jobId: result.job?.id,
}
}

async delete(file: string): Promise<void> {
let fileName = file
if (!fileName.endsWith('.pmtiles')) {

@@ -511,18 +430,8 @@ export class MapService implements IMapService {
}
}

/**
* Gets the appropriate public URL for a map asset depending on environment. The host and protocol that the user
* is accessing the maps from must match the host and protocol used in the generated URLs, otherwise maps will fail to load.
* If you make changes to this function, you need to ensure it handles all the following cases correctly:
* - No host provided (should default to localhost or env URL)
* - Host provided as full URL (e.g. "http://example.com:8080")
* - Host provided as host:port (e.g. "example.com:8080")
* - Host provided as bare hostname (e.g. "example.com")
* @param specifiedHost - the host as provided by the user/request, can be null or in various formats (full URL, host:port, bare hostname)
* @param childPath - the path to append to the base URL (e.g. "basemaps-assets", "pmtiles")
* @param protocol - the protocol to use in the generated URL (e.g. "http", "https"), defaults to "http"
* @returns the public URL for the map asset
/*
* Gets the appropriate public URL for a map asset depending on environment
*/
private getPublicFileBaseUrl(specifiedHost: string | null, childPath: string, protocol: string = 'http'): string {
function getHost() {

@@ -537,25 +446,8 @@ export class MapService implements IMapService {
}
}

function specifiedHostOrDefault() {
if (specifiedHost === null) {
return getHost()
}
// Try as a full URL first (e.g. "http://example.com:8080")
try {
const specifiedUrl = new URL(specifiedHost)
if (specifiedUrl.host) return specifiedUrl.host
} catch {}
// Try as a bare host or host:port (e.g. "nomad-box:8080", "192.168.1.1:8080", "example.com")
try {
const specifiedUrl = new URL(`http://${specifiedHost}`)
if (specifiedUrl.host) return specifiedUrl.host
} catch {}
return getHost()
}

const host = specifiedHostOrDefault();
const withProtocol = `${protocol}://${host}`
const host = specifiedHost || getHost()
const withProtocol = host.startsWith('http') ? host : `${protocol}://${host}`
const baseUrlPath =
process.env.NODE_ENV === 'production' ? childPath : urlJoin(this.mapStoragePath, childPath)
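The `specifiedHostOrDefault` logic in MapService normalizes a user-supplied host that may be a full URL, a `host:port` pair, or a bare hostname, by attempting `new URL()` twice. A subtle point: `new URL('example.com:8080')` does not throw (WHATWG URL treats `example.com` as a scheme) but yields an empty `host`, which is why the empty-host check matters before falling through to the `http://` prefix attempt. A standalone sketch (the `normalizeHost` name and the explicit `fallback` parameter are mine):

```typescript
// Normalize a host spec that may be a full URL, host:port, or bare hostname.
// Hypothetical free-function version of the method-local helper shown above.
function normalizeHost(specifiedHost: string | null, fallback: string): string {
  if (specifiedHost === null) return fallback
  // Try as a full URL first (e.g. "http://example.com:8080").
  try {
    const u = new URL(specifiedHost)
    // "example.com:8080" parses with "example.com" as the scheme and an
    // empty host, so only accept the result when host is non-empty.
    if (u.host) return u.host
  } catch {}
  // Try as a bare host or host:port by prefixing a scheme.
  try {
    const u = new URL(`http://${specifiedHost}`)
    if (u.host) return u.host
  } catch {}
  return fallback
}
```

All four cases the doc comment lists (null, full URL, host:port, bare hostname) funnel through the same two parse attempts.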
@ -1,7 +1,5 @@
|
|||
import { inject } from '@adonisjs/core'
|
||||
import OpenAI from 'openai'
|
||||
import type { ChatCompletionChunk, ChatCompletionMessageParam } from 'openai/resources/chat/completions.js'
|
||||
import type { Stream } from 'openai/streaming.js'
|
||||
import { ChatRequest, Ollama } from 'ollama'
|
||||
import { NomadOllamaModel } from '../../types/ollama.js'
|
||||
import { FALLBACK_RECOMMENDED_OLLAMA_MODELS } from '../../constants/ollama.js'
|
||||
import fs from 'node:fs/promises'
|
||||
|
|
@ -15,93 +13,51 @@ import Fuse, { IFuseOptions } from 'fuse.js'
|
|||
import { BROADCAST_CHANNELS } from '../../constants/broadcast.js'
|
||||
import env from '#start/env'
|
||||
import { NOMAD_API_DEFAULT_BASE_URL } from '../../constants/misc.js'
|
||||
import KVStore from '#models/kv_store'
|
||||
|
||||
const NOMAD_MODELS_API_PATH = '/api/v1/ollama/models'
|
||||
const MODELS_CACHE_FILE = path.join(process.cwd(), 'storage', 'ollama-models-cache.json')
|
||||
const CACHE_MAX_AGE_MS = 24 * 60 * 60 * 1000 // 24 hours
|
||||
|
||||
export type NomadInstalledModel = {
|
||||
name: string
|
||||
size: number
|
||||
digest?: string
|
||||
details?: Record<string, any>
|
||||
}
|
||||
|
||||
export type NomadChatResponse = {
|
||||
message: { content: string; thinking?: string }
|
||||
done: boolean
|
||||
model: string
|
||||
}
|
||||
|
||||
export type NomadChatStreamChunk = {
|
||||
message: { content: string; thinking?: string }
|
||||
done: boolean
|
||||
}
|
||||
|
||||
type ChatInput = {
|
||||
model: string
|
||||
messages: Array<{ role: 'system' | 'user' | 'assistant'; content: string }>
|
||||
think?: boolean | 'medium'
|
||||
stream?: boolean
|
||||
numCtx?: number
|
||||
}
|
||||
|
||||
@inject()
|
||||
export class OllamaService {
|
||||
private openai: OpenAI | null = null
|
||||
private baseUrl: string | null = null
|
||||
private initPromise: Promise<void> | null = null
|
||||
private isOllamaNative: boolean | null = null
|
||||
private ollama: Ollama | null = null
|
||||
private ollamaInitPromise: Promise<void> | null = null
|
||||
|
||||
constructor() { }
|
||||
|
||||
private async _initialize() {
|
||||
if (!this.initPromise) {
|
||||
this.initPromise = (async () => {
|
||||
// Check KVStore for a custom base URL (remote Ollama, LM Studio, llama.cpp, etc.)
|
||||
const customUrl = (await KVStore.getValue('ai.remoteOllamaUrl')) as string | null
|
||||
if (customUrl && customUrl.trim()) {
|
||||
this.baseUrl = customUrl.trim().replace(/\/$/, '')
|
||||
} else {
|
||||
// Fall back to the local Ollama container managed by Docker
|
||||
private async _initializeOllamaClient() {
|
||||
if (!this.ollamaInitPromise) {
|
||||
this.ollamaInitPromise = (async () => {
|
||||
const dockerService = new (await import('./docker_service.js')).DockerService()
|
||||
const ollamaUrl = await dockerService.getServiceURL(SERVICE_NAMES.OLLAMA)
|
||||
if (!ollamaUrl) {
|
||||
const qdrantUrl = await dockerService.getServiceURL(SERVICE_NAMES.OLLAMA)
|
||||
if (!qdrantUrl) {
|
||||
throw new Error('Ollama service is not installed or running.')
|
||||
}
|
||||
this.baseUrl = ollamaUrl.trim().replace(/\/$/, '')
|
||||
}
|
||||
|
||||
this.openai = new OpenAI({
|
||||
apiKey: 'nomad', // Required by SDK; not validated by Ollama/LM Studio/llama.cpp
|
||||
baseURL: `${this.baseUrl}/v1`,
|
||||
})
|
||||
this.ollama = new Ollama({ host: qdrantUrl })
|
||||
})()
|
||||
}
|
||||
return this.initPromise
|
||||
return this.ollamaInitPromise
|
||||
}
|
||||
|
||||
private async _ensureDependencies() {
|
||||
if (!this.openai) {
|
||||
await this._initialize()
|
||||
if (!this.ollama) {
|
||||
await this._initializeOllamaClient()
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
   * Downloads a model from Ollama with progress tracking. Only works with Ollama backends.
   * Use dispatchModelDownload() for background job processing where possible.
   * Downloads a model from the Ollama service with progress tracking. Where possible,
   * one should dispatch a background job instead of calling this method directly to avoid long blocking.
   * @param model Model name to download
   * @returns Success status and message
   */
  async downloadModel(
    model: string,
    progressCallback?: (percent: number) => void
  ): Promise<{ success: boolean; message: string; retryable?: boolean }> {
  async downloadModel(model: string, progressCallback?: (percent: number) => void): Promise<{ success: boolean; message: string; retryable?: boolean }> {
    try {
      await this._ensureDependencies()
      if (!this.baseUrl) {
        return { success: false, message: 'AI service is not initialized.' }
      if (!this.ollama) {
        throw new Error('Ollama client is not initialized.')
      }

    try {
      // See if model is already installed
      const installedModels = await this.getModels()
      if (installedModels && installedModels.some((m) => m.name === model)) {

@@ -109,49 +65,24 @@ export class OllamaService {
        return { success: true, message: 'Model is already installed.' }
      }

      // Model pulling is an Ollama-only operation. Non-Ollama backends (LM Studio, llama.cpp, etc.)
      // return HTTP 200 for unknown endpoints, so the pull would appear to succeed but do nothing.
      if (this.isOllamaNative === false) {
        logger.warn(
          `[OllamaService] Non-Ollama backend detected — skipping model pull for "${model}". Load the model manually in your AI host.`
        )
        return {
          success: false,
          message: `Model "${model}" is not available in your AI host. Please load it manually (model pulling is only supported for Ollama backends).`,
        }
      }

      // Stream pull via Ollama native API
      const pullResponse = await axios.post(
        `${this.baseUrl}/api/pull`,
        { model, stream: true },
        { responseType: 'stream', timeout: 0 }
      )

      await new Promise<void>((resolve, reject) => {
        let buffer = ''
        pullResponse.data.on('data', (chunk: Buffer) => {
          buffer += chunk.toString()
          const lines = buffer.split('\n')
          buffer = lines.pop() || ''
          for (const line of lines) {
            if (!line.trim()) continue
            try {
              const parsed = JSON.parse(line)
              if (parsed.completed && parsed.total) {
                const percent = parseFloat(((parsed.completed / parsed.total) * 100).toFixed(2))
                this.broadcastDownloadProgress(model, percent)
                if (progressCallback) progressCallback(percent)
              }
            } catch {
              // ignore parse errors on partial lines
            }
          }
        })
        pullResponse.data.on('end', resolve)
        pullResponse.data.on('error', reject)
      // Returns AbortableAsyncIterator<ProgressResponse>
      const downloadStream = await this.ollama.pull({
        model,
        stream: true,
      })

      for await (const chunk of downloadStream) {
        if (chunk.completed && chunk.total) {
          const percent = ((chunk.completed / chunk.total) * 100).toFixed(2)
          const percentNum = parseFloat(percent)

          this.broadcastDownloadProgress(model, percentNum)
          if (progressCallback) {
            progressCallback(percentNum)
          }
        }
      }

      logger.info(`[OllamaService] Model "${model}" downloaded successfully.`)
      return { success: true, message: 'Model downloaded successfully.' }
    } catch (error) {

@@ -197,257 +128,88 @@ export class OllamaService {
    }
  }

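The axios-based pull path in the hunk above treats Ollama's streaming `/api/pull` response as NDJSON: one JSON object per line, where a network chunk can end mid-line, so the trailing partial line must be carried over to the next chunk. A minimal sketch of that buffer-carry logic, extracted into a standalone function (the `makeNdjsonFeeder` name is illustrative; the split/pop handling matches the diff):

```typescript
type PullProgress = { completed?: number; total?: number }

function makeNdjsonFeeder(onProgress: (percent: number) => void) {
  let buffer = ''
  return function feed(chunk: string): void {
    buffer += chunk
    const lines = buffer.split('\n')
    buffer = lines.pop() || '' // keep the partial trailing line for the next chunk
    for (const line of lines) {
      if (!line.trim()) continue
      try {
        const parsed: PullProgress = JSON.parse(line)
        if (parsed.completed && parsed.total) {
          onProgress(parseFloat(((parsed.completed / parsed.total) * 100).toFixed(2)))
        }
      } catch {
        // ignore parse errors on malformed lines
      }
    }
  }
}
```

A progress record split across two chunks is reassembled before parsing, which is why the pull never mis-reports on chunk boundaries.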
  public async chat(chatRequest: ChatInput): Promise<NomadChatResponse> {
  public async getClient() {
    await this._ensureDependencies()
    if (!this.openai) {
      throw new Error('AI client is not initialized.')
    return this.ollama!
  }

    const params: any = {
      model: chatRequest.model,
      messages: chatRequest.messages as ChatCompletionMessageParam[],
  public async chat(chatRequest: ChatRequest & { stream?: boolean }) {
    await this._ensureDependencies()
    if (!this.ollama) {
      throw new Error('Ollama client is not initialized.')
    }
    return await this.ollama.chat({
      ...chatRequest,
      stream: false,
    }
    if (chatRequest.think) {
      params.think = chatRequest.think
    }
    if (chatRequest.numCtx) {
      params.num_ctx = chatRequest.numCtx
    })
  }

    const response = await this.openai.chat.completions.create(params)
    const choice = response.choices[0]

    return {
      message: {
        content: choice.message.content ?? '',
        thinking: (choice.message as any).thinking ?? undefined,
      },
      done: true,
      model: response.model,
    }
  }

  public async chatStream(chatRequest: ChatInput): Promise<AsyncIterable<NomadChatStreamChunk>> {
  public async chatStream(chatRequest: ChatRequest) {
    await this._ensureDependencies()
    if (!this.openai) {
      throw new Error('AI client is not initialized.')
    if (!this.ollama) {
      throw new Error('Ollama client is not initialized.')
    }

    const params: any = {
      model: chatRequest.model,
      messages: chatRequest.messages as ChatCompletionMessageParam[],
    return await this.ollama.chat({
      ...chatRequest,
      stream: true,
    }
    if (chatRequest.think) {
      params.think = chatRequest.think
    }
    if (chatRequest.numCtx) {
      params.num_ctx = chatRequest.numCtx
    }

    const stream = (await this.openai.chat.completions.create(params)) as unknown as Stream<ChatCompletionChunk>

    // Returns how many trailing chars of `text` could be the start of `tag`
    function partialTagSuffix(tag: string, text: string): number {
      for (let len = Math.min(tag.length - 1, text.length); len >= 1; len--) {
        if (text.endsWith(tag.slice(0, len))) return len
      }
      return 0
    }

    async function* normalize(): AsyncGenerator<NomadChatStreamChunk> {
      // Stateful parser for <think>...</think> tags that may be split across chunks.
      // Ollama provides thinking natively via delta.thinking; OpenAI-compatible backends
      // (LM Studio, llama.cpp, etc.) embed them inline in delta.content.
      let tagBuffer = ''
      let inThink = false

      for await (const chunk of stream) {
        const delta = chunk.choices[0]?.delta
        const nativeThinking: string = (delta as any)?.thinking ?? ''
        const rawContent: string = delta?.content ?? ''

        // Parse <think> tags out of the content stream
        tagBuffer += rawContent
        let parsedContent = ''
        let parsedThinking = ''

        while (tagBuffer.length > 0) {
          if (inThink) {
            const closeIdx = tagBuffer.indexOf('</think>')
            if (closeIdx !== -1) {
              parsedThinking += tagBuffer.slice(0, closeIdx)
              tagBuffer = tagBuffer.slice(closeIdx + 8)
              inThink = false
            } else {
              const hold = partialTagSuffix('</think>', tagBuffer)
              parsedThinking += tagBuffer.slice(0, tagBuffer.length - hold)
              tagBuffer = tagBuffer.slice(tagBuffer.length - hold)
              break
            }
          } else {
            const openIdx = tagBuffer.indexOf('<think>')
            if (openIdx !== -1) {
              parsedContent += tagBuffer.slice(0, openIdx)
              tagBuffer = tagBuffer.slice(openIdx + 7)
              inThink = true
            } else {
              const hold = partialTagSuffix('<think>', tagBuffer)
              parsedContent += tagBuffer.slice(0, tagBuffer.length - hold)
              tagBuffer = tagBuffer.slice(tagBuffer.length - hold)
              break
            }
          }
        }

        yield {
          message: {
            content: parsedContent,
            thinking: nativeThinking + parsedThinking,
          },
          done: chunk.choices[0]?.finish_reason !== null && chunk.choices[0]?.finish_reason !== undefined,
        }
      }
    }

    return normalize()
    })
  }
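The stateful `<think>` tag parser carried in the chatStream hunk above is easy to exercise in isolation. Here it is extracted as a pure function over an array of content chunks (the `parseThinkStream` harness is illustrative; `partialTagSuffix` and the slicing logic are copied from the diff). The key behavior is that a tag split across two chunks, like `<thi` + `nk>`, is held back and recognized once the rest arrives:

```typescript
// Returns how many trailing chars of `text` could be the start of `tag`
function partialTagSuffix(tag: string, text: string): number {
  for (let len = Math.min(tag.length - 1, text.length); len >= 1; len--) {
    if (text.endsWith(tag.slice(0, len))) return len
  }
  return 0
}

function parseThinkStream(chunks: string[]): { content: string; thinking: string } {
  let tagBuffer = ''
  let inThink = false
  let content = ''
  let thinking = ''
  for (const raw of chunks) {
    tagBuffer += raw
    while (tagBuffer.length > 0) {
      if (inThink) {
        const closeIdx = tagBuffer.indexOf('</think>')
        if (closeIdx !== -1) {
          thinking += tagBuffer.slice(0, closeIdx)
          tagBuffer = tagBuffer.slice(closeIdx + 8)
          inThink = false
        } else {
          // Hold back any suffix that might be the start of '</think>'
          const hold = partialTagSuffix('</think>', tagBuffer)
          thinking += tagBuffer.slice(0, tagBuffer.length - hold)
          tagBuffer = tagBuffer.slice(tagBuffer.length - hold)
          break
        }
      } else {
        const openIdx = tagBuffer.indexOf('<think>')
        if (openIdx !== -1) {
          content += tagBuffer.slice(0, openIdx)
          tagBuffer = tagBuffer.slice(openIdx + 7)
          inThink = true
        } else {
          const hold = partialTagSuffix('<think>', tagBuffer)
          content += tagBuffer.slice(0, tagBuffer.length - hold)
          tagBuffer = tagBuffer.slice(tagBuffer.length - hold)
          break
        }
      }
    }
  }
  // Flush whatever is still held back at end of stream
  if (tagBuffer) {
    if (inThink) thinking += tagBuffer
    else content += tagBuffer
  }
  return { content, thinking }
}
```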

  public async checkModelHasThinking(modelName: string): Promise<boolean> {
    await this._ensureDependencies()
    if (!this.baseUrl) return false

    try {
      const response = await axios.post(
        `${this.baseUrl}/api/show`,
        { model: modelName },
        { timeout: 5000 }
      )
      return Array.isArray(response.data?.capabilities) && response.data.capabilities.includes('thinking')
    } catch {
      // Non-Ollama backends don't expose /api/show — assume no thinking support
      return false
    }
    if (!this.ollama) {
      throw new Error('Ollama client is not initialized.')
    }

  public async deleteModel(modelName: string): Promise<{ success: boolean; message: string }> {
    await this._ensureDependencies()
    if (!this.baseUrl) {
      return { success: false, message: 'AI service is not initialized.' }
    }

    try {
      await axios.delete(`${this.baseUrl}/api/delete`, {
        data: { model: modelName },
        timeout: 10000,
    const modelInfo = await this.ollama.show({
      model: modelName,
      })
      return { success: true, message: `Model "${modelName}" deleted.` }
    } catch (error) {
      logger.error(
        `[OllamaService] Failed to delete model "${modelName}": ${error instanceof Error ? error.message : error}`
      )
      return { success: false, message: 'Failed to delete model. This may not be an Ollama backend.' }
    }

    return modelInfo.capabilities.includes('thinking')
  }

  /**
   * Generate embeddings for the given input strings.
   * Tries the Ollama native /api/embed endpoint first, falls back to /v1/embeddings.
   */
  public async embed(model: string, input: string[]): Promise<{ embeddings: number[][] }> {
  public async deleteModel(modelName: string) {
    await this._ensureDependencies()
    if (!this.baseUrl || !this.openai) {
      throw new Error('AI service is not initialized.')
    if (!this.ollama) {
      throw new Error('Ollama client is not initialized.')
    }

    try {
      // Prefer Ollama native endpoint (supports batch input natively)
      const response = await axios.post(
        `${this.baseUrl}/api/embed`,
        { model, input },
        { timeout: 60000 }
      )
      // Some backends (e.g. LM Studio) return HTTP 200 for unknown endpoints with an incompatible
      // body — validate explicitly before accepting the result.
      if (!Array.isArray(response.data?.embeddings)) {
        throw new Error('Invalid /api/embed response — missing embeddings array')
      }
      return { embeddings: response.data.embeddings }
    } catch {
      // Fall back to OpenAI-compatible /v1/embeddings
      // Explicitly request float format — some backends (e.g. LM Studio) don't reliably
      // implement the base64 encoding the OpenAI SDK requests by default.
      logger.info('[OllamaService] /api/embed unavailable, falling back to /v1/embeddings')
      const results = await this.openai.embeddings.create({ model, input, encoding_format: 'float' })
      return { embeddings: results.data.map((e) => e.embedding as number[]) }
    }
    return await this.ollama.delete({
      model: modelName,
    })
  }

  public async getModels(includeEmbeddings = false): Promise<NomadInstalledModel[]> {
  public async getModels(includeEmbeddings = false) {
    await this._ensureDependencies()
    if (!this.baseUrl) {
      throw new Error('AI service is not initialized.')
    }

    try {
      // Prefer the Ollama native endpoint which includes size and metadata
      const response = await axios.get(`${this.baseUrl}/api/tags`, { timeout: 5000 })
      // LM Studio returns HTTP 200 for unknown endpoints with an incompatible body — validate explicitly
      if (!Array.isArray(response.data?.models)) {
        throw new Error('Not an Ollama-compatible /api/tags response')
      }
      this.isOllamaNative = true
      const models: NomadInstalledModel[] = response.data.models
      if (includeEmbeddings) return models
      return models.filter((m) => !m.name.includes('embed'))
    } catch {
      // Fall back to the OpenAI-compatible /v1/models endpoint (LM Studio, llama.cpp, etc.)
      this.isOllamaNative = false
      logger.info('[OllamaService] /api/tags unavailable, falling back to /v1/models')
      try {
        const modelList = await this.openai!.models.list()
        const models: NomadInstalledModel[] = modelList.data.map((m) => ({ name: m.id, size: 0 }))
        if (includeEmbeddings) return models
        return models.filter((m) => !m.name.includes('embed'))
      } catch (err) {
        logger.error(
          `[OllamaService] Failed to list models: ${err instanceof Error ? err.message : err}`
        )
        return []
    if (!this.ollama) {
      throw new Error('Ollama client is not initialized.')
    }
    const response = await this.ollama.list()
    if (includeEmbeddings) {
      return response.models
    }
    // Filter out embedding models
    return response.models.filter((model) => !model.name.includes('embed'))
  }

  async getAvailableModels(
    {
      sort,
      recommendedOnly,
      query,
      limit,
      force,
    }: {
      sort?: 'pulls' | 'name'
      recommendedOnly?: boolean
      query: string | null
      limit?: number
      force?: boolean
    } = {
    { sort, recommendedOnly, query, limit, force }: { sort?: 'pulls' | 'name'; recommendedOnly?: boolean, query: string | null, limit?: number, force?: boolean } = {
      sort: 'pulls',
      recommendedOnly: false,
      query: null,
      limit: 15,
    }
  ): Promise<{ models: NomadOllamaModel[]; hasMore: boolean } | null> {
  ): Promise<{ models: NomadOllamaModel[], hasMore: boolean } | null> {
    try {
      const models = await this.retrieveAndRefreshModels(sort, force)
      if (!models) {
        // If we fail to get models from the API, return the fallback recommended models
        logger.warn(
          '[OllamaService] Returning fallback recommended models due to failure in fetching available models'
        )
        return {
          models: FALLBACK_RECOMMENDED_OLLAMA_MODELS,
          hasMore: false,
          hasMore: false
        }
      }

@@ -455,13 +217,15 @@ export class OllamaService {
      const filteredModels = query ? this.fuseSearchModels(models, query) : models
      return {
        models: filteredModels.slice(0, limit || 15),
        hasMore: filteredModels.length > (limit || 15),
        hasMore: filteredModels.length > (limit || 15)
      }
    }

    // If recommendedOnly is true, only return the first three models (if sorted by pulls, these will be the top 3)
    const sortedByPulls = sort === 'pulls' ? models : this.sortModels(models, 'pulls')
    const firstThree = sortedByPulls.slice(0, 3)

    // Only return the first tag of each of these models (should be the most lightweight variant)
    const recommendedModels = firstThree.map((model) => {
      return {
        ...model,

@@ -473,13 +237,13 @@ export class OllamaService {
      const filteredRecommendedModels = this.fuseSearchModels(recommendedModels, query)
      return {
        models: filteredRecommendedModels,
        hasMore: filteredRecommendedModels.length > (limit || 15),
        hasMore: filteredRecommendedModels.length > (limit || 15)
      }
    }

    return {
      models: recommendedModels,
      hasMore: recommendedModels.length > (limit || 15),
      hasMore: recommendedModels.length > (limit || 15)
    }
  } catch (error) {
    logger.error(

@@ -519,6 +283,7 @@ export class OllamaService {

      const rawModels = response.data.models as NomadOllamaModel[]

      // Filter out tags where cloud is truthy, then remove models with no remaining tags
      const noCloud = rawModels
        .map((model) => ({
          ...model,

@@ -530,7 +295,8 @@ export class OllamaService {
      return this.sortModels(noCloud, sort)
    } catch (error) {
      logger.error(
        `[OllamaService] Failed to retrieve models from Nomad API: ${error instanceof Error ? error.message : error}`
        `[OllamaService] Failed to retrieve models from Nomad API: ${error instanceof Error ? error.message : error
        }`
      )
      return null
    }

@@ -556,6 +322,7 @@ export class OllamaService {

      return models
    } catch (error) {
      // Cache doesn't exist or is invalid
      if ((error as NodeJS.ErrnoException).code !== 'ENOENT') {
        logger.warn(
          `[OllamaService] Error reading cache: ${error instanceof Error ? error.message : error}`

@@ -579,6 +346,7 @@ export class OllamaService {

  private sortModels(models: NomadOllamaModel[], sort?: 'pulls' | 'name'): NomadOllamaModel[] {
    if (sort === 'pulls') {
      // Sort by estimated pulls (it should be a string like "1.2K", "500", "4M" etc.)
      models.sort((a, b) => {
        const parsePulls = (pulls: string) => {
          const multiplier = pulls.endsWith('K')

@@ -596,6 +364,8 @@ export class OllamaService {
      models.sort((a, b) => a.name.localeCompare(b.name))
    }

    // Always sort model.tags by the size field in descending order
    // Size is a string like '75GB', '8.5GB', '2GB' etc. Smaller models first
    models.forEach((model) => {
      if (model.tags && Array.isArray(model.tags)) {
        model.tags.sort((a, b) => {

@@ -608,7 +378,7 @@ export class OllamaService {
              ? 1
              : size.endsWith('TB')
                ? 1_000
                : 0
                : 0 // Unknown size format
            return parseFloat(size) * multiplier
          }
          return parseSize(a.size) - parseSize(b.size)
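The two comparators in sortModels() parse human-readable strings: pull counts like "1.2K", "500", "4M" and tag sizes like "8.5GB", "2TB". The hunks above truncate parsePulls mid-expression, so the K/M branches below are an assumption extrapolated from the comment; the GB/TB handling in parseSize is copied from the visible lines (unknown suffixes map to 0):

```typescript
// Parse counts like "1.2K", "500", "4M" into numbers.
// The K/M cutoffs are assumed; the diff elides the full ternary.
function parsePulls(pulls: string): number {
  const multiplier = pulls.endsWith('K') ? 1_000 : pulls.endsWith('M') ? 1_000_000 : 1
  return parseFloat(pulls) * multiplier
}

// Parse sizes like "8.5GB", "2TB" into GB; unknown formats become 0,
// matching the `: 0 // Unknown size format` branch in the diff.
function parseSize(size: string): number {
  const multiplier = size.endsWith('GB') ? 1 : size.endsWith('TB') ? 1_000 : 0
  return parseFloat(size) * multiplier
}
```

Both rely on `parseFloat` stopping at the first non-numeric character, so the suffix never needs to be stripped explicitly.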
@@ -641,11 +411,11 @@ export class OllamaService {
    const options: IFuseOptions<NomadOllamaModel> = {
      ignoreDiacritics: true,
      keys: ['name', 'description', 'tags.name'],
      threshold: 0.3,
      threshold: 0.3, // lower threshold for stricter matching
    }

    const fuse = new Fuse(models, options)

    return fuse.search(query).map((result) => result.item)
    return fuse.search(query).map(result => result.item)
  }
}

@@ -8,8 +8,6 @@ import { deleteFileIfExists, determineFileType, getFile, getFileStatsIfExists, l
import { PDFParse } from 'pdf-parse'
import { createWorker } from 'tesseract.js'
import { fromBuffer } from 'pdf2pic'
import JSZip from 'jszip'
import * as cheerio from 'cheerio'
import { OllamaService } from './ollama_service.js'
import { SERVICE_NAMES } from '../../constants/service_names.js'
import { removeStopwords } from 'stopword'

@@ -25,18 +23,15 @@ export class RagService {
  private qdrant: QdrantClient | null = null
  private qdrantInitPromise: Promise<void> | null = null
  private embeddingModelVerified = false
  private resolvedEmbeddingModel: string | null = null
  public static UPLOADS_STORAGE_PATH = 'storage/kb_uploads'
  public static CONTENT_COLLECTION_NAME = 'nomad_knowledge_base'
  public static EMBEDDING_MODEL = 'nomic-embed-text:v1.5'
  public static EMBEDDING_DIMENSION = 768 // Nomic Embed Text v1.5 dimension is 768
  public static MODEL_CONTEXT_LENGTH = 2048 // nomic-embed-text has 2K token context
  public static MAX_SAFE_TOKENS = 1600 // Leave buffer for prefix and tokenization variance
  public static TARGET_TOKENS_PER_CHUNK = 1500 // Target 1500 tokens per chunk for embedding
  public static MAX_SAFE_TOKENS = 1800 // Leave buffer for prefix and tokenization variance
  public static TARGET_TOKENS_PER_CHUNK = 1700 // Target 1700 tokens per chunk for embedding
  public static PREFIX_TOKEN_BUDGET = 10 // Reserve ~10 tokens for prefixes
  public static CHAR_TO_TOKEN_RATIO = 2 // Conservative chars-per-token estimate; technical docs
                                        // (numbers, symbols, abbreviations) tokenize denser
                                        // than plain prose (~3), so 2 avoids context overflows
  public static CHAR_TO_TOKEN_RATIO = 3 // Approximate chars per token
  // Nomic Embed Text v1.5 uses task-specific prefixes for optimal performance
  public static SEARCH_DOCUMENT_PREFIX = 'search_document: '
  public static SEARCH_QUERY_PREFIX = 'search_query: '

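The chunking constants above trade off context length against a safety margin, and one side of the diff tightens them (1600/1500 tokens at 2 chars/token instead of 1800/1700 at 3). A sketch of the character budget the tighter settings imply, using the constants from the diff (the `maxChunkChars` helper is illustrative, not a method in the service):

```typescript
const MODEL_CONTEXT_LENGTH = 2048    // nomic-embed-text context window, in tokens
const MAX_SAFE_TOKENS = 1600         // headroom for prefix and tokenizer variance
const TARGET_TOKENS_PER_CHUNK = 1500
const PREFIX_TOKEN_BUDGET = 10       // reserved for 'search_document: ' etc.
const CHAR_TO_TOKEN_RATIO = 2        // conservative: technical text tokenizes densely

// Characters of source text we can safely embed per chunk
// after reserving the prefix budget
function maxChunkChars(): number {
  return (TARGET_TOKENS_PER_CHUNK - PREFIX_TOKEN_BUDGET) * CHAR_TO_TOKEN_RATIO
}
```

At 2 chars/token the budget is (1500 − 10) × 2 = 2980 characters per chunk, comfortably under the 2048-token window even if the real tokenizer runs denser than the estimate.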
@@ -250,9 +245,7 @@ export class RagService {

    if (!this.embeddingModelVerified) {
      const allModels = await this.ollamaService.getModels(true)
      const embeddingModel =
        allModels.find((model) => model.name === RagService.EMBEDDING_MODEL) ??
        allModels.find((model) => model.name.toLowerCase().includes('nomic-embed-text'))
      const embeddingModel = allModels.find((model) => model.name === RagService.EMBEDDING_MODEL)

      if (!embeddingModel) {
        try {

@@ -269,7 +262,6 @@ export class RagService {
          return null
        }
      }
      this.resolvedEmbeddingModel = embeddingModel?.name ?? RagService.EMBEDDING_MODEL
      this.embeddingModelVerified = true
    }

@@ -293,6 +285,8 @@ export class RagService {
    // Extract text from chunk results
    const chunks = chunkResults.map((chunk) => chunk.text)

    const ollamaClient = await this.ollamaService.getClient()

    // Prepare all chunk texts with prefix and truncation
    const prefixedChunks: string[] = []
    for (let i = 0; i < chunks.length; i++) {

@@ -326,7 +320,10 @@ export class RagService {

      logger.debug(`[RAG] Embedding batch ${batchIdx + 1}/${totalBatches} (${batch.length} chunks)`)

      const response = await this.ollamaService.embed(this.resolvedEmbeddingModel ?? RagService.EMBEDDING_MODEL, batch)
      const response = await ollamaClient.embed({
        model: RagService.EMBEDDING_MODEL,
        input: batch,
      })

      embeddings.push(...response.embeddings)

@@ -567,86 +564,6 @@ export class RagService {
      return await this.extractTXTText(fileBuffer)
  }

  /**
   * Extract text content from an EPUB file.
   * EPUBs are ZIP archives containing XHTML content files.
   * Reads the OPF manifest to determine reading order, then extracts
   * text from each content document in sequence.
   */
  private async processEPUBFile(fileBuffer: Buffer): Promise<string> {
    const zip = await JSZip.loadAsync(fileBuffer)

    // Read container.xml to find the OPF file path
    const containerXml = await zip.file('META-INF/container.xml')?.async('text')
    if (!containerXml) {
      throw new Error('Invalid EPUB: missing META-INF/container.xml')
    }

    // Parse container.xml to get the OPF rootfile path
    const $container = cheerio.load(containerXml, { xml: true })
    const opfPath = $container('rootfile').attr('full-path')
    if (!opfPath) {
      throw new Error('Invalid EPUB: no rootfile found in container.xml')
    }

    // Determine the base directory of the OPF file for resolving relative paths
    const opfDir = opfPath.includes('/') ? opfPath.substring(0, opfPath.lastIndexOf('/') + 1) : ''

    // Read and parse the OPF file
    const opfContent = await zip.file(opfPath)?.async('text')
    if (!opfContent) {
      throw new Error(`Invalid EPUB: OPF file not found at ${opfPath}`)
    }

    const $opf = cheerio.load(opfContent, { xml: true })

    // Build a map of manifest items (id -> href)
    const manifestItems = new Map<string, string>()
    $opf('manifest item').each((_, el) => {
      const id = $opf(el).attr('id')
      const href = $opf(el).attr('href')
      const mediaType = $opf(el).attr('media-type') || ''
      // Only include XHTML/HTML content documents
      if (id && href && (mediaType.includes('html') || mediaType.includes('xml'))) {
        manifestItems.set(id, href)
      }
    })

    // Get the reading order from the spine
    const spineOrder: string[] = []
    $opf('spine itemref').each((_, el) => {
      const idref = $opf(el).attr('idref')
      if (idref && manifestItems.has(idref)) {
        spineOrder.push(manifestItems.get(idref)!)
      }
    })

    // If no spine found, fall back to all manifest items
    const contentFiles = spineOrder.length > 0
      ? spineOrder
      : Array.from(manifestItems.values())

    // Extract text from each content file in order
    const textParts: string[] = []
    for (const href of contentFiles) {
      const fullPath = opfDir + href
      const content = await zip.file(fullPath)?.async('text')
      if (content) {
        const $ = cheerio.load(content)
        // Remove script and style elements
        $('script, style').remove()
        const text = $('body').text().trim()
        if (text) {
          textParts.push(text)
        }
      }
    }

    const fullText = textParts.join('\n\n')
    logger.debug(`[RAG] EPUB extracted ${textParts.length} chapters, ${fullText.length} characters total`)
    return fullText
  }

  private async embedTextAndCleanup(
    extractedText: string,
    filepath: string,

@@ -721,9 +638,6 @@ export class RagService {
      case 'pdf':
        extractedText = await this.processPDFFile(fileBuffer!)
        break
      case 'epub':
        extractedText = await this.processEPUBFile(fileBuffer!)
        break
      case 'text':
      default:
        extractedText = await this.processTextFile(fileBuffer!)

@@ -778,9 +692,7 @@ export class RagService {

    if (!this.embeddingModelVerified) {
      const allModels = await this.ollamaService.getModels(true)
      const embeddingModel =
        allModels.find((model) => model.name === RagService.EMBEDDING_MODEL) ??
        allModels.find((model) => model.name.toLowerCase().includes('nomic-embed-text'))
      const embeddingModel = allModels.find((model) => model.name === RagService.EMBEDDING_MODEL)

      if (!embeddingModel) {
        logger.warn(

@@ -789,7 +701,6 @@ export class RagService {
        this.embeddingModelVerified = false
        return []
      }
      this.resolvedEmbeddingModel = embeddingModel.name
      this.embeddingModelVerified = true
    }

@@ -799,6 +710,8 @@ export class RagService {
    logger.debug(`[RAG] Extracted keywords: [${keywords.join(', ')}]`)

    // Generate embedding for the query with search_query prefix
    const ollamaClient = await this.ollamaService.getClient()

    // Ensure query doesn't exceed token limit
    const prefixTokens = this.estimateTokenCount(RagService.SEARCH_QUERY_PREFIX)
    const maxQueryTokens = RagService.MAX_SAFE_TOKENS - prefixTokens

@@ -816,7 +729,10 @@ export class RagService {
      return []
    }

    const response = await this.ollamaService.embed(this.resolvedEmbeddingModel ?? RagService.EMBEDDING_MODEL, [prefixedQuery])
    const response = await ollamaClient.embed({
      model: RagService.EMBEDDING_MODEL,
      input: [prefixedQuery],
    })

    // Perform semantic search with a higher limit to enable reranking
    const searchLimit = limit * 3 // Get more results for reranking

@@ -4,22 +4,17 @@ import { DockerService } from '#services/docker_service'
import { ServiceSlim } from '../../types/services.js'
import logger from '@adonisjs/core/services/logger'
import si from 'systeminformation'
import {
  GpuHealthStatus,
  NomadDiskInfo,
  NomadDiskInfoRaw,
  SystemInformationResponse,
} from '../../types/system.js'
import { GpuHealthStatus, NomadDiskInfo, NomadDiskInfoRaw, SystemInformationResponse } from '../../types/system.js'
import { SERVICE_NAMES } from '../../constants/service_names.js'
import { readFileSync } from 'node:fs'
import path, { join } from 'node:path'
import { readFileSync } from 'fs'
import path, { join } from 'path'
import { getAllFilesystems, getFile } from '../utils/fs.js'
import axios from 'axios'
import env from '#start/env'
import KVStore from '#models/kv_store'
import { KV_STORE_SCHEMA, KVStoreKey } from '../../types/kv_store.js'
import { isNewerVersion } from '../utils/version.js'
import { invalidateAssistantNameCache } from '../../config/inertia.js'


@inject()
export class SystemService {

@@ -29,8 +24,8 @@ export class SystemService {
  constructor(private dockerService: DockerService) { }

  async checkServiceInstalled(serviceName: string): Promise<boolean> {
    const services = await this.getServices({ installedOnly: true })
    return services.some((service) => service.service_name === serviceName)
    const services = await this.getServices({ installedOnly: true });
    return services.some(service => service.service_name === serviceName);
  }

  async getInternetStatus(): Promise<boolean> {

@@ -72,20 +67,14 @@ export class SystemService {
    return false
  }

  async getNvidiaSmiInfo(): Promise<
    | Array<{ vendor: string; model: string; vram: number }>
    | { error: string }
    | 'OLLAMA_NOT_FOUND'
    | 'BAD_RESPONSE'
    | 'UNKNOWN_ERROR'
  > {
  async getNvidiaSmiInfo(): Promise<Array<{ vendor: string; model: string; vram: number; }> | { error: string } | 'OLLAMA_NOT_FOUND' | 'BAD_RESPONSE' | 'UNKNOWN_ERROR'> {
    try {
      const containers = await this.dockerService.docker.listContainers({ all: false })
      const ollamaContainer = containers.find((c) => c.Names.includes(`/${SERVICE_NAMES.OLLAMA}`))
      if (!ollamaContainer) {
        logger.info(
          'Ollama container not found for nvidia-smi info retrieval. This is expected if Ollama is not installed.'
      const ollamaContainer = containers.find((c) =>
        c.Names.includes(`/${SERVICE_NAMES.OLLAMA}`)
      )
      if (!ollamaContainer) {
        logger.info('Ollama container not found for nvidia-smi info retrieval. This is expected if Ollama is not installed.')
        return 'OLLAMA_NOT_FOUND'
      }

@@ -103,35 +92,23 @@ export class SystemService {
      const output = await new Promise<string>((resolve) => {
        let data = ''
        const timeout = setTimeout(() => resolve(data), 5000)
        stream.on('data', (chunk: Buffer) => {
          data += chunk.toString()
        })
        stream.on('end', () => {
          clearTimeout(timeout)
          resolve(data)
        })
        stream.on('data', (chunk: Buffer) => { data += chunk.toString() })
        stream.on('end', () => { clearTimeout(timeout); resolve(data) })
      })

      // Remove any non-printable characters and trim the output
      const cleaned = Array.from(output)
        .filter((character) => character.charCodeAt(0) > 8)
        .join('')
        .trim()
      if (
        cleaned &&
        !cleaned.toLowerCase().includes('error') &&
        !cleaned.toLowerCase().includes('not found')
      ) {
      const cleaned = output.replace(/[\x00-\x08]/g, '').trim()
      if (cleaned && !cleaned.toLowerCase().includes('error') && !cleaned.toLowerCase().includes('not found')) {
        // Split by newlines to handle multiple GPUs installed
        const lines = cleaned.split('\n').filter((line) => line.trim())
        const lines = cleaned.split('\n').filter(line => line.trim())

        // Map each line out to a useful structure for us
        const gpus = lines.map((line) => {
        const gpus = lines.map(line => {
          const parts = line.split(',').map((s) => s.trim())
          return {
            vendor: 'NVIDIA',
            model: parts[0] || 'NVIDIA GPU',
            vram: parts[1] ? Number.parseInt(parts[1], 10) : 0,
            vram: parts[1] ? parseInt(parts[1], 10) : 0,
          }
        })

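The mapping above turns each line of nvidia-smi output into a `{ vendor, model, vram }` record. The hunk elides the actual exec invocation, so the exact query flags are an assumption; the per-line split/trim/parseInt handling below is copied from the diff. With a typical `--query-gpu=name,memory.total --format=csv,noheader,nounits` style invocation, each GPU is one comma-separated line:

```typescript
// Parse one CSV line of nvidia-smi output into the structure used above.
// The flag set producing this line format is assumed; the parsing matches the diff.
function parseNvidiaSmiLine(line: string): { vendor: string; model: string; vram: number } {
  const parts = line.split(',').map((s) => s.trim())
  return {
    vendor: 'NVIDIA',
    model: parts[0] || 'NVIDIA GPU',
    vram: parts[1] ? Number.parseInt(parts[1], 10) : 0,
  }
}
```

Missing fields degrade gracefully: an empty model falls back to the generic label and a missing memory column yields 0 MiB rather than NaN.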
@@ -140,7 +117,8 @@ export class SystemService {

      // If we got output but looks like an error, consider it a bad response from nvidia-smi
      return 'BAD_RESPONSE'
    } catch (error) {
    }
    catch (error) {
      logger.error('Error getting nvidia-smi info:', error)
      if (error instanceof Error && error.message) {
        return { error: error.message }

@@ -149,65 +127,8 @@ export class SystemService {
    }
  }

  async getExternalOllamaGpuInfo(): Promise<Array<{
    vendor: string
    model: string
    vram: number
  }> | null> {
    try {
      // If a remote Ollama URL is configured, use it directly without requiring a local container
      const remoteOllamaUrl = await KVStore.getValue('ai.remoteOllamaUrl')
      if (!remoteOllamaUrl) {
        const containers = await this.dockerService.docker.listContainers({ all: false })
        const ollamaContainer = containers.find((c) => c.Names.includes(`/${SERVICE_NAMES.OLLAMA}`))
        if (!ollamaContainer) {
          return null
        }

        const actualImage = (ollamaContainer.Image || '').toLowerCase()
        if (actualImage.includes('ollama/ollama') || actualImage.startsWith('ollama:')) {
          return null
        }
      }

      const ollamaUrl = remoteOllamaUrl || (await this.dockerService.getServiceURL(SERVICE_NAMES.OLLAMA))
      if (!ollamaUrl) {
        return null
      }

      await axios.get(new URL('/api/tags', ollamaUrl).toString(), { timeout: 3000 })

      let vramMb = 0
      try {
        const psResponse = await axios.get(new URL('/api/ps', ollamaUrl).toString(), {
          timeout: 3000,
        })
        const loadedModels = Array.isArray(psResponse.data?.models) ? psResponse.data.models : []
        const largestAllocation = loadedModels.reduce(
          (max: number, model: { size_vram?: number | string }) =>
            Math.max(max, Number(model.size_vram) || 0),
          0
        )
        vramMb = largestAllocation > 0 ? Math.round(largestAllocation / (1024 * 1024)) : 0
      } catch {}

      return [
        {
          vendor: 'NVIDIA',
          model: 'NVIDIA GPU (external Ollama)',
          vram: vramMb,
        },
      ]
    } catch (error) {
      logger.info(
        `[SystemService] External Ollama GPU probe failed: ${error instanceof Error ? error.message : error}`
      )
      return null
    }
  }

  async getServices({ installedOnly = true }: { installedOnly?: boolean }): Promise<ServiceSlim[]> {
    const statuses = await this._syncContainersWithDatabase() // Sync and reuse the fetched status list
    await this._syncContainersWithDatabase() // Sync up before fetching to ensure we have the latest status

    const query = Service.query()
      .orderBy('display_order', 'asc')

@@ -236,6 +157,8 @@ export class SystemService {
      return []
    }

    const statuses = await this.dockerService.getServicesStatus()

    const toReturn: ServiceSlim[] = []

    for (const service of services) {

@@ -350,46 +273,17 @@ export class SystemService {
      graphics.controllers = nvidiaInfo.map((gpu) => ({
        model: gpu.model,
|
||||
vendor: gpu.vendor,
|
||||
bus: '',
|
||||
bus: "",
|
||||
vram: gpu.vram,
|
||||
vramDynamic: false, // assume false here, we don't actually use this field for our purposes.
|
||||
}))
|
||||
gpuHealth.status = 'ok'
|
||||
gpuHealth.ollamaGpuAccessible = true
|
||||
} else if (nvidiaInfo === 'OLLAMA_NOT_FOUND') {
|
||||
// No local Ollama container — check if a remote Ollama URL is configured
|
||||
const externalOllamaGpu = await this.getExternalOllamaGpuInfo()
|
||||
if (externalOllamaGpu) {
|
||||
graphics.controllers = externalOllamaGpu.map((gpu) => ({
|
||||
model: gpu.model,
|
||||
vendor: gpu.vendor,
|
||||
bus: '',
|
||||
vram: gpu.vram,
|
||||
vramDynamic: false,
|
||||
}))
|
||||
gpuHealth.status = 'ok'
|
||||
gpuHealth.ollamaGpuAccessible = true
|
||||
} else {
|
||||
gpuHealth.status = 'ollama_not_installed'
|
||||
}
|
||||
} else {
|
||||
const externalOllamaGpu = await this.getExternalOllamaGpuInfo()
|
||||
if (externalOllamaGpu) {
|
||||
graphics.controllers = externalOllamaGpu.map((gpu) => ({
|
||||
model: gpu.model,
|
||||
vendor: gpu.vendor,
|
||||
bus: '',
|
||||
vram: gpu.vram,
|
||||
vramDynamic: false,
|
||||
}))
|
||||
gpuHealth.status = 'ok'
|
||||
gpuHealth.ollamaGpuAccessible = true
|
||||
} else {
|
||||
gpuHealth.status = 'passthrough_failed'
|
||||
logger.warn(
|
||||
`NVIDIA runtime detected but GPU passthrough failed: ${typeof nvidiaInfo === 'string' ? nvidiaInfo : JSON.stringify(nvidiaInfo)}`
|
||||
)
|
||||
}
|
||||
logger.warn(`NVIDIA runtime detected but GPU passthrough failed: ${typeof nvidiaInfo === 'string' ? nvidiaInfo : JSON.stringify(nvidiaInfo)}`)
|
||||
}
|
||||
}
|
||||
} else {
|
||||
|
|
@ -462,8 +356,7 @@ export class SystemService {
|
|||
|
||||
logger.info(`Current version: ${currentVersion}, Latest version: ${latestVersion}`)
|
||||
|
||||
const updateAvailable =
|
||||
process.env.NODE_ENV === 'development'
|
||||
const updateAvailable = process.env.NODE_ENV === 'development'
|
||||
? false
|
||||
: isNewerVersion(latestVersion, currentVersion.trim(), earlyAccess)
|
||||
|
||||
|
|
@ -625,21 +518,15 @@ export class SystemService {
|
|||
const k = 1024
|
||||
const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB']
|
||||
const i = Math.floor(Math.log(bytes) / Math.log(k))
|
||||
return Number.parseFloat((bytes / Math.pow(k, i)).toFixed(decimals)) + ' ' + sizes[i]
|
||||
return parseFloat((bytes / Math.pow(k, i)).toFixed(decimals)) + ' ' + sizes[i]
|
||||
}
|
||||
|
||||
async updateSetting(key: KVStoreKey, value: any): Promise<void> {
|
||||
if (
|
||||
(value === '' || value === undefined || value === null) &&
|
||||
KV_STORE_SCHEMA[key] === 'string'
|
||||
) {
|
||||
if ((value === '' || value === undefined || value === null) && KV_STORE_SCHEMA[key] === 'string') {
|
||||
await KVStore.clearValue(key)
|
||||
} else {
|
||||
await KVStore.setValue(key, value)
|
||||
}
|
||||
if (key === 'ai.assistantCustomName') {
|
||||
invalidateAssistantNameCache()
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
|
|
@ -647,9 +534,8 @@ export class SystemService {
|
|||
* It will mark services as not installed if their corresponding containers do not exist, regardless of their running state.
|
||||
* Handles cases where a container might have been manually removed, ensuring the database reflects the actual existence of containers.
|
||||
* Containers that exist but are stopped, paused, or restarting will still be considered installed.
|
||||
* Returns the fetched service status list so callers can reuse it without a second Docker API call.
|
||||
*/
|
||||
private async _syncContainersWithDatabase(): Promise<{ service_name: string; status: string }[]> {
|
||||
private async _syncContainersWithDatabase() {
|
||||
try {
|
||||
const allServices = await Service.all()
|
||||
const serviceStatusList = await this.dockerService.getServicesStatus()
|
||||
|
|
@ -662,11 +548,6 @@ export class SystemService {
|
|||
if (service.installed) {
|
||||
// If marked as installed but container doesn't exist, mark as not installed
|
||||
if (!containerExists) {
|
||||
// Exception: remote Ollama is configured without a local container — don't reset it
|
||||
if (service.service_name === SERVICE_NAMES.OLLAMA) {
|
||||
const remoteUrl = await KVStore.getValue('ai.remoteOllamaUrl')
|
||||
if (remoteUrl) continue
|
||||
}
|
||||
logger.warn(
|
||||
`Service ${service.service_name} is marked as installed but container does not exist. Marking as not installed.`
|
||||
)
|
||||
|
|
@ -686,11 +567,8 @@ export class SystemService {
|
|||
}
|
||||
}
|
||||
}
|
||||
|
||||
return serviceStatusList
|
||||
} catch (error) {
|
||||
logger.error('Error syncing containers with database:', error)
|
||||
return []
|
||||
}
|
||||
}
|
||||
|
||||
|
|
@ -742,4 +620,5 @@ export class SystemService {
|
|||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
|
|
|
|||
|
|
@@ -25,7 +25,6 @@ import InstalledResource from '#models/installed_resource'
import { RunDownloadJob } from '#jobs/run_download_job'
import { SERVICE_NAMES } from '../../constants/service_names.js'
import { CollectionManifestService } from './collection_manifest_service.js'
import { KiwixLibraryService } from './kiwix_library_service.js'
import type { CategoryWithStatus } from '../../types/collections.js'

const ZIM_MIME_TYPES = ['application/x-zim', 'application/x-openzim', 'application/octet-stream']
@@ -138,13 +137,13 @@ export class ZimService {
    }
  }

  async downloadRemote(url: string, metadata?: { title?: string; summary?: string; author?: string; size_bytes?: number }): Promise<{ filename: string; jobId?: string }> {
  async downloadRemote(url: string): Promise<{ filename: string; jobId?: string }> {
    const parsed = new URL(url)
    if (!parsed.pathname.endsWith('.zim')) {
      throw new Error(`Invalid ZIM file URL: ${url}. URL must end with .zim`)
    }

    const existing = await RunDownloadJob.getActiveByUrl(url)
    const existing = await RunDownloadJob.getByUrl(url)
    if (existing) {
      throw new Error('A download for this URL is already in progress')
    }
@@ -171,8 +170,6 @@ export class ZimService {
      allowedMimeTypes: ZIM_MIME_TYPES,
      forceNew: true,
      filetype: 'zim',
      title: metadata?.title,
      totalBytes: metadata?.size_bytes,
      resourceMetadata,
    })

@@ -222,7 +219,7 @@ export class ZimService {
    const downloadFilenames: string[] = []

    for (const resource of toDownload) {
      const existingJob = await RunDownloadJob.getActiveByUrl(resource.url)
      const existingJob = await RunDownloadJob.getByUrl(resource.url)
      if (existingJob) {
        logger.warn(`[ZimService] Download already in progress for ${resource.url}, skipping.`)
        continue
@@ -241,8 +238,6 @@ export class ZimService {
        allowedMimeTypes: ZIM_MIME_TYPES,
        forceNew: true,
        filetype: 'zim',
        title: (resource as any).title || undefined,
        totalBytes: (resource as any).size_mb ? (resource as any).size_mb * 1024 * 1024 : undefined,
        resourceMetadata: {
          resource_id: resource.id,
          version: resource.version,
@@ -262,17 +257,6 @@ export class ZimService {
      }
    }

    // Update the kiwix library XML after all downloaded ZIM files are in place.
    // This covers all ZIM types including Wikipedia. Rebuilding once from disk
    // avoids repeated XML parse/write cycles and reduces the chance of write races
    // when multiple download jobs complete concurrently.
    const kiwixLibraryService = new KiwixLibraryService()
    try {
      await kiwixLibraryService.rebuildFromDisk()
    } catch (err) {
      logger.error('[ZimService] Failed to rebuild kiwix library from disk:', err)
    }

    if (restart) {
      // Check if there are any remaining ZIM download jobs before restarting
      const { QueueService } = await import('./queue_service.js')
@@ -288,9 +272,7 @@ export class ZimService {
      // Filter out completed jobs (progress === 100) to avoid race condition
      // where this job itself is still in the active queue
      const activeIncompleteJobs = activeJobs.filter((job) => {
        const progress = typeof job.progress === 'object' && job.progress !== null
          ? (job.progress as any).percent
          : typeof job.progress === 'number' ? job.progress : 0
        const progress = typeof job.progress === 'number' ? job.progress : 0
        return progress < 100
      })

@@ -301,22 +283,15 @@ export class ZimService {
      if (hasRemainingZimJobs) {
        logger.info('[ZimService] Skipping container restart - more ZIM downloads pending')
      } else {
        // If kiwix is already running in library mode, --monitorLibrary will pick up
        // the XML change automatically — no restart needed.
        const isLegacy = await this.dockerService.isKiwixOnLegacyConfig()
        if (!isLegacy) {
          logger.info('[ZimService] Kiwix is in library mode — XML updated, no container restart needed.')
        } else {
          // Legacy config: restart (affectContainer will trigger migration instead)
          // Restart KIWIX container to pick up new ZIM file
          logger.info('[ZimService] No more ZIM downloads pending - restarting KIWIX container')
          await this.dockerService
            .affectContainer(SERVICE_NAMES.KIWIX, 'restart')
            .catch((error) => {
              logger.error(`[ZimService] Failed to restart KIWIX container:`, error)
              logger.error(`[ZimService] Failed to restart KIWIX container:`, error) // Don't stop the download completion, just log the error.
            })
        }
      }
    }

    // Create InstalledResource entries for downloaded files
    for (const url of urls) {
@@ -372,12 +347,6 @@ export class ZimService {

    await deleteFileIfExists(fullPath)

    // Remove from kiwix library XML so --monitorLibrary stops serving the deleted file
    const kiwixLibraryService = new KiwixLibraryService()
    await kiwixLibraryService.removeBook(fileName).catch((err) => {
      logger.error(`[ZimService] Failed to remove ${fileName} from kiwix library:`, err)
    })

    // Clean up InstalledResource entry
    const parsed = CollectionManifestService.parseZimFilename(fileName)
    if (parsed) {
@@ -489,7 +458,7 @@ export class ZimService {
    }

    // Check if already downloading
    const existingJob = await RunDownloadJob.getActiveByUrl(selectedOption.url)
    const existingJob = await RunDownloadJob.getByUrl(selectedOption.url)
    if (existingJob) {
      return { success: false, message: 'Download already in progress' }
    }
@@ -528,8 +497,6 @@ export class ZimService {
      allowedMimeTypes: ZIM_MIME_TYPES,
      forceNew: true,
      filetype: 'zim',
      title: selectedOption.name,
      totalBytes: selectedOption.size_mb ? selectedOption.size_mb * 1024 * 1024 : undefined,
    })

    if (!result || !result.job) {
@@ -5,7 +5,6 @@ import { createReadStream } from 'fs'
import { LSBlockDevice, NomadDiskInfoRaw } from '../../types/system.js'

export const ZIM_STORAGE_PATH = '/storage/zim'
export const KIWIX_LIBRARY_XML_PATH = '/storage/zim/kiwix-library.xml'

export async function listDirectoryContents(path: string): Promise<FileEntry[]> {
  const entries = await readdir(path, { withFileTypes: true })
@@ -50,7 +49,7 @@ export async function listDirectoryContentsRecursive(path: string): Promise<File
export async function ensureDirectoryExists(path: string): Promise<void> {
  try {
    await stat(path)
  } catch (error: any) {
  } catch (error) {
    if (error.code === 'ENOENT') {
      await mkdir(path, { recursive: true })
    }
@@ -74,7 +73,7 @@ export async function getFile(
      return createReadStream(path)
    }
    return await readFile(path)
  } catch (error: any) {
  } catch (error) {
    if (error.code === 'ENOENT') {
      return null
    }
@@ -91,7 +90,7 @@ export async function getFileStatsIfExists(
      size: stats.size,
      modifiedTime: stats.mtime,
    }
  } catch (error: any) {
  } catch (error) {
    if (error.code === 'ENOENT') {
      return null
    }
@@ -102,7 +101,7 @@ export async function getFileStatsIfExists(
export async function deleteFileIfExists(path: string): Promise<void> {
  try {
    await unlink(path)
  } catch (error: any) {
  } catch (error) {
    if (error.code !== 'ENOENT') {
      throw error
    }
@@ -152,7 +151,7 @@ export function matchesDevice(fsPath: string, deviceName: string): boolean {
  return false
}

export function determineFileType(filename: string): 'image' | 'pdf' | 'text' | 'epub' | 'zim' | 'unknown' {
export function determineFileType(filename: string): 'image' | 'pdf' | 'text' | 'zim' | 'unknown' {
  const ext = path.extname(filename).toLowerCase()
  if (['.jpg', '.jpeg', '.png', '.gif', '.bmp', '.tiff', '.webp'].includes(ext)) {
    return 'image'
@@ -160,8 +159,6 @@ export function determineFileType(filename: string): 'image' | 'pdf' | 'text' |
    return 'pdf'
  } else if (['.txt', '.md', '.docx', '.rtf'].includes(ext)) {
    return 'text'
  } else if (ext === '.epub') {
    return 'epub'
  } else if (ext === '.zim') {
    return 'zim'
  } else {
@@ -22,8 +22,6 @@ export function assertNotPrivateUrl(urlString: string): void {
    /^169\.254\.\d+\.\d+$/, // Link-local / cloud metadata
    /^\[::1\]$/,
    /^\[?fe80:/i, // IPv6 link-local
    /^\[::ffff:/i, // IPv4-mapped IPv6 (e.g. [::ffff:7f00:1] = 127.0.0.1)
    /^\[::\]$/, // IPv6 all-zeros (equivalent to 0.0.0.0)
  ]

  if (blockedPatterns.some((re) => re.test(hostname))) {
@@ -1,9 +1,6 @@
import vine from "@vinejs/vine";
import { SETTINGS_KEYS } from "../../constants/kv_store.js";

export const getSettingSchema = vine.compile(vine.object({
  key: vine.enum(SETTINGS_KEYS),
}))

export const updateSettingSchema = vine.compile(vine.object({
  key: vine.enum(SETTINGS_KEYS),
@@ -61,17 +61,10 @@ export default class QueueWork extends BaseCommand {
      {
        connection: queueConfig.connection,
        concurrency: this.getConcurrencyForQueue(queueName),
        lockDuration: 300000,
        autorun: true,
      }
    )

    // Required to prevent Node from treating BullMQ internal errors as unhandled
    // EventEmitter errors that crash the process.
    worker.on('error', (err) => {
      this.logger.error(`[${queueName}] Worker error: ${err.message}`)
    })

    worker.on('failed', async (job, err) => {
      this.logger.error(`[${queueName}] Job failed: ${job?.id}, Error: ${err.message}`)

@@ -103,15 +96,6 @@ export default class QueueWork extends BaseCommand {
    await CheckUpdateJob.scheduleNightly()
    await CheckServiceUpdatesJob.scheduleNightly()

    // Safety net: log unhandled rejections instead of crashing the worker process.
    // Individual job errors are already caught by BullMQ; this catches anything that
    // escapes (e.g. a fire-and-forget promise in a callback that rejects unexpectedly).
    process.on('unhandledRejection', (reason) => {
      this.logger.error(
        `Unhandled promise rejection in worker process: ${reason instanceof Error ? reason.message : String(reason)}`
      )
    })

    // Graceful shutdown for all workers
    process.on('SIGTERM', async () => {
      this.logger.info('SIGTERM received. Shutting down workers...')
@@ -47,7 +47,7 @@ const bodyParserConfig = defineConfig({
     * Maximum limit of data to parse including all files
     * and fields
     */
    limit: '110mb', // Set to 110MB to allow for some overhead beyond the 100MB file size limit
    limit: '20mb',
    types: ['multipart/form-data'],
  },
})
@@ -13,12 +13,7 @@ const dbConfig = defineConfig({
      user: env.get('DB_USER'),
      password: env.get('DB_PASSWORD'),
      database: env.get('DB_DATABASE'),
      ssl: env.get('DB_SSL') ? {} : false,
    },
    pool: {
      min: 2,
      max: 15,
      acquireTimeoutMillis: 10000, // Fail fast (10s) instead of silently hanging for ~60s
      ssl: env.get('DB_SSL') ?? true, // Default to true
    },
    migrations: {
      naturalSort: true,
@@ -3,12 +3,6 @@ import { SystemService } from '#services/system_service'
import { defineConfig } from '@adonisjs/inertia'
import type { InferSharedProps } from '@adonisjs/inertia/types'

let _assistantNameCache: { value: string; expiresAt: number } | null = null

export function invalidateAssistantNameCache() {
  _assistantNameCache = null
}

const inertiaConfig = defineConfig({
  /**
   * Path to the Edge view that will be used as the root view for Inertia responses
@@ -22,14 +16,8 @@ const inertiaConfig = defineConfig({
    appVersion: () => SystemService.getAppVersion(),
    environment: process.env.NODE_ENV || 'production',
    aiAssistantName: async () => {
      const now = Date.now()
      if (_assistantNameCache && now < _assistantNameCache.expiresAt) {
        return _assistantNameCache.value
      }
      const customName = await KVStore.getValue('ai.assistantCustomName')
      const value = (customName && customName.trim()) ? customName : 'AI Assistant'
      _assistantNameCache = { value, expiresAt: now + 60_000 }
      return value
      return (customName && customName.trim()) ? customName : 'AI Assistant'
    },
  },
@@ -1,2 +0,0 @@
export const KIWIX_LIBRARY_CMD = '--library /data/kiwix-library.xml --monitorLibrary --address=all'
@@ -1,3 +1,3 @@
import { KVStoreKey } from "../types/kv_store.js";

export const SETTINGS_KEYS: KVStoreKey[] = ['chat.suggestionsEnabled', 'chat.lastModel', 'ui.hasVisitedEasySetup', 'ui.theme', 'system.earlyAccess', 'ai.assistantCustomName', 'ai.remoteOllamaUrl', 'ai.ollamaFlashAttention'];
export const SETTINGS_KEYS: KVStoreKey[] = ['chat.suggestionsEnabled', 'chat.lastModel', 'ui.hasVisitedEasySetup', 'ui.theme', 'system.earlyAccess', 'ai.assistantCustomName'];
@@ -1,29 +0,0 @@
import { BaseSchema } from '@adonisjs/lucid/schema'

export default class extends BaseSchema {
  protected tableName = 'services'

  async up() {
    this.defer(async (db) => {
      await db
        .from(this.tableName)
        .where('service_name', 'nomad_kiwix_server')
        .whereRaw('`container_command` LIKE ?', ['%*.zim%'])
        .update({
          container_command: '--library /data/kiwix-library.xml --monitorLibrary --address=all',
        })
    })
  }

  async down() {
    this.defer(async (db) => {
      await db
        .from(this.tableName)
        .where('service_name', 'nomad_kiwix_server')
        .where('container_command', '--library /data/kiwix-library.xml --monitorLibrary --address=all')
        .update({
          container_command: '*.zim --address=all',
        })
    })
  }
}
@@ -1,25 +0,0 @@
import { BaseSchema } from '@adonisjs/lucid/schema'

export default class extends BaseSchema {
  protected tableName = 'map_markers'

  async up() {
    this.schema.createTable(this.tableName, (table) => {
      table.increments('id')
      table.string('name').notNullable()
      table.double('longitude').notNullable()
      table.double('latitude').notNullable()
      table.string('color', 20).notNullable().defaultTo('orange')
      table.string('marker_type', 20).notNullable().defaultTo('pin')
      table.string('route_id').nullable()
      table.integer('route_order').nullable()
      table.text('notes').nullable()
      table.timestamp('created_at')
      table.timestamp('updated_at')
    })
  }

  async down() {
    this.schema.dropTable(this.tableName)
  }
}
@@ -3,7 +3,6 @@ import { BaseSeeder } from '@adonisjs/lucid/seeders'
import { ModelAttributes } from '@adonisjs/lucid/types/model'
import env from '#start/env'
import { SERVICE_NAMES } from '../../constants/service_names.js'
import { KIWIX_LIBRARY_CMD } from '../../constants/kiwix.js'

export default class ServiceSeeder extends BaseSeeder {
  // Use environment variable with fallback to production default
@@ -25,7 +24,7 @@ export default class ServiceSeeder extends BaseSeeder {
      icon: 'IconBooks',
      container_image: 'ghcr.io/kiwix/kiwix-serve:3.8.1',
      source_repo: 'https://github.com/kiwix/kiwix-tools',
      container_command: KIWIX_LIBRARY_CMD,
      container_command: '*.zim --address=all',
      container_config: JSON.stringify({
        HostConfig: {
          RestartPolicy: { Name: 'unless-stopped' },
@@ -1,200 +0,0 @@
# API Reference

N.O.M.A.D. exposes a REST API for all operations. All endpoints are under `/api/` and return JSON.

---

## Conventions

**Base URL:** `http://<your-server>/api`

**Responses:**
- Success responses include `{ "success": true }` and an HTTP 2xx status
- Error responses return the appropriate HTTP status (400, 404, 409, 500) with an error message
- Long-running operations (downloads, benchmarks, embeddings) return 201 or 202 with a job/benchmark ID for polling

**Async pattern:** Submit a job → receive an ID → poll a status endpoint until complete.

---

## Health

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/health` | Returns `{ "status": "ok" }` |

---

## System

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/system/info` | CPU, memory, disk, and platform info |
| GET | `/api/system/internet-status` | Check internet connectivity |
| GET | `/api/system/debug-info` | Detailed debug information |
| GET | `/api/system/latest-version` | Check for the latest N.O.M.A.D. version |
| POST | `/api/system/update` | Trigger a system update |
| GET | `/api/system/update/status` | Get update progress |
| GET | `/api/system/update/logs` | Get update operation logs |
| GET | `/api/system/settings` | Get a setting value (query param: `key`) |
| PATCH | `/api/system/settings` | Update a setting (`{ key, value }`) |
| POST | `/api/system/subscribe-release-notes` | Subscribe an email to release notes |

### Services

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/system/services` | List all services with status |
| POST | `/api/system/services/install` | Install a service |
| POST | `/api/system/services/force-reinstall` | Force reinstall a service |
| POST | `/api/system/services/affect` | Start, stop, or restart a service (body: `{ name, action }`) |
| POST | `/api/system/services/check-updates` | Check for available service updates |
| POST | `/api/system/services/update` | Update a service to a specific version |
| GET | `/api/system/services/:name/available-versions` | List available versions for a service |

---

## AI Chat

### Models

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/ollama/models` | List available models (supports filtering, sorting, pagination) |
| GET | `/api/ollama/installed-models` | List locally installed models |
| POST | `/api/ollama/models` | Download a model (async, returns job) |
| DELETE | `/api/ollama/models` | Delete an installed model |

### Chat

| Method | Path | Description |
|--------|------|-------------|
| POST | `/api/ollama/chat` | Send a chat message. Supports streaming (SSE) and RAG context injection. Body: `{ model, messages, stream?, useRag? }` |
| GET | `/api/chat/suggestions` | Get suggested chat prompts |

### Remote Ollama

| Method | Path | Description |
|--------|------|-------------|
| POST | `/api/ollama/configure-remote` | Configure a remote Ollama or LM Studio instance |
| GET | `/api/ollama/remote-status` | Check remote Ollama connection status |

### Chat Sessions

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/chat/sessions` | List all chat sessions |
| POST | `/api/chat/sessions` | Create a new session |
| GET | `/api/chat/sessions/:id` | Get a session with its messages |
| PUT | `/api/chat/sessions/:id` | Update session metadata (title, etc.) |
| DELETE | `/api/chat/sessions/:id` | Delete a session |
| DELETE | `/api/chat/sessions/all` | Delete all sessions |
| POST | `/api/chat/sessions/:id/messages` | Add a message to a session |

**Streaming:** The `/api/ollama/chat` endpoint supports Server-Sent Events (SSE) when `stream: true` is passed. Connect using `EventSource` or `fetch` with a streaming reader.

---

## Knowledge Base (RAG)

Upload documents to enable AI-powered retrieval during chat.

| Method | Path | Description |
|--------|------|-------------|
| POST | `/api/rag/upload` | Upload a file for embedding (async, 202 response) |
| GET | `/api/rag/files` | List stored RAG files |
| DELETE | `/api/rag/files` | Delete a file (query param: `source`) |
| GET | `/api/rag/active-jobs` | List active embedding jobs |
| GET | `/api/rag/job-status` | Get status for a specific file embedding job |
| GET | `/api/rag/failed-jobs` | List failed embedding jobs |
| DELETE | `/api/rag/failed-jobs` | Clean up failed jobs and delete associated files |
| POST | `/api/rag/sync` | Scan storage and sync database with filesystem |

---

## ZIM Files (Offline Content)

ZIM files provide offline Wikipedia, books, and other content via Kiwix.

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/zim/list` | List locally stored ZIM files |
| GET | `/api/zim/list-remote` | List remote ZIM files (paginated, supports search) |
| GET | `/api/zim/curated-categories` | List curated categories with Essential/Standard/Comprehensive tiers |
| POST | `/api/zim/download-remote` | Download a remote ZIM file (async) |
| POST | `/api/zim/download-category-tier` | Download a full category tier |
| DELETE | `/api/zim/:filename` | Delete a local ZIM file |

### Wikipedia

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/zim/wikipedia` | Get current Wikipedia selection state |
| POST | `/api/zim/wikipedia/select` | Select a Wikipedia edition and tier |

---

## Maps

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/maps/regions` | List available map regions |
| GET | `/api/maps/styles` | Get map styles JSON |
| GET | `/api/maps/curated-collections` | List curated map collections |
| POST | `/api/maps/fetch-latest-collections` | Fetch latest collection metadata from source |
| POST | `/api/maps/download-base-assets` | Download base map assets |
| POST | `/api/maps/download-remote` | Download a remote map file (async) |
| POST | `/api/maps/download-remote-preflight` | Check download size/info before starting |
| POST | `/api/maps/download-collection` | Download an entire collection by slug (async) |
| DELETE | `/api/maps/:filename` | Delete a local map file |

---

## Downloads

Manage background download jobs for maps, ZIM files, and models.

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/downloads/jobs` | List all download jobs |
| GET | `/api/downloads/jobs/:filetype` | List jobs filtered by type (`zim`, `map`, etc.) |
| DELETE | `/api/downloads/jobs/:jobId` | Cancel and remove a download job |

---

## Benchmarks

| Method | Path | Description |
|--------|------|-------------|
| POST | `/api/benchmark/run` | Run a benchmark (`full`, `system`, or `ai`; can be async) |
| POST | `/api/benchmark/run/system` | Run system-only benchmark |
| POST | `/api/benchmark/run/ai` | Run AI-only benchmark |
| GET | `/api/benchmark/status` | Get current benchmark status (`idle` or `running`) |
| GET | `/api/benchmark/results` | Get all benchmark results |
| GET | `/api/benchmark/results/latest` | Get the most recent result |
| GET | `/api/benchmark/results/:id` | Get a specific result |
| POST | `/api/benchmark/submit` | Submit a result to the central repository |
| POST | `/api/benchmark/builder-tag` | Update builder tag metadata for a result |
| GET | `/api/benchmark/comparison` | Get comparison stats from the repository |
| GET | `/api/benchmark/settings` | Get benchmark settings |
| POST | `/api/benchmark/settings` | Update benchmark settings |

---

## Easy Setup & Content Updates

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/easy-setup/curated-categories` | List curated content categories for setup wizard |
| POST | `/api/manifests/refresh` | Refresh manifest caches (`zim_categories`, `maps`, `wikipedia`) |
| POST | `/api/content-updates/check` | Check for available collection updates |
| POST | `/api/content-updates/apply` | Apply a single content update |
| POST | `/api/content-updates/apply-all` | Apply multiple content updates |

---

## Documentation

| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/docs/list` | List all available documentation files |
@@ -10,14 +10,14 @@ If this is your first time using N.O.M.A.D., the Easy Setup wizard will help you

**[Launch Easy Setup →](/easy-setup)**

The wizard walks you through four simple steps:
1. **Capabilities** — Choose what to enable: Information Library, AI Assistant, Education Platform, Maps, Data Tools, and Notes
2. **Maps** — Select geographic regions for offline maps
3. **Content** — Choose curated content collections with Essential, Standard, or Comprehensive tiers

4. **Review** — Confirm your selections and start downloading

Depending on what you selected, downloads may take a while. You can monitor progress in the Settings area, continue using features that are already installed, or leave your server running overnight for large downloads.
@@ -64,7 +64,7 @@ The Education Platform provides complete educational courses that work offline.

### AI Assistant — Built-in Chat

N.O.M.A.D. includes a built-in AI chat interface powered by Ollama. It runs entirely on your server — no internet needed, no data sent anywhere.
@@ -90,7 +90,7 @@ N.O.M.A.D. includes a built-in AI chat interface powered by Ollama. It runs enti

### Knowledge Base — Document-Aware AI

The Knowledge Base lets you upload documents so the AI can reference them when answering your questions. It uses semantic search (RAG via Qdrant) to find relevant information from your uploaded files.
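Under the hood, RAG pipelines typically split each uploaded document into overlapping chunks before computing embeddings, so that retrieval can return passages rather than whole files. The sketch below is illustrative only: the chunk size, overlap, and function name are assumptions, not NOMAD's actual internals.

```typescript
// Minimal sketch of fixed-size chunking with overlap, a common preprocessing
// step before embedding documents for semantic search. The size/overlap
// defaults here are illustrative, not NOMAD's real settings.
function chunkText(text: string, chunkSize = 500, overlap = 50): string[] {
  const chunks: string[] = []
  const step = chunkSize - overlap
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize))
    // Stop once the final chunk reaches the end of the text
    if (start + chunkSize >= text.length) break
  }
  return chunks
}
```

The overlap keeps sentences that straddle a chunk boundary retrievable from at least one chunk.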
@@ -115,7 +115,7 @@ The Knowledge Base lets you upload documents so the AI can reference them when a

### Maps — Offline Navigation

View maps without internet. Download the regions you need before going offline.
@@ -148,7 +148,7 @@ As your needs change, you can add more content anytime:

### Wikipedia Selector

N.O.M.A.D. includes a dedicated Wikipedia content management tool for browsing and downloading Wikipedia packages.
@@ -161,7 +161,7 @@ N.O.M.A.D. includes a dedicated Wikipedia content management tool for browsing a

### System Benchmark

Test your hardware performance and see how your NOMAD build stacks up against the community.

@@ -8,7 +8,7 @@ Your personal offline knowledge server is ready to use.

Think of it as having Wikipedia, Khan Academy, an AI assistant, and offline maps all in one place, running on hardware you control.

## What Can You Do?
@@ -1,44 +1,6 @@
# Release Notes

## Version 1.31.0 - April 3, 2026

### Features
- **AI Assistant**: Added support for remote OpenAI-compatible hosts (e.g. Ollama, LM Studio, etc.) so models can run on separate hardware from the Command Center host. Thanks @hestela for the contribution!
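Remote OpenAI-compatible hosts all accept the standard chat-completions request shape, which is what makes servers like Ollama and LM Studio interchangeable here. A hedged sketch of that payload (the model name, helper, and endpoint below are placeholders, not NOMAD's configuration):

```typescript
// The OpenAI-compatible chat-completions request body. Model name and
// endpoint are illustrative placeholders.
interface ChatMessage { role: 'system' | 'user' | 'assistant'; content: string }

function buildChatRequest(model: string, messages: ChatMessage[], stream = false) {
  return { model, messages, stream }
}

const body = buildChatRequest('llama3', [{ role: 'user', content: 'Hello' }])
// POST this as JSON to <host>/v1/chat/completions on the remote server
```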
- **AI Assistant**: Disabled Ollama Cloud support (not compatible with NOMAD's architecture) and added support for flash_attn to improve performance of compatible models. Thanks @hestela for the contribution!
- **Information Library (Kiwix)**: The Kiwix container is now informed of available ZIM files via an XML library file instead of a glob pattern. This makes ZIM handling much more robust and avoids container startup failures caused by incomplete or corrupt ZIM files in the storage directory. Thanks @jakeaturner for the contribution!
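The advantage of the library-file approach is that only ZIM files known to be complete are ever listed, so a corrupt file in the directory cannot break startup. A rough sketch of generating such a file (the attribute set here is simplified and hypothetical; the real Kiwix library format carries more metadata per book):

```typescript
// Illustrative sketch: build a library XML from an explicit list of
// known-good ZIM files instead of globbing the storage directory.
// Attribute names are simplified assumptions, not the full Kiwix schema.
function buildLibraryXml(books: { id: string; path: string }[]): string {
  const entries = books
    .map((b) => `  <book id="${b.id}" path="${b.path}" />`)
    .join('\n')
  return `<library version="20110515">\n${entries}\n</library>`
}
```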
- **RAG**: Added support for EPUB file embedding into the Knowledge Base. Thanks @arn6694 for the contribution!
- **RAG**: Added support for multiple file uploads (<=5, 100 MB each) to the Knowledge Base. Thanks @jakeaturner for the contribution!
- **Maps**: Added support for customizable location markers on the map with database persistence. Thanks @chriscrosstalk for the contribution!
- **Maps**: The global map file can now be downloaded directly from PMTiles for users who want the full map and/or regions outside of the U.S. that haven't been added to the curated collections yet. Thanks @bgauger for the contribution!
- **Maps**: Added a scale bar to the map viewer with imperial and metric options. Thanks @chriscrosstalk for the contribution!
- **Downloads**: Added rich progress, friendly names, cancellation, and live status updates for active downloads in the UI. Thanks @chriscrosstalk for the contribution!
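The rich progress display derives a transfer speed from successive byte counts and smooths it over the last few samples so the number doesn't jitter. A simplified sketch of that smoothing step (the window size matches the five-sample average used by the download UI; the function name is illustrative):

```typescript
// Smooth instantaneous bytes-per-second samples with a short moving average,
// as the download progress UI does. Returns 0 when there are no samples.
function movingAverage(samples: number[], windowSize = 5): number {
  const window = samples.slice(-windowSize)
  if (window.length === 0) return 0
  return window.reduce((a, b) => a + b, 0) / window.length
}
```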
- **UI**: Converted all PNGs to WEBP for reduced image sizes and improved performance. Thanks @hestela for the contribution!
- **UI**: Added an Installed Models section to AI Assistant settings. Thanks @chriscrosstalk for the contribution!

### Bug Fixes
- **Maps**: The maps API endpoints now properly check for "X-Forwarded-Proto" to support scenarios where the Command Center is behind a reverse proxy that terminates TLS. Thanks @davidgross for the fix!
- **Maps**: Fixed an issue where the maps API endpoints could fail with an internal error if a hostname was used to access the Command Center instead of an IP address or localhost. Thanks @jakeaturner for the fix!
- **Queue**: Increased the BullMQ lockDuration to prevent jobs from being killed prematurely on slower systems. Thanks @bgauger for the contribution!
- **Queue**: Added better handling for very large downloads and user-initiated cancellations. Thanks @bgauger for the contribution!
- **Install**: The install script now checks for the presence of gpg (required for the NVIDIA toolkit install) and automatically attempts to install it if it's missing. Thanks @chriscrosstalk for the fix!
- **Security**: Added key validation to the settings read API endpoint. Thanks @LuisMIguelFurlanettoSousa for the fix!
- **Security**: Improved URL validation logic for ZIM downloads to prevent SSRF vulnerabilities. Thanks @sebastiondev for the fix!
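To illustrate the class of problem an SSRF guard addresses: a download URL supplied by a user must not be allowed to point the server at itself or at hosts on its private network. The sketch below is not NOMAD's actual validation logic, only an example of the shape such a check takes; a production guard would also need to validate the address the hostname resolves to, not just the literal host string.

```typescript
// Illustrative SSRF-style check: reject non-HTTP(S) schemes, loopback hosts,
// and literal private IPv4 ranges. Real validation must also cover DNS
// resolution results; this only inspects the URL as written.
function parseUrl(raw: string): URL | null {
  try {
    return new URL(raw)
  } catch {
    return null
  }
}

function isSafeDownloadUrl(raw: string): boolean {
  const url = parseUrl(raw)
  if (!url) return false
  if (url.protocol !== 'http:' && url.protocol !== 'https:') return false
  const host = url.hostname
  if (host === 'localhost' || host === '127.0.0.1' || host === '::1' || host === '[::1]') return false
  // RFC 1918 private IPv4 ranges: 10/8, 192.168/16, 172.16/12
  if (/^10\.|^192\.168\.|^172\.(1[6-9]|2\d|3[01])\./.test(host)) return false
  return true
}
```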
- **UI**: Fixed the activity feed height in Easy Setup and added automatic scrolling to the latest message during installation. Thanks @chriscrosstalk for the contribution!

### Improvements

- **Dependencies**: Updated various dependencies to close security vulnerabilities and improve stability
- **Docker**: NOMAD now adds 'com.docker.compose.project': 'project-nomad-managed' and 'io.project-nomad.managed': 'true' labels to all containers installed via the Command Center to improve compatibility with other Docker management tools and make it easier to identify and manage NOMAD containers. Thanks @techyogi for the contribution!
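The label set named above is what tooling can key on to recognize NOMAD-managed containers. A small sketch of using it (the helper and constant names are illustrative; the label keys and values are the ones stated in the note):

```typescript
// Labels attached to Command Center-installed containers, per the note above.
const NOMAD_LABELS: Record<string, string> = {
  'com.docker.compose.project': 'project-nomad-managed',
  'io.project-nomad.managed': 'true',
}

// e.g. filter a container listing down to NOMAD-managed containers
function isNomadManaged(labels: Record<string, string>): boolean {
  return labels['io.project-nomad.managed'] === 'true'
}
```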
- **Docs**: Added a simple API reference for power users and developers. Thanks @hestela for the contribution!
- **Docs**: Re-formatted the Quick Install command into multiple lines for better readability in the README. Thanks @samsara-02 for the contribution!
- **Docs**: Updated the CONTRIBUTING and FAQ guides with the latest information and clarified some common questions. Thanks @jakeaturner for the contribution!
- **Ops**: Bumped GitHub Actions to their latest versions. Thanks @salmanmkc for the contribution!
- **Performance**: Shrunk the bundle size of the Command Center UI significantly by optimizing dependencies and tree-shaking, resulting in faster load times and a snappier user experience. Thanks @jakeaturner for the contribution!
- **Performance**: Implemented gzip compression by default for all registered HTTP routes from the Command Center backend to further improve performance, especially on slower connections. The DISABLE_COMPRESSION environment variable can be used to turn off this feature if needed. Thanks @jakeaturner for the contribution!
- **Performance**: Added light caching of certain Docker socket interactions and custom AI Assistant name resolution to improve performance and reduce redundant calls to the Docker API. Thanks @jakeaturner for the contribution!
- **Performance**: Switched to Inertia router navigation calls where appropriate to take advantage of Inertia's built-in caching and performance optimizations for a smoother user experience. Thanks @jakeaturner for the contribution!

## Version 1.30.3 - March 25, 2026
## Unreleased

### Features
@@ -1,8 +1,8 @@
import { useRef, useState, useCallback } from 'react'
import useDownloads, { useDownloadsProps } from '~/hooks/useDownloads'
import { extractFileName, formatBytes } from '~/lib/util'
import HorizontalBarChart from './HorizontalBarChart'
import { extractFileName } from '~/lib/util'
import StyledSectionHeader from './StyledSectionHeader'
import { IconAlertTriangle, IconX, IconLoader2 } from '@tabler/icons-react'
import { IconAlertTriangle, IconX } from '@tabler/icons-react'
import api from '~/lib/api'

interface ActiveDownloadProps {
@@ -10,128 +10,35 @@ interface ActiveDownloadProps {
  withHeader?: boolean
}

function formatSpeed(bytesPerSec: number): string {
  if (bytesPerSec <= 0) return '0 B/s'
  if (bytesPerSec < 1024) return `${Math.round(bytesPerSec)} B/s`
  if (bytesPerSec < 1024 * 1024) return `${(bytesPerSec / 1024).toFixed(1)} KB/s`
  return `${(bytesPerSec / (1024 * 1024)).toFixed(1)} MB/s`
}

type DownloadStatus = 'queued' | 'active' | 'stalled' | 'failed'

function getDownloadStatus(download: {
  progress: number
  lastProgressTime?: number
  status?: string
}): DownloadStatus {
  if (download.status === 'failed') return 'failed'
  if (download.status === 'waiting' || download.status === 'delayed') return 'queued'
  // Fallback heuristic for model jobs and in-flight jobs from before this deploy
  if (download.progress === 0 && !download.lastProgressTime) return 'queued'
  if (download.lastProgressTime) {
    const elapsed = Date.now() - download.lastProgressTime
    if (elapsed > 60_000) return 'stalled'
  }
  return 'active'
}

const ActiveDownloads = ({ filetype, withHeader = false }: ActiveDownloadProps) => {
  const { data: downloads, invalidate } = useDownloads({ filetype })
  const [cancellingJobs, setCancellingJobs] = useState<Set<string>>(new Set())
  const [confirmingCancel, setConfirmingCancel] = useState<string | null>(null)

  // Track previous downloadedBytes for speed calculation
  const prevBytesRef = useRef<Map<string, { bytes: number; time: number }>>(new Map())
  const speedRef = useRef<Map<string, number[]>>(new Map())

  const getSpeed = useCallback(
    (jobId: string, currentBytes?: number): number => {
      if (!currentBytes || currentBytes <= 0) return 0

      const prev = prevBytesRef.current.get(jobId)
      const now = Date.now()

      if (prev && prev.bytes > 0 && currentBytes > prev.bytes) {
        const deltaBytes = currentBytes - prev.bytes
        const deltaSec = (now - prev.time) / 1000
        if (deltaSec > 0) {
          const instantSpeed = deltaBytes / deltaSec

          // Simple moving average (last 5 samples)
          const samples = speedRef.current.get(jobId) || []
          samples.push(instantSpeed)
          if (samples.length > 5) samples.shift()
          speedRef.current.set(jobId, samples)

          const avg = samples.reduce((a, b) => a + b, 0) / samples.length
          prevBytesRef.current.set(jobId, { bytes: currentBytes, time: now })
          return avg
        }
      }

      // Only set initial observation; never advance timestamp when bytes unchanged
      if (!prev) {
        prevBytesRef.current.set(jobId, { bytes: currentBytes, time: now })
      }
      return speedRef.current.get(jobId)?.at(-1) || 0
    },
    []
  )

  const handleDismiss = async (jobId: string) => {
    await api.removeDownloadJob(jobId)
    invalidate()
  }

  const handleCancel = async (jobId: string) => {
    setCancellingJobs((prev) => new Set(prev).add(jobId))
    setConfirmingCancel(null)
    try {
      await api.cancelDownloadJob(jobId)
      // Clean up speed tracking refs
      prevBytesRef.current.delete(jobId)
      speedRef.current.delete(jobId)
    } finally {
      setCancellingJobs((prev) => {
        const next = new Set(prev)
        next.delete(jobId)
        return next
      })
      invalidate()
    }
  }

  return (
    <>
      {withHeader && <StyledSectionHeader title="Active Downloads" className="mt-12 mb-4" />}
      <div className="space-y-4">
        {downloads && downloads.length > 0 ? (
          downloads.map((download) => {
            const filename = extractFileName(download.filepath) || download.url
            const status = getDownloadStatus(download)
            const speed = getSpeed(download.jobId, download.downloadedBytes)
            const isCancelling = cancellingJobs.has(download.jobId)
            const isConfirming = confirmingCancel === download.jobId

            return (
          downloads.map((download) => (
            <div
              key={download.jobId}
              className={`rounded-lg p-4 border shadow-sm hover:shadow-lg transition-shadow ${
                status === 'failed'
                  ? 'bg-surface-primary border-red-300'
                  : 'bg-surface-primary border-default'
              className={`bg-desert-white rounded-lg p-4 border shadow-sm hover:shadow-lg transition-shadow ${
                download.status === 'failed'
                  ? 'border-red-300'
                  : 'border-desert-stone-light'
              }`}
            >
              {status === 'failed' ? (
              {download.status === 'failed' ? (
                <div className="flex items-center gap-2">
                  <IconAlertTriangle className="w-5 h-5 text-red-500 flex-shrink-0" />
                  <div className="flex-1 min-w-0">
                    <p className="text-sm font-medium text-text-primary truncate">
                      {download.title || filename}
                    <p className="text-sm font-medium text-gray-900 truncate">
                      {extractFileName(download.filepath) || download.url}
                    </p>
                    {download.title && (
                      <p className="text-xs text-text-muted truncate">(unknown)</p>
                    )}
                    <p className="text-xs text-red-600 mt-0.5">
                      Download failed{download.failedReason ? `: ${download.failedReason}` : ''}
                    </p>
@@ -145,116 +52,20 @@ const ActiveDownloads = ({ filetype, withHeader = false }: ActiveDownloadProps)
                  </button>
                </div>
              ) : (
                <div className="space-y-2">
                  {/* Title + Cancel button row */}
                  <div className="flex items-start justify-between gap-2">
                    <div className="flex-1 min-w-0">
                      <p className="font-semibold text-desert-green truncate">
                        {download.title || filename}
                      </p>
                      {download.title && (
                        <div className="flex items-center gap-2 mt-0.5">
                          <span className="text-xs text-text-muted truncate font-mono">
                            (unknown)
                          </span>
                          <span className="text-xs px-1.5 py-0.5 rounded bg-desert-stone-lighter text-desert-stone-dark font-mono flex-shrink-0">
                            {download.filetype}
                          </span>
                        </div>
                      )}
                      {!download.title && download.filetype && (
                        <span className="text-xs px-1.5 py-0.5 rounded bg-desert-stone-lighter text-desert-stone-dark font-mono">
                          {download.filetype}
                        </span>
                      )}
                    </div>
                    {isConfirming ? (
                      <div className="flex items-center gap-1 flex-shrink-0">
                        <button
                          onClick={() => handleCancel(download.jobId)}
                          className="text-xs px-2 py-1 rounded bg-red-100 text-red-700 hover:bg-red-200 transition-colors"
                        >
                          Confirm
                        </button>
                        <button
                          onClick={() => setConfirmingCancel(null)}
                          className="text-xs px-2 py-1 rounded bg-desert-stone-lighter text-text-muted hover:bg-desert-stone-light transition-colors"
                        >
                          Keep
                        </button>
                      </div>
                    ) : isCancelling ? (
                      <IconLoader2 className="w-4 h-4 text-text-muted animate-spin flex-shrink-0" />
                    ) : (
                      <button
                        onClick={() => setConfirmingCancel(download.jobId)}
                        className="flex-shrink-0 p-1 rounded hover:bg-red-100 transition-colors"
                        title="Cancel download"
                      >
                        <IconX className="w-4 h-4 text-text-muted hover:text-red-500" />
                      </button>
                    )}
                  </div>

                  {/* Size info */}
                  <div className="flex justify-between items-baseline text-sm text-text-muted font-mono">
                    <span>
                      {download.downloadedBytes && download.totalBytes
                        ? `${formatBytes(download.downloadedBytes, 1)} / ${formatBytes(download.totalBytes, 1)}`
                        : `${download.progress}% / 100%`}
                    </span>
                  </div>

                  {/* Progress bar */}
                  <div className="relative">
                    <div className="h-6 bg-desert-green-lighter bg-opacity-20 rounded-lg border border-default overflow-hidden">
                      <div
                        className="h-full rounded-lg transition-all duration-1000 ease-out bg-desert-green"
                        style={{ width: `${download.progress}%` }}
                      <HorizontalBarChart
                        items={[
                          {
                            label: extractFileName(download.filepath) || download.url,
                            value: download.progress,
                            total: '100%',
                            used: `${download.progress}%`,
                            type: download.filetype,
                          },
                        ]}
                      />
                    </div>
                    <div
                      className={`absolute top-1/2 -translate-y-1/2 font-bold text-xs ${
                        download.progress > 15
                          ? 'left-2 text-white drop-shadow-md'
                          : 'right-2 text-desert-green'
                      }`}
                    >
                      {Math.round(download.progress)}%
                    </div>
                  </div>

                  {/* Status indicator */}
                  <div className="flex items-center gap-2">
                    {status === 'queued' && (
                      <>
                        <div className="w-2 h-2 rounded-full bg-desert-stone" />
                        <span className="text-xs text-text-muted">Waiting...</span>
                      </>
                    )}
                    {status === 'active' && (
                      <>
                        <div className="w-2 h-2 rounded-full bg-green-500 animate-pulse" />
                        <span className="text-xs text-text-muted">
                          Downloading...{speed > 0 ? ` ${formatSpeed(speed)}` : ''}
                        </span>
                      </>
                    )}
                    {status === 'stalled' && download.lastProgressTime && (
                      <>
                        <div className="w-2 h-2 rounded-full bg-orange-500 animate-pulse" />
                        <span className="text-xs text-orange-600">
                          No data received for{' '}
                          {Math.floor((Date.now() - download.lastProgressTime) / 60_000)}m...
                        </span>
                      </>
                    )}
                  </div>
                </div>
              )}
            </div>
          )
          })
          ))
        ) : (
          <p className="text-text-muted">No active downloads</p>
        )}
@@ -1,5 +1,6 @@
import * as Icons from '@tabler/icons-react'
import classNames from '~/lib/classNames'
import DynamicIcon, { DynamicIconName } from './DynamicIcon'
import DynamicIcon from './DynamicIcon'
import StyledButton, { StyledButtonProps } from './StyledButton'

export type AlertProps = React.HTMLAttributes<HTMLDivElement> & {

@@ -9,7 +10,7 @@ export type AlertProps = React.HTMLAttributes<HTMLDivElement> & {
  children?: React.ReactNode
  dismissible?: boolean
  onDismiss?: () => void
  icon?: DynamicIconName
  icon?: keyof typeof Icons
  variant?: 'standard' | 'bordered' | 'solid'
  buttonProps?: StyledButtonProps
}

@@ -26,7 +27,7 @@ export default function Alert({
  buttonProps,
  ...props
}: AlertProps) {
  const getDefaultIcon = (): DynamicIconName => {
  const getDefaultIcon = (): keyof typeof Icons => {
    switch (type) {
      case 'warning':
        return 'IconAlertTriangle'
@@ -31,7 +31,7 @@ const FadingImage = ({ alt = "Fading image", className = "" }) => {
        isVisible ? 'opacity-100' : 'opacity-0'
      }`}>
        <img
          src={`/project_nomad_logo.webp`}
          src={`/project_nomad_logo.png`}
          alt={alt}
          className={`w-64 h-64 ${className}`}
        />
@@ -1,26 +1,36 @@
import classNames from 'classnames'
import { icons } from '../lib/icons'
import * as TablerIcons from '@tabler/icons-react'

export type { DynamicIconName } from '../lib/icons'
export type DynamicIconName = keyof typeof TablerIcons

interface DynamicIconProps {
  icon?: keyof typeof icons
  icon?: DynamicIconName
  className?: string
  stroke?: number
  onClick?: () => void
}

/**
 * Renders a dynamic icon from the TablerIcons library based on the provided icon name.
 * @param icon - The name of the icon to render.
 * @param className - Optional additional CSS classes to apply to the icon.
 * @param stroke - Optional stroke width for the icon.
 * @returns A React element representing the icon, or null if no matching icon is found.
 */
const DynamicIcon: React.FC<DynamicIconProps> = ({ icon, className, stroke, onClick }) => {
  if (!icon) return null

  const Icon = icons[icon]
  const Icon = TablerIcons[icon]

  if (!Icon) {
    console.warn(`Icon "${icon}" not found in icon map.`)
    console.warn(`Icon "${icon}" not found in TablerIcons.`)
    return null
  }

  return <Icon className={classNames('h-5 w-5', className)} strokeWidth={stroke ?? 2} onClick={onClick} />
  return (
    // @ts-ignore
    <Icon className={classNames('h-5 w-5', className)} stroke={stroke || 2} onClick={onClick} />
  )
}

export default DynamicIcon
@@ -1,4 +1,3 @@
import { useEffect, useRef } from 'react'
import { IconCircleCheck, IconCircleX } from '@tabler/icons-react'
import classNames from '~/lib/classNames'

@@ -13,30 +12,16 @@ export type InstallActivityFeedProps = {
      | 'created'
      | 'preinstall'
      | 'preinstall-complete'
      | 'preinstall-error'
      | 'starting'
      | 'started'
      | 'finalizing'
      | 'completed'
      | 'checking-dependencies'
      | 'dependency-installed'
      | 'image-exists'
      | 'gpu-config'
      | 'stopping'
      | 'removing'
      | 'recreating'
      | 'cleanup-warning'
      | 'no-volumes'
      | 'volume-removed'
      | 'volume-cleanup-warning'
      | 'error'
      | 'update-pulling'
      | 'update-stopping'
      | 'update-creating'
      | 'update-starting'
      | 'update-complete'
      | 'update-rollback'
      | (string & {})
    timestamp: string
    message: string
  }>

@@ -45,18 +30,10 @@ export type InstallActivityFeedProps = {
}

const InstallActivityFeed: React.FC<InstallActivityFeedProps> = ({ activity, className, withHeader = false }) => {
  const listRef = useRef<HTMLUListElement>(null)

  useEffect(() => {
    if (listRef.current) {
      listRef.current.scrollTop = listRef.current.scrollHeight
    }
  }, [activity])

  return (
    <div className={classNames('bg-surface-primary shadow-sm rounded-lg p-6', className)}>
      {withHeader && <h2 className="text-lg font-semibold text-text-primary">Installation Activity</h2>}
      <ul ref={listRef} role="list" className={classNames("space-y-6 text-desert-green max-h-[400px] overflow-y-auto scroll-smooth", withHeader ? 'mt-6' : '')}>
      <ul role="list" className={classNames("space-y-6 text-desert-green", withHeader ? 'mt-6' : '')}>
        {activity.map((activityItem, activityItemIdx) => (
          <li key={activityItem.timestamp} className="relative flex gap-x-4">
            <div

@@ -71,7 +48,7 @@ const InstallActivityFeed: React.FC<InstallActivityFeedProps> = ({ activity, cla
            <div className="relative flex size-6 flex-none items-center justify-center bg-transparent">
              {activityItem.type === 'completed' || activityItem.type === 'update-complete' ? (
                <IconCircleCheck aria-hidden="true" className="size-6 text-indigo-600" />
              ) : activityItem.type === 'error' || activityItem.type === 'update-rollback' || activityItem.type === 'preinstall-error' ? (
              ) : activityItem.type === 'update-rollback' ? (
                <IconCircleX aria-hidden="true" className="size-6 text-red-500" />
              ) : (
                <div className="size-1.5 rounded-full bg-surface-secondary ring-1 ring-border-default" />

@@ -79,7 +56,7 @@ const InstallActivityFeed: React.FC<InstallActivityFeedProps> = ({ activity, cla
            </div>
            <p className="flex-auto py-0.5 text-xs/5 text-text-muted">
              <span className="font-semibold text-text-primary">{activityItem.service_name}</span> -{' '}
              {activityItem.message || activityItem.type.charAt(0).toUpperCase() + activityItem.type.slice(1)}
              {activityItem.type.charAt(0).toUpperCase() + activityItem.type.slice(1)}
            </p>
            <time
              dateTime={activityItem.timestamp}
@@ -2,7 +2,7 @@ import { useMemo, useState } from 'react'
import { Dialog, DialogBackdrop, DialogPanel, TransitionChild } from '@headlessui/react'
import classNames from '~/lib/classNames'
import { IconArrowLeft, IconBug } from '@tabler/icons-react'
import { Link, usePage } from '@inertiajs/react'
import { usePage } from '@inertiajs/react'
import { UsePageProps } from '../../types/system'
import { IconMenu2, IconX } from '@tabler/icons-react'
import ThemeToggle from '~/components/ThemeToggle'

@@ -32,29 +32,21 @@ const StyledSidebar: React.FC<StyledSidebarProps> = ({ title, items }) => {
  }, [])

  const ListItem = (item: SidebarItem) => {
    const className = classNames(
    return (
      <li key={item.name}>
        <a
          href={item.href}
          target={item.target}
          className={classNames(
      item.current
        ? 'bg-desert-green text-white'
        : 'text-text-primary hover:bg-desert-green-light hover:text-white',
      'group flex gap-x-3 rounded-md p-2 text-sm/6 font-semibold'
    )
    const content = (
      <>
          )}
        >
          {item.icon && <item.icon aria-hidden="true" className="size-6 shrink-0" />}
          {item.name}
      </>
    )
    return (
      <li key={item.name}>
        {item.target === '_blank' ? (
          <a href={item.href} target="_blank" rel="noopener noreferrer" className={className}>
            {content}
          </a>
        ) : (
          <Link href={item.href} className={className}>
            {content}
          </Link>
        )}
      </li>
    )
  }

@@ -63,7 +55,7 @@ const StyledSidebar: React.FC<StyledSidebarProps> = ({ title, items }) => {
  return (
    <div className="flex grow flex-col gap-y-5 overflow-y-auto bg-desert-sand px-6 ring-1 ring-white/5 pt-4 shadow-md">
      <div className="flex h-16 shrink-0 items-center">
        <img src="/project_nomad_logo.webp" alt="Project Nomad Logo" className="h-16 w-16" />
        <img src="/project_nomad_logo.png" alt="Project Nomad Logo" className="h-16 w-16" />
        <h1 className="ml-3 text-xl font-semibold text-text-primary">{title}</h1>
      </div>
      <nav className="flex flex-1 flex-col">

@@ -74,13 +66,13 @@ const StyledSidebar: React.FC<StyledSidebarProps> = ({ title, items }) => {
            <ListItem key={item.name} {...item} current={currentPath === item.href} />
          ))}
          <li className="ml-2 mt-4">
            <Link
            <a
              href="/home"
              className="flex flex-row items-center gap-x-3 text-desert-green text-sm font-semibold"
            >
              <IconArrowLeft aria-hidden="true" className="size-6 shrink-0" />
              Back to Home
            </Link>
            </a>
          </li>
        </ul>
      </li>
@@ -213,7 +213,7 @@ export default function ChatInterface({
            <p className="text-text-primary">
              This will dispatch a background download job for{' '}
              <span className="font-mono font-medium">{DEFAULT_QUERY_REWRITE_MODEL}</span> and may take some time to complete. The model
              will be used to rewrite queries for improved RAG retrieval performance. Note that download is only supported when using Ollama. If using an OpenAI API interface, please download the model with that software.
              will be used to rewrite queries for improved RAG retrieval performance.
            </p>
          </StyledModal>
        </div>
@@ -89,7 +89,7 @@ export default function ChatSidebar({
        )}
      </div>
      <div className="p-4 flex flex-col items-center justify-center gap-y-2">
        <img src="/project_nomad_logo.webp" alt="Project Nomad Logo" className="h-28 w-28 mb-6" />
        <img src="/project_nomad_logo.png" alt="Project Nomad Logo" className="h-28 w-28 mb-6" />
        <StyledButton
          onClick={() => {
            if (isInModal) {
@@ -24,7 +24,6 @@ function sourceToDisplayName(source: string): string {
export default function KnowledgeBaseModal({ aiAssistantName = "AI Assistant", onClose }: KnowledgeBaseModalProps) {
const { addNotification } = useNotifications()
const [files, setFiles] = useState<File[]>([])
const [isUploading, setIsUploading] = useState(false)
const [confirmDeleteSource, setConfirmDeleteSource] = useState<string | null>(null)
const fileUploaderRef = useRef<React.ComponentRef<typeof FileUploader>>(null)
const { openModal, closeModal } = useModals()
@@ -38,6 +37,22 @@ export default function KnowledgeBaseModal({ aiAssistantName = "AI Assistant", o

const uploadMutation = useMutation({
mutationFn: (file: File) => api.uploadDocument(file),
onSuccess: (data) => {
addNotification({
type: 'success',
message: data?.message || 'Document uploaded and queued for processing',
})
setFiles([])
if (fileUploaderRef.current) {
fileUploaderRef.current.clear()
}
},
onError: (error: any) => {
addNotification({
type: 'error',
message: error?.message || 'Failed to upload document',
})
},
})

const deleteMutation = useMutation({
@@ -53,17 +68,6 @@ export default function KnowledgeBaseModal({ aiAssistantName = "AI Assistant", o
},
})

const cleanupFailedMutation = useMutation({
mutationFn: () => api.cleanupFailedEmbedJobs(),
onSuccess: (data) => {
addNotification({ type: 'success', message: data?.message || 'Failed jobs cleaned up.' })
queryClient.invalidateQueries({ queryKey: ['failedEmbedJobs'] })
},
onError: (error: any) => {
addNotification({ type: 'error', message: error?.message || 'Failed to clean up jobs.' })
},
})

const syncMutation = useMutation({
mutationFn: () => api.syncRAGStorage(),
onSuccess: (data) => {
@@ -80,34 +84,9 @@ export default function KnowledgeBaseModal({ aiAssistantName = "AI Assistant", o
},
})

const handleUpload = async () => {
if (files.length === 0) return
setIsUploading(true)
let successCount = 0
const failedNames: string[] = []

for (const file of files) {
try {
await uploadMutation.mutateAsync(file)
successCount++
} catch (error: any) {
failedNames.push(file.name)
}
}

setIsUploading(false)
setFiles([])
fileUploaderRef.current?.clear()
queryClient.invalidateQueries({ queryKey: ['embed-jobs'] })

if (successCount > 0) {
addNotification({
type: 'success',
message: `${successCount} file${successCount > 1 ? 's' : ''} queued for processing.`,
})
}
for (const name of failedNames) {
addNotification({ type: 'error', message: `Failed to upload: ${name}` })
const handleUpload = () => {
if (files.length > 0) {
uploadMutation.mutate(files[0])
}
}
@@ -154,7 +133,7 @@ export default function KnowledgeBaseModal({ aiAssistantName = "AI Assistant", o
<FileUploader
ref={fileUploaderRef}
minFiles={1}
maxFiles={5}
maxFiles={1}
onUpload={(uploadedFiles) => {
setFiles(Array.from(uploadedFiles))
}}
@@ -165,8 +144,8 @@ export default function KnowledgeBaseModal({ aiAssistantName = "AI Assistant", o
size="lg"
icon="IconUpload"
onClick={handleUpload}
disabled={files.length === 0 || isUploading}
loading={isUploading}
disabled={files.length === 0 || uploadMutation.isPending}
loading={uploadMutation.isPending}
>
Upload
</StyledButton>
@@ -228,20 +207,7 @@ export default function KnowledgeBaseModal({ aiAssistantName = "AI Assistant", o
</div>
</div>
<div className="my-8">
<div className="flex items-center justify-between mb-4">
<StyledSectionHeader title="Processing Queue" className="!mb-0" />
<StyledButton
variant="danger"
size="md"
icon="IconTrash"
onClick={() => cleanupFailedMutation.mutate()}
loading={cleanupFailedMutation.isPending}
disabled={cleanupFailedMutation.isPending}
>
Clean Up Failed
</StyledButton>
</div>
<ActiveEmbedJobs withHeader={false} />
<ActiveEmbedJobs withHeader={true} />
</div>

<div className="my-12">
@@ -252,8 +218,8 @@ export default function KnowledgeBaseModal({ aiAssistantName = "AI Assistant", o
size="md"
icon='IconRefresh'
onClick={handleConfirmSync}
disabled={syncMutation.isPending || isUploading}
loading={syncMutation.isPending || isUploading}
disabled={syncMutation.isPending || uploadMutation.isPending}
loading={syncMutation.isPending || uploadMutation.isPending}
>
Sync Storage
</StyledButton>
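The multi-file `handleUpload` in the hunks above awaits each selected file in turn, counting successes and collecting the names that failed. The core loop can be sketched as a standalone helper (`uploadAll` and its `upload` callback are illustrative names, not part of the codebase; the callback stands in for `uploadMutation.mutateAsync`):

```typescript
// Sequential upload with per-file failure collection: one awaited call per
// file, so a single failure never aborts the remaining uploads.
async function uploadAll(
  names: string[],
  upload: (name: string) => Promise<void>
): Promise<{ successCount: number; failed: string[] }> {
  let successCount = 0
  const failed: string[] = []
  for (const name of names) {
    try {
      await upload(name) // stand-in for uploadMutation.mutateAsync(file)
      successCount++
    } catch {
      failed.push(name)
    }
  }
  return { successCount, failed }
}
```

Uploading sequentially (rather than `Promise.all`) keeps the embedding queue from being flooded and makes the success/failure bookkeeping trivial.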
@@ -53,14 +53,6 @@ export default function Chat({
const activeSession = sessions.find((s) => s.id === activeSessionId)

const { data: lastModelSetting } = useSystemSetting({ key: 'chat.lastModel', enabled })
const { data: remoteOllamaUrlSetting } = useSystemSetting({ key: 'ai.remoteOllamaUrl', enabled })

const { data: remoteStatus } = useQuery({
queryKey: ['remoteOllamaStatus'],
queryFn: () => api.getRemoteOllamaStatus(),
enabled: enabled && !!remoteOllamaUrlSetting?.value,
refetchInterval: 15000,
})

const { data: installedModels = [], isLoading: isLoadingModels } = useQuery({
queryKey: ['installedModels'],
@@ -371,18 +363,6 @@ export default function Chat({
{activeSession?.title || 'New Chat'}
</h2>
<div className="flex items-center gap-4">
{remoteOllamaUrlSetting?.value && (
<span
className={classNames(
'text-xs rounded px-2 py-1 font-medium',
remoteStatus?.connected === false
? 'text-red-700 bg-red-50 border border-red-200'
: 'text-green-700 bg-green-50 border border-green-200'
)}
>
{remoteStatus?.connected === false ? 'Remote Disconnected' : 'Remote Connected'}
</span>
)}
<div className="flex items-center gap-2">
<label htmlFor="model-select" className="text-sm text-text-secondary">
Model:
@@ -400,7 +380,7 @@ export default function Chat({
>
{installedModels.map((model) => (
<option key={model.name} value={model.name}>
{model.name}{model.size > 0 ? ` (${formatBytes(model.size)})` : ''}
{model.name} ({formatBytes(model.size)})
</option>
))}
</select>
@@ -29,7 +29,7 @@ const FileUploader = forwardRef<FileUploaderRef, FileUploaderProps>((props, ref)
const {
minFiles = 0,
maxFiles = 1,
maxFileSize = 104857600, // default to 100MB
maxFileSize = 10485760, // default to 10MB
fileTypes,
disabled = false,
onUpload,
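The `maxFileSize` defaults in the FileUploader hunk above are plain byte counts in binary units, which is easy to verify:

```typescript
// 1 MiB = 1024 * 1024 bytes; the two defaults are 10 MiB and 100 MiB.
const MiB = 1024 * 1024
const tenMiB = 10 * MiB // 10485760
const hundredMiB = 100 * MiB // 104857600
```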
@@ -1,41 +1,10 @@
import Map, {
FullscreenControl,
NavigationControl,
ScaleControl,
Marker,
Popup,
MapProvider,
} from 'react-map-gl/maplibre'
import type { MapRef, MapLayerMouseEvent } from 'react-map-gl/maplibre'
import Map, { FullscreenControl, NavigationControl, MapProvider } from 'react-map-gl/maplibre'
import maplibregl from 'maplibre-gl'
import 'maplibre-gl/dist/maplibre-gl.css'
import { Protocol } from 'pmtiles'
import { useEffect, useRef, useState, useCallback } from 'react'

type ScaleUnit = 'imperial' | 'metric'
import { useMapMarkers, PIN_COLORS } from '~/hooks/useMapMarkers'
import type { PinColorId } from '~/hooks/useMapMarkers'
import MarkerPin from './MarkerPin'
import MarkerPanel from './MarkerPanel'
import { useEffect } from 'react'

export default function MapComponent() {
const mapRef = useRef<MapRef>(null)
const { markers, addMarker, deleteMarker } = useMapMarkers()
const [placingMarker, setPlacingMarker] = useState<{ lng: number; lat: number } | null>(null)
const [markerName, setMarkerName] = useState('')
const [markerColor, setMarkerColor] = useState<PinColorId>('orange')
const [selectedMarkerId, setSelectedMarkerId] = useState<number | null>(null)
const [scaleUnit, setScaleUnit] = useState<ScaleUnit>(
() => (localStorage.getItem('nomad:map-scale-unit') as ScaleUnit) || 'metric'
)

const toggleScaleUnit = useCallback(() => {
setScaleUnit((prev) => {
const next = prev === 'metric' ? 'imperial' : 'metric'
localStorage.setItem('nomad:map-scale-unit', next)
return next
})
}, [])

// Add the PMTiles protocol to maplibre-gl
useEffect(() => {
@@ -46,40 +15,9 @@ export default function MapComponent() {
}
}, [])

const handleMapClick = useCallback((e: MapLayerMouseEvent) => {
setPlacingMarker({ lng: e.lngLat.lng, lat: e.lngLat.lat })
setMarkerName('')
setMarkerColor('orange')
setSelectedMarkerId(null)
}, [])

const handleSaveMarker = useCallback(() => {
if (placingMarker && markerName.trim()) {
addMarker(markerName.trim(), placingMarker.lng, placingMarker.lat, markerColor)
setPlacingMarker(null)
setMarkerName('')
setMarkerColor('orange')
}
}, [placingMarker, markerName, markerColor, addMarker])

const handleFlyTo = useCallback((longitude: number, latitude: number) => {
mapRef.current?.flyTo({ center: [longitude, latitude], zoom: 12, duration: 1500 })
}, [])

const handleDeleteMarker = useCallback(
(id: number) => {
if (selectedMarkerId === id) setSelectedMarkerId(null)
deleteMarker(id)
},
[selectedMarkerId, deleteMarker]
)

const selectedMarker = selectedMarkerId ? markers.find((m) => m.id === selectedMarkerId) : null

return (
<MapProvider>
<Map
ref={mapRef}
reuseMaps
style={{
width: '100%',
@@ -92,153 +30,10 @@ export default function MapComponent() {
latitude: 40,
zoom: 3.5,
}}
onClick={handleMapClick}
>
<NavigationControl style={{ marginTop: '110px', marginRight: '36px' }} />
<FullscreenControl style={{ marginTop: '30px', marginRight: '36px' }} />
<ScaleControl position="bottom-left" maxWidth={150} unit={scaleUnit} />
<div style={{ position: 'absolute', bottom: '30px', left: '10px', zIndex: 2 }}>
<div
style={{
display: 'inline-flex',
borderRadius: '4px',
boxShadow: '0 0 0 2px rgba(0,0,0,0.1)',
overflow: 'hidden',
fontSize: '11px',
fontWeight: 600,
lineHeight: 1,
}}
>
<button
onClick={() => { if (scaleUnit !== 'metric') toggleScaleUnit() }}
style={{
background: scaleUnit === 'metric' ? '#424420' : 'white',
color: scaleUnit === 'metric' ? 'white' : '#666',
border: 'none',
padding: '4px 8px',
cursor: 'pointer',
}}
>
Metric
</button>
<button
onClick={() => { if (scaleUnit !== 'imperial') toggleScaleUnit() }}
style={{
background: scaleUnit === 'imperial' ? '#424420' : 'white',
color: scaleUnit === 'imperial' ? 'white' : '#666',
border: 'none',
padding: '4px 8px',
cursor: 'pointer',
}}
>
Imperial
</button>
</div>
</div>

{/* Existing markers */}
{markers.map((marker) => (
<Marker
key={marker.id}
longitude={marker.longitude}
latitude={marker.latitude}
anchor="bottom"
onClick={(e) => {
e.originalEvent.stopPropagation()
setSelectedMarkerId(marker.id === selectedMarkerId ? null : marker.id)
setPlacingMarker(null)
}}
>
<MarkerPin
color={PIN_COLORS.find((c) => c.id === marker.color)?.hex}
active={marker.id === selectedMarkerId}
/>
</Marker>
))}

{/* Popup for selected marker */}
{selectedMarker && (
<Popup
longitude={selectedMarker.longitude}
latitude={selectedMarker.latitude}
anchor="bottom"
offset={[0, -36] as [number, number]}
onClose={() => setSelectedMarkerId(null)}
closeOnClick={false}
>
<div className="text-sm font-medium">{selectedMarker.name}</div>
</Popup>
)}

{/* Popup for placing a new marker */}
{placingMarker && (
<Popup
longitude={placingMarker.lng}
latitude={placingMarker.lat}
anchor="bottom"
onClose={() => setPlacingMarker(null)}
closeOnClick={false}
>
<div className="p-1">
<input
autoFocus
type="text"
placeholder="Name this location"
value={markerName}
onChange={(e) => setMarkerName(e.target.value)}
onKeyDown={(e) => {
if (e.key === 'Enter') handleSaveMarker()
if (e.key === 'Escape') setPlacingMarker(null)
}}
className="block w-full rounded border border-gray-300 px-2 py-1 text-sm placeholder:text-gray-400 focus:outline-none focus:border-gray-500"
/>
<div className="mt-1.5 flex gap-1 items-center">
{PIN_COLORS.map((c) => (
<button
key={c.id}
onClick={() => setMarkerColor(c.id)}
title={c.label}
className="rounded-full p-0.5 transition-transform"
style={{
outline: markerColor === c.id ? `2px solid ${c.hex}` : '2px solid transparent',
outlineOffset: '1px',
}}
>
<div
className="w-4 h-4 rounded-full"
style={{ backgroundColor: c.hex }}
/>
</button>
))}
</div>
<div className="mt-1.5 flex gap-1.5 justify-end">
<button
onClick={() => setPlacingMarker(null)}
className="text-xs text-gray-500 hover:text-gray-700 px-2 py-1 rounded transition-colors"
>
Cancel
</button>
<button
onClick={handleSaveMarker}
disabled={!markerName.trim()}
className="text-xs bg-[#424420] text-white rounded px-2.5 py-1 hover:bg-[#525530] disabled:opacity-40 transition-colors"
>
Save
</button>
</div>
</div>
</Popup>
)}
</Map>

{/* Marker panel overlay */}
<MarkerPanel
markers={markers}
onDelete={handleDeleteMarker}
onFlyTo={handleFlyTo}
onSelect={setSelectedMarkerId}
selectedMarkerId={selectedMarkerId}
/>
</MapProvider>
)
}
@@ -1,116 +0,0 @@
import { useState } from 'react'
import { IconMapPinFilled, IconTrash, IconMapPin, IconX } from '@tabler/icons-react'
import { PIN_COLORS } from '~/hooks/useMapMarkers'
import type { MapMarker } from '~/hooks/useMapMarkers'

interface MarkerPanelProps {
markers: MapMarker[]
onDelete: (id: number) => void
onFlyTo: (longitude: number, latitude: number) => void
onSelect: (id: number | null) => void
selectedMarkerId: number | null
}

export default function MarkerPanel({
markers,
onDelete,
onFlyTo,
onSelect,
selectedMarkerId,
}: MarkerPanelProps) {
const [open, setOpen] = useState(false)

if (!open) {
return (
<button
onClick={() => setOpen(true)}
className="absolute left-4 top-[72px] z-40 flex items-center gap-1.5 rounded-lg bg-surface-primary/95 px-3 py-2 shadow-lg border border-border-subtle backdrop-blur-sm hover:bg-surface-secondary transition-colors"
title="Show saved locations"
>
<IconMapPin size={18} className="text-desert-orange" />
<span className="text-sm font-medium text-text-primary">Pins</span>
{markers.length > 0 && (
<span className="ml-0.5 flex h-5 min-w-5 items-center justify-center rounded-full bg-desert-orange text-[11px] font-bold text-white px-1">
{markers.length}
</span>
)}
</button>
)
}

return (
<div className="absolute left-4 top-[72px] z-40 w-72 rounded-lg bg-surface-primary/95 shadow-lg border border-border-subtle backdrop-blur-sm">
{/* Header */}
<div className="flex items-center justify-between px-3 py-2.5 border-b border-border-subtle">
<div className="flex items-center gap-2">
<IconMapPin size={18} className="text-desert-orange" />
<span className="text-sm font-semibold text-text-primary">
Saved Locations
</span>
{markers.length > 0 && (
<span className="flex h-5 min-w-5 items-center justify-center rounded-full bg-desert-orange text-[11px] font-bold text-white px-1">
{markers.length}
</span>
)}
</div>
<button
onClick={() => setOpen(false)}
className="rounded p-0.5 text-text-muted hover:text-text-primary hover:bg-surface-secondary transition-colors"
title="Close panel"
>
<IconX size={16} />
</button>
</div>

{/* Marker list */}
<div className="max-h-[calc(100vh-180px)] overflow-y-auto">
{markers.length === 0 ? (
<div className="px-3 py-6 text-center">
<IconMapPinFilled size={24} className="mx-auto mb-2 text-text-muted" />
<p className="text-sm text-text-muted">
Click anywhere on the map to drop a pin
</p>
</div>
) : (
<ul>
{markers.map((marker) => (
<li
key={marker.id}
className={`flex items-center gap-2 px-3 py-2 border-b border-border-subtle last:border-b-0 group transition-colors ${
marker.id === selectedMarkerId
? 'bg-desert-green/10'
: 'hover:bg-surface-secondary'
}`}
>
<IconMapPinFilled
size={16}
className="shrink-0"
style={{ color: PIN_COLORS.find((c) => c.id === marker.color)?.hex ?? '#a84a12' }}
/>
<button
onClick={() => {
onSelect(marker.id)
onFlyTo(marker.longitude, marker.latitude)
}}
className="flex-1 min-w-0 text-left"
title={marker.name}
>
<p className="text-sm font-medium text-text-primary truncate">
{marker.name}
</p>
</button>
<button
onClick={() => onDelete(marker.id)}
className="shrink-0 rounded p-1 text-text-muted opacity-0 group-hover:opacity-100 hover:text-desert-red hover:bg-surface-secondary transition-all"
title="Delete pin"
>
<IconTrash size={14} />
</button>
</li>
))}
</ul>
)}
</div>
</div>
)
}
@@ -1,17 +0,0 @@
import { IconMapPinFilled } from '@tabler/icons-react'

interface MarkerPinProps {
color?: string
active?: boolean
}

export default function MarkerPin({ color = '#a84a12', active = false }: MarkerPinProps) {
return (
<div className="cursor-pointer" style={{ filter: 'drop-shadow(0 1px 2px rgba(0,0,0,0.4))' }}>
<IconMapPinFilled
size={active ? 36 : 32}
style={{ color }}
/>
</div>
)
}
@@ -119,40 +119,3 @@ body {

color-scheme: dark;
}

/* MapLibre popup styling for dark mode */
[data-theme="dark"] .maplibregl-popup-content {
background: #2a2918;
color: #f7eedc;
}

[data-theme="dark"] .maplibregl-popup-content input {
background: #353420;
color: #f7eedc;
border-color: #424420;
}

[data-theme="dark"] .maplibregl-popup-content input::placeholder {
color: #8f8f82;
}

[data-theme="dark"] .maplibregl-popup-tip {
border-top-color: #2a2918;
}

[data-theme="dark"] .maplibregl-popup-anchor-bottom .maplibregl-popup-tip {
border-top-color: #2a2918;
}

[data-theme="dark"] .maplibregl-popup-anchor-top .maplibregl-popup-tip {
border-bottom-color: #2a2918;
}

[data-theme="dark"] .maplibregl-popup-close-button {
color: #afafa5;
}

[data-theme="dark"] .maplibregl-popup-close-button:hover {
color: #f7eedc;
background: #353420;
}
@@ -1,10 +1,8 @@
import { useEffect, useRef } from 'react'
import { useQuery, useQueryClient } from '@tanstack/react-query'
import api from '~/lib/api'

const useEmbedJobs = (props: { enabled?: boolean } = {}) => {
const queryClient = useQueryClient()
const prevCountRef = useRef<number>(0)

const queryData = useQuery({
queryKey: ['embed-jobs'],
@@ -17,15 +15,6 @@ const useEmbedJobs = (props: { enabled?: boolean } = {}) => {
enabled: props.enabled ?? true,
})

// When jobs drain to zero, refresh stored files so they appear without reopening the modal
useEffect(() => {
const currentCount = queryData.data?.length ?? 0
if (prevCountRef.current > 0 && currentCount === 0) {
queryClient.invalidateQueries({ queryKey: ['storedFiles'] })
}
prevCountRef.current = currentCount
}, [queryData.data, queryClient])

const invalidate = () => {
queryClient.invalidateQueries({ queryKey: ['embed-jobs'] })
}
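The useEmbedJobs hook above detects the queue draining by comparing the current job count against the previous poll's count held in a ref, and fires an invalidation exactly once on the >0 → 0 transition. Stripped of React, the logic reduces to this (the `makeDrainDetector` name is illustrative, not from the codebase):

```typescript
// Edge detection on a polled counter: invoke onDrain only when the count
// transitions from positive to zero, mirroring prevCountRef in useEmbedJobs.
function makeDrainDetector(onDrain: () => void): (currentCount: number) => void {
  let prevCount = 0 // plays the role of prevCountRef.current
  return (currentCount: number) => {
    if (prevCount > 0 && currentCount === 0) onDrain()
    prevCount = currentCount
  }
}
```

Tracking the previous value rather than reacting to `count === 0` alone is what prevents the invalidation from firing on every poll of an already-empty queue.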
@@ -1,86 +0,0 @@
import { useState, useCallback, useEffect } from 'react'
import api from '~/lib/api'

export const PIN_COLORS = [
{ id: 'orange', label: 'Orange', hex: '#a84a12' },
{ id: 'red', label: 'Red', hex: '#994444' },
{ id: 'green', label: 'Green', hex: '#424420' },
{ id: 'blue', label: 'Blue', hex: '#2563eb' },
{ id: 'purple', label: 'Purple', hex: '#7c3aed' },
{ id: 'yellow', label: 'Yellow', hex: '#ca8a04' },
] as const

export type PinColorId = typeof PIN_COLORS[number]['id']

export interface MapMarker {
id: number
name: string
longitude: number
latitude: number
color: PinColorId
createdAt: string
}

export function useMapMarkers() {
const [markers, setMarkers] = useState<MapMarker[]>([])
const [loaded, setLoaded] = useState(false)

// Load markers from API on mount
useEffect(() => {
api.listMapMarkers().then((data) => {
if (data) {
setMarkers(
data.map((m) => ({
id: m.id,
name: m.name,
longitude: m.longitude,
latitude: m.latitude,
color: m.color as PinColorId,
createdAt: m.created_at,
}))
)
}
setLoaded(true)
})
}, [])

const addMarker = useCallback(
async (name: string, longitude: number, latitude: number, color: PinColorId = 'orange') => {
const result = await api.createMapMarker({ name, longitude, latitude, color })
if (result) {
const marker: MapMarker = {
id: result.id,
name: result.name,
longitude: result.longitude,
latitude: result.latitude,
color: result.color as PinColorId,
createdAt: result.created_at,
}
setMarkers((prev) => [...prev, marker])
return marker
}
return null
},
[]
)

const updateMarker = useCallback(async (id: number, updates: { name?: string; color?: string }) => {
const result = await api.updateMapMarker(id, updates)
if (result) {
setMarkers((prev) =>
prev.map((m) =>
m.id === id
? { ...m, name: result.name, color: result.color as PinColorId }
: m
)
)
}
}, [])

const deleteMarker = useCallback(async (id: number) => {
await api.deleteMapMarker(id)
setMarkers((prev) => prev.filter((m) => m.id !== id))
}, [])

return { markers, loaded, addMarker, updateMarker, deleteMarker }
}
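The `PIN_COLORS` declaration in the useMapMarkers hook uses an `as const` assertion plus an indexed access type so the `PinColorId` union is derived directly from the data rather than maintained by hand. A minimal sketch of the same pattern (with a shortened illustrative palette, not the real one):

```typescript
// `as const` freezes the literal types; PIN_COLORS[number]['id'] then indexes
// into the tuple to produce the union 'orange' | 'red'.
const PIN_COLORS = [
  { id: 'orange', hex: '#a84a12' },
  { id: 'red', hex: '#994444' },
] as const

type PinColorId = typeof PIN_COLORS[number]['id']

// The compiler rejects hexFor('teal') at build time; the lookup cannot miss.
function hexFor(id: PinColorId): string {
  return PIN_COLORS.find((c) => c.id === id)!.hex
}
```

Adding an entry to the array automatically widens the union, so the color picker and the type stay in sync with no extra declarations.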
@@ -4,7 +4,7 @@ import ChatButton from '~/components/chat/ChatButton'
import ChatModal from '~/components/chat/ChatModal'
import useServiceInstalledStatus from '~/hooks/useServiceInstalledStatus'
import { SERVICE_NAMES } from '../../constants/service_names'
import { Link, router } from '@inertiajs/react'
import { Link } from '@inertiajs/react'
import { IconArrowLeft } from '@tabler/icons-react'
import classNames from 'classnames'
@@ -23,9 +23,9 @@ export default function AppLayout({ children }: { children: React.ReactNode }) {
)}
<div
className="p-2 flex gap-2 flex-col items-center justify-center cursor-pointer"
onClick={() => router.visit('/home')}
onClick={() => (window.location.href = '/home')}
>
<img src="/project_nomad_logo.webp" alt="Project Nomad Logo" className="h-40 w-40" />
<img src="/project_nomad_logo.png" alt="Project Nomad Logo" className="h-40 w-40" />
<h1 className="text-5xl font-bold text-desert-green">Command Center</h1>
</div>
<hr className={
@@ -7,7 +7,8 @@ import { DownloadJobWithProgress, WikipediaState } from '../../types/downloads'
import { EmbedJobWithProgress } from '../../types/rag'
import type { CategoryWithStatus, CollectionWithStatus, ContentUpdateCheckResult, ResourceUpdateInfo } from '../../types/collections'
import { catchInternal } from './util'
import { NomadChatResponse, NomadInstalledModel, NomadOllamaModel, OllamaChatRequest } from '../../types/ollama'
import { NomadOllamaModel, OllamaChatRequest } from '../../types/ollama'
import { ChatResponse, ModelResponse } from 'ollama'
import BenchmarkResult from '#models/benchmark_result'
import { BenchmarkType, RunBenchmarkResponse, SubmitBenchmarkResponse, UpdateBuilderTagResponse } from '../../types/benchmark'
@@ -48,25 +49,6 @@ class API {
})()
}

async getRemoteOllamaStatus(): Promise<{ configured: boolean; connected: boolean }> {
return catchInternal(async () => {
const response = await this.client.get<{ configured: boolean; connected: boolean }>(
'/ollama/remote-status'
)
return response.data
})()
}

async configureRemoteOllama(remoteUrl: string | null): Promise<{ success: boolean; message: string }> {
return catchInternal(async () => {
const response = await this.client.post<{ success: boolean; message: string }>(
'/ollama/configure-remote',
{ remoteUrl }
)
return response.data
})()
}

async deleteModel(model: string): Promise<{ success: boolean; message: string }> {
return catchInternal(async () => {
const response = await this.client.delete('/ollama/models', { data: { model } })
@@ -257,7 +239,7 @@ class API {

async getInstalledModels() {
return catchInternal(async () => {
const response = await this.client.get<NomadInstalledModel[]>('/ollama/installed-models')
const response = await this.client.get<ModelResponse[]>('/ollama/installed-models')
return response.data
})()
}
@@ -276,7 +258,7 @@ class API {

async sendChatMessage(chatRequest: OllamaChatRequest) {
return catchInternal(async () => {
const response = await this.client.post<NomadChatResponse>('/ollama/chat', chatRequest)
const response = await this.client.post<ChatResponse>('/ollama/chat', chatRequest)
return response.data
})()
}
@@ -437,20 +419,6 @@ class API {
})()
}

async getFailedEmbedJobs(): Promise<EmbedJobWithProgress[] | undefined> {
return catchInternal(async () => {
const response = await this.client.get<EmbedJobWithProgress[]>('/rag/failed-jobs')
return response.data
})()
}

async cleanupFailedEmbedJobs(): Promise<{ message: string; cleaned: number; filesDeleted: number } | undefined> {
return catchInternal(async () => {
const response = await this.client.delete<{ message: string; cleaned: number; filesDeleted: number }>('/rag/failed-jobs')
return response.data
})()
}

async getStoredRAGFiles() {
return catchInternal(async () => {
const response = await this.client.get<{ files: string[] }>('/rag/files')
@@ -518,29 +486,6 @@ class API {
}
}

async getGlobalMapInfo() {
return catchInternal(async () => {
const response = await this.client.get<{
url: string
date: string
size: number
key: string
}>('/maps/global-map-info')
return response.data
})()
}

async downloadGlobalMap() {
return catchInternal(async () => {
const response = await this.client.post<{
message: string
filename: string
jobId?: string
}>('/maps/download-global-map')
return response.data
})()
}

async listCuratedMapCollections() {
return catchInternal(async () => {
const response = await this.client.get<CollectionWithStatus[]>(
@@ -571,39 +516,6 @@ class API {
})()
}

async listMapMarkers() {
return catchInternal(async () => {
const response = await this.client.get<
Array<{ id: number; name: string; longitude: number; latitude: number; color: string; created_at: string }>
>('/maps/markers')
return response.data
})()
}

async createMapMarker(data: { name: string; longitude: number; latitude: number; color?: string }) {
return catchInternal(async () => {
const response = await this.client.post<
{ id: number; name: string; longitude: number; latitude: number; color: string; created_at: string }
>('/maps/markers', data)
return response.data
})()
}

async updateMapMarker(id: number, data: { name?: string; color?: string }) {
return catchInternal(async () => {
const response = await this.client.patch<
{ id: number; name: string; longitude: number; latitude: number; color: string }
>(`/maps/markers/${id}`, data)
return response.data
})()
}

async deleteMapMarker(id: number) {
return catchInternal(async () => {
await this.client.delete(`/maps/markers/${id}`)
})()
}

async listRemoteZimFiles({
start = 0,
count = 12,
@@ -651,15 +563,6 @@ class API {
})()
}

async cancelDownloadJob(jobId: string): Promise<{ success: boolean; message: string } | undefined> {
return catchInternal(async () => {
const response = await this.client.post<{ success: boolean; message: string }>(
`/downloads/jobs/${jobId}/cancel`
)
return response.data
})()
}

async runBenchmark(type: BenchmarkType, sync: boolean = false) {
return catchInternal(async () => {
const response = await this.client.post<RunBenchmarkResponse>(
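Every method in the API class above wraps its request in `return catchInternal(async () => { ... })()`, imported from `./util`. The real implementation is not shown in this diff; assuming it converts thrown errors into an `undefined` result (which would explain the `| undefined` return types on methods like `getFailedEmbedJobs`), a sketch might look like:

```typescript
// Hypothetical reconstruction of the catchInternal wrapper: run the async
// thunk and resolve to undefined instead of rethrowing on failure.
function catchInternal<T>(fn: () => Promise<T>): () => Promise<T | undefined> {
  return async () => {
    try {
      return await fn()
    } catch {
      return undefined // swallow the error; callers check for undefined
    }
  }
}
```

This shape matches the call sites in the diff, where callers treat a missing response (e.g. `data?.message`) as the failure path instead of wrapping every API call in try/catch.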
@@ -1,104 +0,0 @@
import {
IconArrowUp,
IconBooks,
IconBrain,
IconChefHat,
IconCheck,
IconChevronLeft,
IconChevronRight,
IconCloudDownload,
IconCloudUpload,
IconCpu,
IconDatabase,
IconDownload,
IconHome,
IconLogs,
IconNotes,
IconPlayerPlay,
IconPlus,
IconRefresh,
IconRefreshAlert,
IconRobot,
IconSchool,
IconSettings,
IconTrash,
IconUpload,
IconWand,
IconWorld,
IconX,
IconAlertTriangle,
IconXboxX,
IconCircleCheck,
IconInfoCircle,
IconBug,
IconCopy,
IconServer,
IconMenu2,
IconArrowLeft,
IconArrowRight,
IconSun,
IconMoon,
IconStethoscope,
IconShieldCheck,
IconTool,
IconPlant,
IconCode,
IconMap,
} from '@tabler/icons-react'

/**
* An explicit import of used icons in the DynamicIcon component to ensure we get maximum tree-shaking
* while still providing us a nice DX with the DynamicIcon component and icon name inference.
* Only icons that are actually used by DynamicIcon should be added here. Yes, it does introduce
* some manual maintenance, but the bundle size benefits are worth it since we use a (relatively)
* very limited subset of the full Tabler Icons library.
*/
export const icons = {
IconAlertTriangle,
IconArrowLeft,
IconArrowRight,
IconArrowUp,
IconBooks,
IconBrain,
IconBug,
IconChefHat,
IconCheck,
IconChevronLeft,
IconChevronRight,
IconCircleCheck,
IconCloudDownload,
IconCloudUpload,
IconCode,
IconCopy,
IconCpu,
IconDatabase,
IconDownload,
IconHome,
IconInfoCircle,
IconLogs,
IconMap,
IconMenu2,
IconMoon,
IconNotes,
IconPlant,
IconPlayerPlay,
IconPlus,
IconRefresh,
IconRefreshAlert,
IconRobot,
IconSchool,
IconServer,
IconSettings,
IconShieldCheck,
IconStethoscope,
IconSun,
IconTool,
IconTrash,
IconUpload,
IconWand,
IconWorld,
IconX,
IconXboxX
} as const

export type DynamicIconName = keyof typeof icons
|
||||
|
|
@@ -112,9 +112,7 @@ const CURATED_MAP_COLLECTIONS_KEY = 'curated-map-collections'
const CURATED_CATEGORIES_KEY = 'curated-categories'
const WIKIPEDIA_STATE_KEY = 'wikipedia-state'

export default function EasySetupWizard(props: {
  system: { services: ServiceSlim[]; remoteOllamaUrl: string }
}) {
export default function EasySetupWizard(props: { system: { services: ServiceSlim[] } }) {
  const { aiAssistantName } = usePage<{ aiAssistantName: string }>().props
  const CORE_CAPABILITIES = buildCoreCapabilities(aiAssistantName)

@@ -124,11 +122,6 @@ export default function EasySetupWizard(props: {
  const [selectedAiModels, setSelectedAiModels] = useState<string[]>([])
  const [isProcessing, setIsProcessing] = useState(false)
  const [showAdditionalTools, setShowAdditionalTools] = useState(false)
  const [remoteOllamaEnabled, setRemoteOllamaEnabled] = useState(
    () => !!props.system.remoteOllamaUrl
  )
  const [remoteOllamaUrl, setRemoteOllamaUrl] = useState(() => props.system.remoteOllamaUrl ?? '')
  const [remoteOllamaUrlError, setRemoteOllamaUrlError] = useState<string | null>(null)

  // Category/tier selection state
  const [selectedTiers, setSelectedTiers] = useState<Map<string, SpecTier>>(new Map())

@@ -338,24 +331,8 @@ export default function EasySetupWizard(props: {
    setIsProcessing(true)

    try {
      // If using remote Ollama, configure it first before other installs
      if (remoteOllamaEnabled && remoteOllamaUrl) {
        const remoteResult = await api.configureRemoteOllama(remoteOllamaUrl)
        if (!remoteResult?.success) {
          const msg = (remoteResult as any)?.message || 'Failed to configure remote Ollama.'
          setRemoteOllamaUrlError(msg)
          setIsProcessing(false)
          setCurrentStep(1)
          return
        }
      }

      // All of these ops don't actually wait for completion, they just kick off the process, so we can run them in parallel without awaiting each one sequentially
      // Exclude Ollama from local install when using remote mode
      const servicesToInstall = remoteOllamaEnabled
        ? selectedServices.filter((s) => s !== SERVICE_NAMES.OLLAMA)
        : selectedServices
      const installPromises = servicesToInstall.map((serviceName) => api.installService(serviceName))
      const installPromises = selectedServices.map((serviceName) => api.installService(serviceName))

      await Promise.all(installPromises)

@@ -684,54 +661,10 @@ export default function EasySetupWizard(props: {
<div>
  <h3 className="text-lg font-semibold text-text-primary mb-4">Core Capabilities</h3>
  <div className="grid grid-cols-1 lg:grid-cols-3 gap-4">
    {existingCoreCapabilities.map((capability) => {
      if (capability.id === 'ai') {
        const isAiSelected = isCapabilitySelected(capability)
        return (
          <div key={capability.id}>
            {renderCapabilityCard(capability, true)}
            {isAiSelected && !isCapabilityInstalled(capability) && (
              <div
                className="mt-2 p-4 bg-gray-50 rounded-lg border border-gray-200"
                onClick={(e) => e.stopPropagation()}
              >
                <label className="flex items-center gap-2 cursor-pointer select-none">
                  <input
                    type="checkbox"
                    checked={remoteOllamaEnabled}
                    onChange={(e) => {
                      setRemoteOllamaEnabled(e.target.checked)
                      setRemoteOllamaUrlError(null)
                    }}
                    className="w-4 h-4 accent-desert-green"
                  />
                  <span className="text-sm font-medium text-gray-700">Use remote Ollama instance</span>
                </label>
                {remoteOllamaEnabled && (
                  <div className="mt-3">
                    <input
                      type="text"
                      value={remoteOllamaUrl}
                      onChange={(e) => {
                        setRemoteOllamaUrl(e.target.value)
                        setRemoteOllamaUrlError(null)
                      }}
                      placeholder="http://192.168.1.100:11434"
                      className="w-full px-3 py-2 text-sm border border-gray-300 rounded-md focus:outline-none focus:ring-1 focus:ring-desert-green"
                    />
                    {remoteOllamaUrlError && (
                      <p className="mt-1 text-xs text-red-600">{remoteOllamaUrlError}</p>
    {existingCoreCapabilities.map((capability) =>
      renderCapabilityCard(capability, true)
    )}
  </div>
)}
</div>
)}
</div>
)
}
return renderCapabilityCard(capability, true)
})}
</div>
</div>
)}

@@ -844,14 +777,8 @@ export default function EasySetupWizard(props: {
    <p className="text-sm text-text-muted">Select models to download for offline AI</p>
  </div>
</div>
{remoteOllamaEnabled && remoteOllamaUrl ? (
  <Alert
    title="Remote Ollama selected"
    message="Models are managed on the remote machine. You can add models from Settings > AI Assistant after setup, note this is only supported when using Ollama, not LM Studio and other OpenAI API software."
    type="info"
    variant="bordered"
  />
) : isLoadingRecommendedModels ? (

{isLoadingRecommendedModels ? (
  <div className="flex justify-center py-12">
    <LoadingSpinner />
  </div>
@@ -6,7 +6,7 @@ import {
  IconSettings,
  IconWifiOff,
} from '@tabler/icons-react'
import { Head, Link, router, usePage } from '@inertiajs/react'
import { Head, usePage } from '@inertiajs/react'
import AppLayout from '~/layouts/AppLayout'
import { getServiceLink } from '~/lib/navigation'
import { ServiceSlim } from '../../types/services'

@@ -146,7 +146,9 @@ export default function Home(props: {
  variant: 'primary',
  children: 'Go to Settings',
  icon: 'IconSettings',
  onClick: () => router.visit('/settings/update'),
  onClick: () => {
    window.location.href = '/settings/update'
  },
}}
/>
</div>

@@ -157,7 +159,8 @@ export default function Home(props: {
const isEasySetup = item.label === 'Easy Setup'
const shouldHighlight = isEasySetup && shouldHighlightEasySetup

const tileContent = (
return (
  <a key={item.label} href={item.to} target={item.target}>
    <div className="relative rounded border-desert-green border-2 bg-desert-green hover:bg-transparent hover:text-text-primary text-white transition-colors shadow-sm h-48 flex flex-col items-center justify-center cursor-pointer text-center px-4">
      {shouldHighlight && (
        <span className="absolute top-2 right-2 flex items-center justify-center">

@@ -175,16 +178,7 @@ export default function Home(props: {
    {item.poweredBy && <p className="text-sm opacity-80">Powered by {item.poweredBy}</p>}
    <p className="xl:text-lg mt-2">{item.description}</p>
  </div>
)

return item.target === '_blank' ? (
  <a key={item.label} href={item.to} target="_blank" rel="noopener noreferrer">
    {tileContent}
  </a>
) : (
  <Link key={item.label} href={item.to}>
    {tileContent}
  </Link>
)
})}
</div>
@@ -1,5 +1,5 @@
import MapsLayout from '~/layouts/MapsLayout'
import { Head, Link, router } from '@inertiajs/react'
import { Head, Link } from '@inertiajs/react'
import MapComponent from '~/components/maps/MapComponent'
import StyledButton from '~/components/StyledButton'
import { IconArrowLeft } from '@tabler/icons-react'

@@ -42,7 +42,9 @@ export default function Maps(props: {
  variant: 'secondary',
  children: 'Go to Map Settings',
  icon: 'IconSettings',
  onClick: () => router.visit('/settings/maps'),
  onClick: () => {
    window.location.href = '/settings/maps'
  },
}}
/>
</div>
@@ -16,10 +16,8 @@ import CuratedCollectionCard from '~/components/CuratedCollectionCard'
import type { CollectionWithStatus } from '../../../types/collections'
import ActiveDownloads from '~/components/ActiveDownloads'
import Alert from '~/components/Alert'
import { formatBytes } from '~/lib/util'

const CURATED_COLLECTIONS_KEY = 'curated-map-collections'
const GLOBAL_MAP_INFO_KEY = 'global-map-info'

export default function MapsManager(props: {
  maps: { baseAssetsExist: boolean; regionFiles: FileEntry[] }

@@ -40,31 +38,6 @@ export default function MapsManager(props: {
    enabled: true,
  })

  const { data: globalMapInfo } = useQuery({
    queryKey: [GLOBAL_MAP_INFO_KEY],
    queryFn: () => api.getGlobalMapInfo(),
    refetchOnWindowFocus: false,
  })

  const downloadGlobalMap = useMutation({
    mutationFn: () => api.downloadGlobalMap(),
    onSuccess: () => {
      invalidateDownloads()
      addNotification({
        type: 'success',
        message: 'Global map download has been queued. This is a large file (~125 GB) and may take a while.',
      })
      closeAllModals()
    },
    onError: (error) => {
      console.error('Error downloading global map:', error)
      addNotification({
        type: 'error',
        message: 'Failed to start the global map download. Please try again.',
      })
    },
  })

  async function downloadBaseAssets() {
    try {
      setDownloading(true)

@@ -173,29 +146,6 @@ export default function MapsManager(props: {
    )
  }

  async function confirmGlobalMapDownload() {
    if (!globalMapInfo) return
    openModal(
      <StyledModal
        title="Download Global Map?"
        onConfirm={() => downloadGlobalMap.mutate()}
        onCancel={closeAllModals}
        open={true}
        confirmText="Download"
        cancelText="Cancel"
        confirmVariant="primary"
        confirmLoading={downloadGlobalMap.isPending}
      >
        <p className="text-text-secondary">
          This will download the full Protomaps global map ({formatBytes(globalMapInfo.size, 1)}, build {globalMapInfo.date}).
          Covers the entire planet so you won't need individual region files.
          Make sure you have enough disk space.
        </p>
      </StyledModal>,
      'confirm-global-map-download-modal'
    )
  }

  async function openDownloadModal() {
    openModal(
      <DownloadURLModal

@@ -251,23 +201,6 @@ export default function MapsManager(props: {
  }}
/>
)}
{globalMapInfo && (
  <Alert
    title="Global Map Coverage Available"
    message={`Download a complete worldwide map from Protomaps (${formatBytes(globalMapInfo.size, 1)}, build ${globalMapInfo.date}). This is a large file but covers the entire planet — no individual region downloads needed.`}
    type="info-inverted"
    variant="bordered"
    className="mt-8"
    icon="IconWorld"
    buttonProps={{
      variant: 'primary',
      children: 'Download Global Map',
      icon: 'IconCloudDownload',
      loading: downloadGlobalMap.isPending,
      onClick: () => confirmGlobalMapDownload(),
    }}
  />
)}
<div className="mt-8 mb-6 flex items-center justify-between">
  <StyledSectionHeader title="Curated Map Regions" className="!mb-0" />
  <StyledButton
@@ -10,14 +10,13 @@ import { useNotifications } from '~/context/NotificationContext'
import api from '~/lib/api'
import { useModals } from '~/context/ModalContext'
import StyledModal from '~/components/StyledModal'
import type { NomadInstalledModel } from '../../../types/ollama'
import { ModelResponse } from 'ollama'
import { SERVICE_NAMES } from '../../../constants/service_names'
import Switch from '~/components/inputs/Switch'
import StyledSectionHeader from '~/components/StyledSectionHeader'
import { useMutation, useQuery } from '@tanstack/react-query'
import Input from '~/components/inputs/Input'
import { IconSearch, IconRefresh } from '@tabler/icons-react'
import { formatBytes } from '~/lib/util'
import useDebounce from '~/hooks/useDebounce'
import ActiveModelDownloads from '~/components/ActiveModelDownloads'
import { useSystemInfo } from '~/hooks/useSystemInfo'

@@ -25,8 +24,8 @@ import { useSystemInfo } from '~/hooks/useSystemInfo'
export default function ModelsPage(props: {
  models: {
    availableModels: NomadOllamaModel[]
    installedModels: NomadInstalledModel[]
    settings: { chatSuggestionsEnabled: boolean; aiAssistantCustomName: string; remoteOllamaUrl: string; ollamaFlashAttention: boolean }
    installedModels: ModelResponse[]
    settings: { chatSuggestionsEnabled: boolean; aiAssistantCustomName: string }
  }
}) {
  const { aiAssistantName } = usePage<{ aiAssistantName: string }>().props

@@ -95,49 +94,9 @@ export default function ModelsPage(props: {
  const [chatSuggestionsEnabled, setChatSuggestionsEnabled] = useState(
    props.models.settings.chatSuggestionsEnabled
  )
  const [ollamaFlashAttention, setOllamaFlashAttention] = useState(
    props.models.settings.ollamaFlashAttention
  )
  const [aiAssistantCustomName, setAiAssistantCustomName] = useState(
    props.models.settings.aiAssistantCustomName
  )
  const [remoteOllamaUrl, setRemoteOllamaUrl] = useState(props.models.settings.remoteOllamaUrl)
  const [remoteOllamaError, setRemoteOllamaError] = useState<string | null>(null)
  const [remoteOllamaSaving, setRemoteOllamaSaving] = useState(false)

  async function handleSaveRemoteOllama() {
    setRemoteOllamaError(null)
    setRemoteOllamaSaving(true)
    try {
      const res = await api.configureRemoteOllama(remoteOllamaUrl || null)
      if (res?.success) {
        addNotification({ message: res.message, type: 'success' })
        router.reload()
      }
    } catch (error: any) {
      const msg = error?.response?.data?.message || error?.message || 'Failed to configure remote Ollama.'
      setRemoteOllamaError(msg)
    } finally {
      setRemoteOllamaSaving(false)
    }
  }

  async function handleClearRemoteOllama() {
    setRemoteOllamaError(null)
    setRemoteOllamaSaving(true)
    try {
      const res = await api.configureRemoteOllama(null)
      if (res?.success) {
        setRemoteOllamaUrl('')
        addNotification({ message: 'Remote Ollama configuration cleared.', type: 'success' })
        router.reload()
      }
    } catch (error: any) {
      setRemoteOllamaError(error?.message || 'Failed to clear remote Ollama.')
    } finally {
      setRemoteOllamaSaving(false)
    }
  }

  const [query, setQuery] = useState('')
  const [queryUI, setQueryUI] = useState('')

@@ -311,15 +270,6 @@ export default function ModelsPage(props: {
  label="Chat Suggestions"
  description="Display AI-generated conversation starters in the chat interface"
/>
<Switch
  checked={ollamaFlashAttention}
  onChange={(newVal) => {
    setOllamaFlashAttention(newVal)
    updateSettingMutation.mutate({ key: 'ai.ollamaFlashAttention', value: newVal })
  }}
  label="Flash Attention"
  description="Enables OLLAMA_FLASH_ATTENTION=1 for improved memory efficiency. Disable if you experience instability. Takes effect after reinstalling the AI Assistant."
/>
<Input
  name="aiAssistantCustomName"
  label="Assistant Name"

@@ -336,119 +286,9 @@ export default function ModelsPage(props: {
  />
</div>
</div>

<StyledSectionHeader title="Installed Models" className="mt-12 mb-4" />
<div className="bg-surface-primary rounded-lg border-2 border-border-subtle p-6">
  {props.models.installedModels.length === 0 ? (
    <p className="text-text-muted">
      No models installed. Browse the model catalog below to get started.
    </p>
  ) : (
    <table className="min-w-full divide-y divide-border-subtle">
      <thead>
        <tr>
          <th className="px-4 py-3 text-left text-xs font-medium text-text-muted uppercase tracking-wider">
            Model
          </th>
          <th className="px-4 py-3 text-left text-xs font-medium text-text-muted uppercase tracking-wider">
            Parameters
          </th>
          <th className="px-4 py-3 text-left text-xs font-medium text-text-muted uppercase tracking-wider">
            Disk Size
          </th>
          <th className="px-4 py-3 text-right text-xs font-medium text-text-muted uppercase tracking-wider">
            Action
          </th>
        </tr>
      </thead>
      <tbody className="divide-y divide-border-subtle">
        {props.models.installedModels.map((model) => (
          <tr key={model.name} className="hover:bg-surface-secondary">
            <td className="px-4 py-3">
              <span className="text-sm font-medium text-text-primary">{model.name}</span>
            </td>
            <td className="px-4 py-3">
              <span className="text-sm text-text-secondary">
                {model.details.parameter_size || 'N/A'}
              </span>
            </td>
            <td className="px-4 py-3">
              <span className="text-sm text-text-secondary">
                {formatBytes(model.size)}
              </span>
            </td>
            <td className="px-4 py-3 text-right">
              <StyledButton
                variant="danger"
                size="sm"
                onClick={() => confirmDeleteModel(model.name)}
                icon="IconTrash"
              >
                Delete
              </StyledButton>
            </td>
          </tr>
        ))}
      </tbody>
    </table>
  )}
</div>

<StyledSectionHeader title="Remote Connection" className="mt-8 mb-4" />
<div className="bg-surface-primary rounded-lg border-2 border-border-subtle p-6">
  <p className="text-sm text-text-secondary mb-4">
    Connect to any OpenAI-compatible API server — Ollama, LM Studio, llama.cpp, and others are all supported.
    For remote Ollama instances, the host must be started with <code className="bg-surface-secondary px-1 rounded">OLLAMA_HOST=0.0.0.0</code>.
  </p>
  <div className="flex items-end gap-3">
    <div className="flex-1">
      <Input
        name="remoteOllamaUrl"
        label="Remote Ollama/OpenAI API URL"
        placeholder="http://192.168.1.100:11434 (or :1234 for OpenAI API Compatible Apps)"
        value={remoteOllamaUrl}
        onChange={(e) => {
          setRemoteOllamaUrl(e.target.value)
          setRemoteOllamaError(null)
        }}
      />
      {remoteOllamaError && (
        <p className="text-sm text-red-600 mt-1">{remoteOllamaError}</p>
      )}
    </div>
    <StyledButton
      variant="primary"
      onClick={handleSaveRemoteOllama}
      loading={remoteOllamaSaving}
      disabled={remoteOllamaSaving || !remoteOllamaUrl}
      className="mb-0.5"
    >
      Save & Test
    </StyledButton>
    {props.models.settings.remoteOllamaUrl && (
      <StyledButton
        variant="danger"
        onClick={handleClearRemoteOllama}
        loading={remoteOllamaSaving}
        disabled={remoteOllamaSaving}
        className="mb-0.5"
      >
        Clear
      </StyledButton>
    )}
  </div>
</div>

<ActiveModelDownloads withHeader />

<StyledSectionHeader title="Models" className="mt-12 mb-4" />
<Alert
  type="info"
  variant="bordered"
  title="Model downloading is only supported when using a Ollama backend."
  message="If you are connected to an OpenAI API host (e.g. LM Studio), please download models directly in that application."
  className="mb-4"
/>
<div className="flex justify-start items-center gap-3 mt-4">
  <Input
    name="search"
@@ -42,7 +42,7 @@ export default function SupportPage() {
  className="block mb-4 rounded-lg overflow-hidden hover:opacity-90 transition-opacity"
>
  <img
    src="/rogue-support-banner.webp"
    src="/rogue-support-banner.png"
    alt="Rogue Support — Conquer Your Home Network"
    className="w-full"
  />
341 admin/package-lock.json (generated)
@ -42,21 +42,18 @@
|
|||
"better-sqlite3": "^12.1.1",
|
||||
"bullmq": "^5.65.1",
|
||||
"cheerio": "^1.2.0",
|
||||
"compression": "^1.8.1",
|
||||
"dockerode": "^4.0.7",
|
||||
"edge.js": "^6.2.1",
|
||||
"fast-xml-parser": "^5.5.7",
|
||||
"fast-xml-parser": "^5.5.6",
|
||||
"fuse.js": "^7.1.0",
|
||||
"jszip": "^3.10.1",
|
||||
"luxon": "^3.6.1",
|
||||
"maplibre-gl": "^4.7.1",
|
||||
"mysql2": "^3.14.1",
|
||||
"ollama": "^0.6.3",
|
||||
"openai": "^6.27.0",
|
||||
"pdf-parse": "^2.4.5",
|
||||
"pdf2pic": "^3.2.0",
|
||||
"pino-pretty": "^13.0.0",
|
||||
"pmtiles": "^4.4.0",
|
||||
"pmtiles": "^4.3.0",
|
||||
"postcss": "^8.5.6",
|
||||
"react": "^19.1.0",
|
||||
"react-adonis-transmit": "^1.0.1",
|
||||
|
|
@ -68,11 +65,11 @@
|
|||
"sharp": "^0.34.5",
|
||||
"stopword": "^3.1.5",
|
||||
"systeminformation": "^5.31.0",
|
||||
"tailwindcss": "^4.2.1",
|
||||
"tailwindcss": "^4.1.10",
|
||||
"tar": "^7.5.11",
|
||||
"tesseract.js": "^7.0.0",
|
||||
"url-join": "^5.0.0",
|
||||
"yaml": "^2.8.3"
|
||||
"yaml": "^2.8.0"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@adonisjs/assembler": "^7.8.2",
|
||||
|
|
@ -84,8 +81,7 @@
|
|||
"@japa/runner": "^4.2.0",
|
||||
"@swc/core": "1.11.24",
|
||||
"@tanstack/eslint-plugin-query": "^5.81.2",
|
||||
"@types/compression": "^1.8.1",
|
||||
"@types/dockerode": "^4.0.1",
|
||||
"@types/dockerode": "^3.3.41",
|
||||
"@types/luxon": "^3.6.2",
|
||||
"@types/node": "^22.15.18",
|
||||
"@types/react": "^19.1.8",
|
||||
|
|
@ -4613,12 +4609,6 @@
|
|||
"tailwindcss": "4.1.18"
|
||||
}
|
||||
},
|
||||
"node_modules/@tailwindcss/node/node_modules/tailwindcss": {
|
||||
"version": "4.1.18",
|
||||
"resolved": "https://registry.npmjs.org/tailwindcss/-/tailwindcss-4.1.18.tgz",
|
||||
"integrity": "sha512-4+Z+0yiYyEtUVCScyfHCxOYP06L5Ne+JiHhY2IjR2KWMIWhJOYZKLSGZaP5HkZ8+bY0cxfzwDE5uOmzFXyIwxw==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/@tailwindcss/oxide": {
|
||||
"version": "4.1.18",
|
||||
"resolved": "https://registry.npmjs.org/@tailwindcss/oxide/-/oxide-4.1.18.tgz",
|
||||
|
|
@ -4915,12 +4905,6 @@
|
|||
"vite": "^5.2.0 || ^6 || ^7"
|
||||
}
|
||||
},
|
||||
"node_modules/@tailwindcss/vite/node_modules/tailwindcss": {
|
||||
"version": "4.1.18",
|
||||
"resolved": "https://registry.npmjs.org/tailwindcss/-/tailwindcss-4.1.18.tgz",
|
||||
"integrity": "sha512-4+Z+0yiYyEtUVCScyfHCxOYP06L5Ne+JiHhY2IjR2KWMIWhJOYZKLSGZaP5HkZ8+bY0cxfzwDE5uOmzFXyIwxw==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/@tanstack/eslint-plugin-query": {
|
||||
"version": "5.91.4",
|
||||
"resolved": "https://registry.npmjs.org/@tanstack/eslint-plugin-query/-/eslint-plugin-query-5.91.4.tgz",
|
||||
|
|
@ -5146,17 +5130,6 @@
|
|||
"@types/node": "*"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/body-parser": {
|
||||
"version": "1.19.6",
|
||||
"resolved": "https://registry.npmjs.org/@types/body-parser/-/body-parser-1.19.6.tgz",
|
||||
"integrity": "sha512-HLFeCYgz89uk22N5Qg3dvGvsv46B8GLvKKo1zKG4NybA8U2DiEO3w9lqGg29t/tfLRJpJ6iQxnVw4OnB7MoM9g==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@types/connect": "*",
|
||||
"@types/node": "*"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/chai": {
|
||||
"version": "5.2.3",
|
||||
"resolved": "https://registry.npmjs.org/@types/chai/-/chai-5.2.3.tgz",
|
||||
|
|
@ -5168,27 +5141,6 @@
|
|||
"assertion-error": "^2.0.1"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/compression": {
|
||||
"version": "1.8.1",
|
||||
"resolved": "https://registry.npmjs.org/@types/compression/-/compression-1.8.1.tgz",
|
||||
"integrity": "sha512-kCFuWS0ebDbmxs0AXYn6e2r2nrGAb5KwQhknjSPSPgJcGd8+HVSILlUyFhGqML2gk39HcG7D1ydW9/qpYkN00Q==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@types/express": "*",
|
||||
"@types/node": "*"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/connect": {
|
||||
"version": "3.4.38",
|
||||
"resolved": "https://registry.npmjs.org/@types/connect/-/connect-3.4.38.tgz",
|
||||
"integrity": "sha512-K6uROf1LD88uDQqJCktA4yzL1YYAK6NgfsI0v/mTgyPKWsX1CnJ0XPSDhViejru1GcRkLWb8RlzFYJRqGUbaug==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@types/node": "*"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/debug": {
|
||||
"version": "4.1.12",
|
||||
"resolved": "https://registry.npmjs.org/@types/debug/-/debug-4.1.12.tgz",
|
||||
|
|
@ -5217,9 +5169,9 @@
|
|||
}
|
||||
},
|
||||
"node_modules/@types/dockerode": {
|
||||
"version": "4.0.1",
|
||||
"resolved": "https://registry.npmjs.org/@types/dockerode/-/dockerode-4.0.1.tgz",
|
||||
"integrity": "sha512-cmUpB+dPN955PxBEuXE3f6lKO1hHiIGYJA46IVF3BJpNsZGvtBDcRnlrHYHtOH/B6vtDOyl2kZ2ShAu3mgc27Q==",
|
||||
"version": "3.3.47",
|
||||
"resolved": "https://registry.npmjs.org/@types/dockerode/-/dockerode-3.3.47.tgz",
|
||||
"integrity": "sha512-ShM1mz7rCjdssXt7Xz0u1/R2BJC7piWa3SJpUBiVjCf2A3XNn4cP6pUVaD8bLanpPVVn4IKzJuw3dOvkJ8IbYw==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
|
|
@ -5243,31 +5195,6 @@
|
|||
"@types/estree": "*"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/express": {
|
||||
"version": "5.0.6",
|
||||
"resolved": "https://registry.npmjs.org/@types/express/-/express-5.0.6.tgz",
|
||||
"integrity": "sha512-sKYVuV7Sv9fbPIt/442koC7+IIwK5olP1KWeD88e/idgoJqDm3JV/YUiPwkoKK92ylff2MGxSz1CSjsXelx0YA==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@types/body-parser": "*",
|
||||
"@types/express-serve-static-core": "^5.0.0",
|
||||
"@types/serve-static": "^2"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/express-serve-static-core": {
|
||||
"version": "5.1.1",
|
||||
"resolved": "https://registry.npmjs.org/@types/express-serve-static-core/-/express-serve-static-core-5.1.1.tgz",
|
||||
"integrity": "sha512-v4zIMr/cX7/d2BpAEX3KNKL/JrT1s43s96lLvvdTmza1oEvDudCqK9aF/djc/SWgy8Yh0h30TZx5VpzqFCxk5A==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@types/node": "*",
|
||||
"@types/qs": "*",
|
||||
"@types/range-parser": "*",
|
||||
"@types/send": "*"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/geojson": {
|
||||
"version": "7946.0.16",
|
||||
"resolved": "https://registry.npmjs.org/@types/geojson/-/geojson-7946.0.16.tgz",
|
||||
|
|
@ -5298,13 +5225,6 @@
|
|||
"integrity": "sha512-q67/qwlxblDzEDvzHhVkwc1gzVWxaNxeyHUBF4xElrvjL11O+Ytze+1fGpBHlr/H9myiBUaUXNnNPmBHxxfAcA==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/@types/http-errors": {
|
||||
"version": "2.0.5",
|
||||
"resolved": "https://registry.npmjs.org/@types/http-errors/-/http-errors-2.0.5.tgz",
|
||||
"integrity": "sha512-r8Tayk8HJnX0FztbZN7oVqGccWgw98T/0neJphO91KkmOzug1KkofZURD4UaD5uH8AqcFLfdPErnBod0u71/qg==",
|
||||
"dev": true,
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/@types/istanbul-lib-coverage": {
|
||||
"version": "2.0.6",
|
||||
"resolved": "https://registry.npmjs.org/@types/istanbul-lib-coverage/-/istanbul-lib-coverage-2.0.6.tgz",
|
||||
|
|
@ -5459,13 +5379,6 @@
|
|||
"integrity": "sha512-eOunJqu0K1923aExK6y8p6fsihYEn/BYuQ4g0CxAAgFc4b/ZLN4CrsRZ55srTdqoiLzU2B2evC+apEIxprEzkQ==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/@types/range-parser": {
|
||||
"version": "1.2.7",
|
||||
"resolved": "https://registry.npmjs.org/@types/range-parser/-/range-parser-1.2.7.tgz",
|
||||
"integrity": "sha512-hKormJbkJqzQGhziax5PItDUTMAM9uE2XXQmM37dyd4hVM+5aVl7oVxMVUiVQn2oCQFN/LKCZdvSM0pFRqbSmQ==",
|
||||
"dev": true,
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/@types/react": {
|
||||
"version": "19.2.10",
|
||||
"resolved": "https://registry.npmjs.org/@types/react/-/react-19.2.10.tgz",
|
||||
|
|
@ -5485,27 +5398,6 @@
|
|||
"@types/react": "^19.2.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/send": {
|
||||
"version": "1.2.1",
|
||||
"resolved": "https://registry.npmjs.org/@types/send/-/send-1.2.1.tgz",
|
||||
"integrity": "sha512-arsCikDvlU99zl1g69TcAB3mzZPpxgw0UQnaHeC1Nwb015xp8bknZv5rIfri9xTOcMuaVgvabfIRA7PSZVuZIQ==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@types/node": "*"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/serve-static": {
|
||||
"version": "2.2.0",
|
||||
"resolved": "https://registry.npmjs.org/@types/serve-static/-/serve-static-2.2.0.tgz",
|
||||
"integrity": "sha512-8mam4H1NHLtu7nmtalF7eyBH14QyOASmcxHhSfEoRyr0nP/YdoesEtU+uSRvMe96TW/HPTtkoKqQLl53N7UXMQ==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@types/http-errors": "*",
|
||||
"@types/node": "*"
|
||||
}
|
||||
},
|
||||
"node_modules/@types/ssh2": {
|
||||
"version": "1.15.5",
|
||||
"resolved": "https://registry.npmjs.org/@types/ssh2/-/ssh2-1.15.5.tgz",
|
||||
|
|
@ -7317,60 +7209,6 @@
|
|||
"devOptional": true,
|
||||
      "license": "ISC"
    },
    "node_modules/compressible": {
      "version": "2.0.18",
      "resolved": "https://registry.npmjs.org/compressible/-/compressible-2.0.18.tgz",
      "integrity": "sha512-AF3r7P5dWxL8MxyITRMlORQNaOA2IkAFaTr4k7BUumjPtRpGDTZpl0Pb1XCO6JeDCBdp126Cgs9sMxqSjgYyRg==",
      "license": "MIT",
      "dependencies": {
        "mime-db": ">= 1.43.0 < 2"
      },
      "engines": {
        "node": ">= 0.6"
      }
    },
    "node_modules/compression": {
      "version": "1.8.1",
      "resolved": "https://registry.npmjs.org/compression/-/compression-1.8.1.tgz",
      "integrity": "sha512-9mAqGPHLakhCLeNyxPkK4xVo746zQ/czLH1Ky+vkitMnWfWZps8r0qXuwhwizagCRttsL4lfG4pIOvaWLpAP0w==",
      "license": "MIT",
      "dependencies": {
        "bytes": "3.1.2",
        "compressible": "~2.0.18",
        "debug": "2.6.9",
        "negotiator": "~0.6.4",
        "on-headers": "~1.1.0",
        "safe-buffer": "5.2.1",
        "vary": "~1.1.2"
      },
      "engines": {
        "node": ">= 0.8.0"
      }
    },
    "node_modules/compression/node_modules/debug": {
      "version": "2.6.9",
      "resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
      "integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==",
      "license": "MIT",
      "dependencies": {
        "ms": "2.0.0"
      }
    },
    "node_modules/compression/node_modules/ms": {
      "version": "2.0.0",
      "resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
      "integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A==",
      "license": "MIT"
    },
    "node_modules/compression/node_modules/negotiator": {
      "version": "0.6.4",
      "resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.4.tgz",
      "integrity": "sha512-myRT3DiWPHqho5PrJaIRyaMv2kgYf0mUVgBNOYMuCH5Ki1yEiQaf/ZJuQ62nvpc44wL5WDbTX7yGJi1Neevw8w==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.6"
      }
    },
    "node_modules/concat-map": {
      "version": "0.0.1",
      "resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
@@ -7469,12 +7307,6 @@
        "url": "https://opencollective.com/core-js"
      }
    },
    "node_modules/core-util-is": {
      "version": "1.0.3",
      "resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.3.tgz",
      "integrity": "sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ==",
      "license": "MIT"
    },
    "node_modules/cpu-features": {
      "version": "0.0.10",
      "resolved": "https://registry.npmjs.org/cpu-features/-/cpu-features-0.0.10.tgz",

@@ -8803,9 +8635,9 @@
      }
    },
    "node_modules/fast-xml-parser": {
      "version": "5.5.9",
      "resolved": "https://registry.npmjs.org/fast-xml-parser/-/fast-xml-parser-5.5.9.tgz",
      "integrity": "sha512-jldvxr1MC6rtiZKgrFnDSvT8xuH+eJqxqOBThUVjYrxssYTo1avZLGql5l0a0BAERR01CadYzZ83kVEkbyDg+g==",
      "version": "5.5.6",
      "resolved": "https://registry.npmjs.org/fast-xml-parser/-/fast-xml-parser-5.5.6.tgz",
      "integrity": "sha512-3+fdZyBRVg29n4rXP0joHthhcHdPUHaIC16cuyyd1iLsuaO6Vea36MPrxgAzbZna8lhvZeRL8Bc9GP56/J9xEw==",
      "funding": [
        {
          "type": "github",
@@ -8815,8 +8647,8 @@
      "license": "MIT",
      "dependencies": {
        "fast-xml-builder": "^1.1.4",
        "path-expression-matcher": "^1.2.0",
        "strnum": "^2.2.2"
        "path-expression-matcher": "^1.1.3",
        "strnum": "^2.1.2"
      },
      "bin": {
        "fxparser": "src/cli/cli.js"

@@ -8894,9 +8726,9 @@
      }
    },
    "node_modules/file-type": {
      "version": "21.3.2",
      "resolved": "https://registry.npmjs.org/file-type/-/file-type-21.3.2.tgz",
      "integrity": "sha512-DLkUvGwep3poOV2wpzbHCOnSKGk1LzyXTv+aHFgN2VFl96wnp8YA9YjO2qPzg5PuL8q/SW9Pdi6WTkYOIh995w==",
      "version": "21.3.0",
      "resolved": "https://registry.npmjs.org/file-type/-/file-type-21.3.0.tgz",
      "integrity": "sha512-8kPJMIGz1Yt/aPEwOsrR97ZyZaD1Iqm8PClb1nYFclUCkBi0Ma5IsYNQzvSFS9ib51lWyIw5mIT9rWzI/xjpzA==",
      "license": "MIT",
      "dependencies": {
        "@tokenizer/inflate": "^0.4.1",
@@ -9978,12 +9810,6 @@
        "node": ">= 4"
      }
    },
    "node_modules/immediate": {
      "version": "3.0.6",
      "resolved": "https://registry.npmjs.org/immediate/-/immediate-3.0.6.tgz",
      "integrity": "sha512-XXOFtyqDjNDAQxVfYxuF7g9Il/IbWmmlQg2MYKOH8ExIT1qg6xc4zyS3HaEEATgs1btfzxq15ciUiY7gjSXRGQ==",
      "license": "MIT"
    },
    "node_modules/import-fresh": {
      "version": "3.3.1",
      "resolved": "https://registry.npmjs.org/import-fresh/-/import-fresh-3.3.1.tgz",

@@ -10308,12 +10134,6 @@
      "integrity": "sha512-ITvGim8FhRiYe4IQ5uHSkj7pVaPDrCTkNd3yq3cV7iZAcJdHTUMPMEHcqSOy9xZ9qFenQCvi+2wjH9a1nXqHww==",
      "license": "MIT"
    },
    "node_modules/isarray": {
      "version": "1.0.0",
      "resolved": "https://registry.npmjs.org/isarray/-/isarray-1.0.0.tgz",
      "integrity": "sha512-VLghIWNM6ELQzo7zwmcg0NmTVyWKYjvIeM83yjp0wRDTmUnrM678fQbcKBo6n2CJEF0szoG//ytg+TKla89ALQ==",
      "license": "MIT"
    },
    "node_modules/isexe": {
      "version": "2.0.0",
      "resolved": "https://registry.npmjs.org/isexe/-/isexe-2.0.0.tgz",
@@ -10546,48 +10366,6 @@
        "node": "*"
      }
    },
    "node_modules/jszip": {
      "version": "3.10.1",
      "resolved": "https://registry.npmjs.org/jszip/-/jszip-3.10.1.tgz",
      "integrity": "sha512-xXDvecyTpGLrqFrvkrUSoxxfJI5AH7U8zxxtVclpsUtMCq4JQ290LY8AW5c7Ggnr/Y/oK+bQMbqK2qmtk3pN4g==",
      "license": "(MIT OR GPL-3.0-or-later)",
      "dependencies": {
        "lie": "~3.3.0",
        "pako": "~1.0.2",
        "readable-stream": "~2.3.6",
        "setimmediate": "^1.0.5"
      }
    },
    "node_modules/jszip/node_modules/readable-stream": {
      "version": "2.3.8",
      "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz",
      "integrity": "sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA==",
      "license": "MIT",
      "dependencies": {
        "core-util-is": "~1.0.0",
        "inherits": "~2.0.3",
        "isarray": "~1.0.0",
        "process-nextick-args": "~2.0.0",
        "safe-buffer": "~5.1.1",
        "string_decoder": "~1.1.1",
        "util-deprecate": "~1.0.1"
      }
    },
    "node_modules/jszip/node_modules/safe-buffer": {
      "version": "5.1.2",
      "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz",
      "integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==",
      "license": "MIT"
    },
    "node_modules/jszip/node_modules/string_decoder": {
      "version": "1.1.1",
      "resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.1.1.tgz",
      "integrity": "sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==",
      "license": "MIT",
      "dependencies": {
        "safe-buffer": "~5.1.0"
      }
    },
    "node_modules/junk": {
      "version": "4.0.1",
      "resolved": "https://registry.npmjs.org/junk/-/junk-4.0.1.tgz",
@@ -10755,15 +10533,6 @@
        "node": ">= 0.8.0"
      }
    },
    "node_modules/lie": {
      "version": "3.3.0",
      "resolved": "https://registry.npmjs.org/lie/-/lie-3.3.0.tgz",
      "integrity": "sha512-UaiMJzeWRlEujzAuw5LokY1L5ecNQYZKfmyZ9L7wDHb/p5etKaxXhohBcrw0EYby+G/NA52vRSN4N39dxHAIwQ==",
      "license": "MIT",
      "dependencies": {
        "immediate": "~3.0.5"
      }
    },
    "node_modules/lightningcss": {
      "version": "1.30.2",
      "resolved": "https://registry.npmjs.org/lightningcss/-/lightningcss-1.30.2.tgz",

@@ -12847,15 +12616,6 @@
        "node": ">= 0.8"
      }
    },
    "node_modules/on-headers": {
      "version": "1.1.0",
      "resolved": "https://registry.npmjs.org/on-headers/-/on-headers-1.1.0.tgz",
      "integrity": "sha512-737ZY3yNnXy37FHkQxPzt4UZ2UWPWiCZWLvFZ4fu5cueciegX0zGPnrlY6bwRg4FdQOe9YU8MkmJwGhoMybl8A==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.8"
      }
    },
    "node_modules/once": {
      "version": "1.4.0",
      "resolved": "https://registry.npmjs.org/once/-/once-1.4.0.tgz",
@@ -12880,27 +12640,6 @@
        "url": "https://github.com/sponsors/sindresorhus"
      }
    },
    "node_modules/openai": {
      "version": "6.27.0",
      "resolved": "https://registry.npmjs.org/openai/-/openai-6.27.0.tgz",
      "integrity": "sha512-osTKySlrdYrLYTt0zjhY8yp0JUBmWDCN+Q+QxsV4xMQnnoVFpylgKGgxwN8sSdTNw0G4y+WUXs4eCMWpyDNWZQ==",
      "license": "Apache-2.0",
      "bin": {
        "openai": "bin/cli"
      },
      "peerDependencies": {
        "ws": "^8.18.0",
        "zod": "^3.25 || ^4.0"
      },
      "peerDependenciesMeta": {
        "ws": {
          "optional": true
        },
        "zod": {
          "optional": true
        }
      }
    },
    "node_modules/opencollective-postinstall": {
      "version": "2.0.3",
      "resolved": "https://registry.npmjs.org/opencollective-postinstall/-/opencollective-postinstall-2.0.3.tgz",

@@ -13046,12 +12785,6 @@
        "quansync": "^0.2.7"
      }
    },
    "node_modules/pako": {
      "version": "1.0.11",
      "resolved": "https://registry.npmjs.org/pako/-/pako-1.0.11.tgz",
      "integrity": "sha512-4hLB8Py4zZce5s4yd9XzopqwVv/yGNhV1Bl8NTmCq1763HeK2+EwVTv+leGeL13Dnh2wfbqowVPXCIO0z4taYw==",
      "license": "(MIT AND Zlib)"
    },
    "node_modules/parent-module": {
      "version": "1.0.1",
      "resolved": "https://registry.npmjs.org/parent-module/-/parent-module-1.0.1.tgz",

@@ -13223,9 +12956,9 @@
      }
    },
    "node_modules/path-expression-matcher": {
      "version": "1.2.0",
      "resolved": "https://registry.npmjs.org/path-expression-matcher/-/path-expression-matcher-1.2.0.tgz",
      "integrity": "sha512-DwmPWeFn+tq7TiyJ2CxezCAirXjFxvaiD03npak3cRjlP9+OjTmSy1EpIrEbh+l6JgUundniloMLDQ/6VTdhLQ==",
      "version": "1.1.3",
      "resolved": "https://registry.npmjs.org/path-expression-matcher/-/path-expression-matcher-1.1.3.tgz",
      "integrity": "sha512-qdVgY8KXmVdJZRSS1JdEPOKPdTiEK/pi0RkcT2sw1RhXxohdujUlJFPuS1TSkevZ9vzd3ZlL7ULl1MHGTApKzQ==",
      "funding": [
        {
          "type": "github",
@@ -13471,9 +13204,9 @@
      }
    },
    "node_modules/pmtiles": {
      "version": "4.4.0",
      "resolved": "https://registry.npmjs.org/pmtiles/-/pmtiles-4.4.0.tgz",
      "integrity": "sha512-tCLI1C5134MR54i8izUWhse0QUtO/EC33n9yWp1N5dYLLvyc197U0fkF5gAJhq1TdWO9Tvl+9hgvFvM0fR27Zg==",
      "version": "4.3.2",
      "resolved": "https://registry.npmjs.org/pmtiles/-/pmtiles-4.3.2.tgz",
      "integrity": "sha512-Ath2F2U2E37QyNXjN1HOF+oLiNIbdrDYrk/K3C9K4Pgw2anwQX10y4WYWEH9O75vPiu0gBbSWIAbSG19svyvZg==",
      "license": "BSD-3-Clause",
      "dependencies": {
        "fflate": "^0.8.2"

@@ -13703,12 +13436,6 @@
        "node": "^18.17.0 || >=20.5.0"
      }
    },
    "node_modules/process-nextick-args": {
      "version": "2.0.1",
      "resolved": "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-2.0.1.tgz",
      "integrity": "sha512-3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag==",
      "license": "MIT"
    },
    "node_modules/process-warning": {
      "version": "5.0.0",
      "resolved": "https://registry.npmjs.org/process-warning/-/process-warning-5.0.0.tgz",

@@ -14674,12 +14401,6 @@
        "node": ">=0.10.0"
      }
    },
    "node_modules/setimmediate": {
      "version": "1.0.5",
      "resolved": "https://registry.npmjs.org/setimmediate/-/setimmediate-1.0.5.tgz",
      "integrity": "sha512-MATJdZp8sLqDl/68LfQmbP8zKPLQNV6BIZoIgrscFDQ+RsvK/BxeDQOgyxKKoh0y/8h3BqVFnCqQ/gd+reiIXA==",
      "license": "MIT"
    },
    "node_modules/setprototypeof": {
      "version": "1.2.0",
      "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.2.0.tgz",

@@ -15427,9 +15148,9 @@
      }
    },
    "node_modules/strnum": {
      "version": "2.2.2",
      "resolved": "https://registry.npmjs.org/strnum/-/strnum-2.2.2.tgz",
      "integrity": "sha512-DnR90I+jtXNSTXWdwrEy9FakW7UX+qUZg28gj5fk2vxxl7uS/3bpI4fjFYVmdK9etptYBPNkpahuQnEwhwECqA==",
      "version": "2.1.2",
      "resolved": "https://registry.npmjs.org/strnum/-/strnum-2.1.2.tgz",
      "integrity": "sha512-l63NF9y/cLROq/yqKXSLtcMeeyOfnSQlfMSlzFt/K73oIaD8DGaQWd7Z34X9GPiKqP5rbSh84Hl4bOlLcjiSrQ==",
      "funding": [
        {
          "type": "github",
@@ -15566,9 +15287,9 @@
      }
    },
    "node_modules/tailwindcss": {
      "version": "4.2.2",
      "resolved": "https://registry.npmjs.org/tailwindcss/-/tailwindcss-4.2.2.tgz",
      "integrity": "sha512-KWBIxs1Xb6NoLdMVqhbhgwZf2PGBpPEiwOqgI4pFIYbNTfBXiKYyWoTsXgBQ9WFg/OlhnvHaY+AEpW7wSmFo2Q==",
      "version": "4.1.18",
      "resolved": "https://registry.npmjs.org/tailwindcss/-/tailwindcss-4.1.18.tgz",
      "integrity": "sha512-4+Z+0yiYyEtUVCScyfHCxOYP06L5Ne+JiHhY2IjR2KWMIWhJOYZKLSGZaP5HkZ8+bY0cxfzwDE5uOmzFXyIwxw==",
      "license": "MIT"
    },
    "node_modules/tapable": {

@@ -16748,9 +16469,9 @@
      "license": "ISC"
    },
    "node_modules/yaml": {
      "version": "2.8.3",
      "resolved": "https://registry.npmjs.org/yaml/-/yaml-2.8.3.tgz",
      "integrity": "sha512-AvbaCLOO2Otw/lW5bmh9d/WEdcDFdQp2Z2ZUH3pX9U2ihyUY0nvLv7J6TrWowklRGPYbB/IuIMfYgxaCPg5Bpg==",
      "version": "2.8.2",
      "resolved": "https://registry.npmjs.org/yaml/-/yaml-2.8.2.tgz",
      "integrity": "sha512-mplynKqc1C2hTVYxd0PU2xQAc22TI1vShAYGksCCfxbn/dFwnHTNi1bvYsBTkhdUNtGIf5xNOg938rrSSYvS9A==",
      "license": "ISC",
      "bin": {
        "yaml": "bin.mjs"
@@ -47,8 +47,7 @@
    "@japa/runner": "^4.2.0",
    "@swc/core": "1.11.24",
    "@tanstack/eslint-plugin-query": "^5.81.2",
    "@types/compression": "^1.8.1",
    "@types/dockerode": "^4.0.1",
    "@types/dockerode": "^3.3.41",
    "@types/luxon": "^3.6.2",
    "@types/node": "^22.15.18",
    "@types/react": "^19.1.8",

@@ -95,21 +94,18 @@
    "better-sqlite3": "^12.1.1",
    "bullmq": "^5.65.1",
    "cheerio": "^1.2.0",
    "compression": "^1.8.1",
    "dockerode": "^4.0.7",
    "edge.js": "^6.2.1",
    "fast-xml-parser": "^5.5.7",
    "fast-xml-parser": "^5.5.6",
    "fuse.js": "^7.1.0",
    "jszip": "^3.10.1",
    "luxon": "^3.6.1",
    "maplibre-gl": "^4.7.1",
    "mysql2": "^3.14.1",
    "ollama": "^0.6.3",
    "openai": "^6.27.0",
    "pdf-parse": "^2.4.5",
    "pdf2pic": "^3.2.0",
    "pino-pretty": "^13.0.0",
    "pmtiles": "^4.4.0",
    "pmtiles": "^4.3.0",
    "postcss": "^8.5.6",
    "react": "^19.1.0",
    "react-adonis-transmit": "^1.0.1",
@@ -121,11 +117,11 @@
    "sharp": "^0.34.5",
    "stopword": "^3.1.5",
    "systeminformation": "^5.31.0",
    "tailwindcss": "^4.2.1",
    "tailwindcss": "^4.1.10",
    "tar": "^7.5.11",
    "tesseract.js": "^7.0.0",
    "url-join": "^5.0.0",
    "yaml": "^2.8.3"
    "yaml": "^2.8.0"
  },
  "hotHook": {
    "boundaries": [
@@ -1,53 +0,0 @@
import logger from '@adonisjs/core/services/logger'
import type { ApplicationService } from '@adonisjs/core/types'

/**
 * Checks whether the installed kiwix container is still using the legacy glob-pattern
 * command (`*.zim --address=all`) and, if so, migrates it to library mode
 * (`--library /data/kiwix-library.xml --monitorLibrary --address=all`) automatically.
 *
 * This provider runs once on every admin startup. After migration the check is a no-op
 * (inspects the container and finds the new command).
 */
export default class KiwixMigrationProvider {
  constructor(protected app: ApplicationService) {}

  async boot() {
    // Only run in the web (HTTP server) environment — skip for ace commands and tests
    if (this.app.getEnvironment() !== 'web') return

    // Defer past synchronous boot so DB connections and all providers are fully ready
    setImmediate(async () => {
      try {
        const Service = (await import('#models/service')).default
        const { SERVICE_NAMES } = await import('../constants/service_names.js')
        const { DockerService } = await import('#services/docker_service')

        const kiwixService = await Service.query()
          .where('service_name', SERVICE_NAMES.KIWIX)
          .first()

        if (!kiwixService?.installed) {
          logger.info('[KiwixMigrationProvider] Kiwix not installed — skipping migration check.')
          return
        }

        const dockerService = new DockerService()
        const isLegacy = await dockerService.isKiwixOnLegacyConfig()

        if (!isLegacy) {
          logger.info('[KiwixMigrationProvider] Kiwix is already in library mode — no migration needed.')
          return
        }

        logger.info('[KiwixMigrationProvider] Kiwix on legacy config — running automatic migration to library mode.')
        await dockerService.migrateKiwixToLibraryMode()
        logger.info('[KiwixMigrationProvider] Startup migration complete.')
      } catch (err: any) {
        logger.error(`[KiwixMigrationProvider] Startup migration failed: ${err.message}`)
        // Non-fatal: the next affectContainer('restart') call will retry via the
        // intercept in DockerService.affectContainer().
      }
    })
  }
}
BIN  admin/public/docs/ai-chat.png  (after: 83 KiB, before: 15 KiB)
BIN  admin/public/docs/benchmark.png  (after: 140 KiB, before: 23 KiB)
BIN  admin/public/docs/content-explorer.png  (after: 163 KiB, before: 31 KiB)
BIN  admin/public/docs/dashboard.png  (after: 139 KiB, before: 27 KiB)
BIN  admin/public/docs/easy-setup-step1.png  (after: 214 KiB, before: 40 KiB)
BIN  admin/public/docs/easy-setup-tiers.png  (after: 241 KiB, before: 45 KiB)
BIN  admin/public/docs/knowledge-base.png  (after: 83 KiB, before: 17 KiB)
BIN  admin/public/docs/maps.png  (after: 400 KiB, before: 33 KiB)
BIN  admin/public/powered_by_crosstalk.png  (after: 17 KiB, before: 4.1 KiB)
BIN  admin/public/project_nomad_logo.png  (after: 952 KiB, before: 51 KiB)
BIN  admin/public/rogue-support-banner.png  (after: 251 KiB, before: 48 KiB)
@@ -19,7 +19,6 @@ export default await Env.create(new URL('../', import.meta.url), {
  URL: Env.schema.string(),
  LOG_LEVEL: Env.schema.string(),
  INTERNET_STATUS_TEST_URL: Env.schema.string.optional(),
  DISABLE_COMPRESSION: Env.schema.boolean.optional(),

  /*
  |----------------------------------------------------------
@@ -39,7 +39,6 @@ router.use([
  () => import('@adonisjs/core/bodyparser_middleware'),
  // () => import('@adonisjs/session/session_middleware'),
  () => import('@adonisjs/shield/shield_middleware'),
  () => import('#middleware/compression_middleware'),
])

/**
@@ -78,12 +78,6 @@ router
    router.post('/download-remote', [MapsController, 'downloadRemote'])
    router.post('/download-remote-preflight', [MapsController, 'downloadRemotePreflight'])
    router.post('/download-collection', [MapsController, 'downloadCollection'])
    router.get('/global-map-info', [MapsController, 'globalMapInfo'])
    router.post('/download-global-map', [MapsController, 'downloadGlobalMap'])
    router.get('/markers', [MapsController, 'listMarkers'])
    router.post('/markers', [MapsController, 'createMarker'])
    router.patch('/markers/:id', [MapsController, 'updateMarker'])
    router.delete('/markers/:id', [MapsController, 'deleteMarker'])
    router.delete('/:filename', [MapsController, 'delete'])
  })
  .prefix('/api/maps')
@@ -99,7 +93,6 @@ router
    router.get('/jobs', [DownloadsController, 'index'])
    router.get('/jobs/:filetype', [DownloadsController, 'filetype'])
    router.delete('/jobs/:jobId', [DownloadsController, 'removeJob'])
    router.post('/jobs/:jobId/cancel', [DownloadsController, 'cancelJob'])
  })
  .prefix('/api/downloads')
@@ -114,8 +107,6 @@ router
    router.post('/models', [OllamaController, 'dispatchModelDownload'])
    router.delete('/models', [OllamaController, 'deleteModel'])
    router.get('/installed-models', [OllamaController, 'installedModels'])
    router.post('/configure-remote', [OllamaController, 'configureRemote'])
    router.get('/remote-status', [OllamaController, 'remoteStatus'])
  })
  .prefix('/api/ollama')
@@ -139,8 +130,6 @@ router
    router.get('/files', [RagController, 'getStoredFiles'])
    router.delete('/files', [RagController, 'deleteFile'])
    router.get('/active-jobs', [RagController, 'getActiveJobs'])
    router.get('/failed-jobs', [RagController, 'getFailedJobs'])
    router.delete('/failed-jobs', [RagController, 'cleanupFailedJobs'])
    router.get('/job-status', [RagController, 'getJobStatus'])
    router.post('/sync', [RagController, 'scanAndSync'])
  })
@@ -23,20 +23,11 @@ export type DoResumableDownloadProgress = {
  url: string
}

export type DownloadProgressData = {
  percent: number
  downloadedBytes: number
  totalBytes: number
  lastProgressTime: number
}

export type RunDownloadJobParams = Omit<
  DoResumableDownloadParams,
  'onProgress' | 'onComplete' | 'signal'
> & {
  filetype: string
  title?: string
  totalBytes?: number
  resourceMetadata?: {
    resource_id: string
    version: string
@@ -50,11 +41,7 @@ export type DownloadJobWithProgress = {
  progress: number
  filepath: string
  filetype: string
  title?: string
  downloadedBytes?: number
  totalBytes?: number
  lastProgressTime?: number
  status?: 'active' | 'waiting' | 'delayed' | 'failed'
  status?: 'active' | 'failed'
  failedReason?: string
}
@@ -10,8 +10,6 @@ export const KV_STORE_SCHEMA = {
  'ui.theme': 'string',
  'ai.assistantCustomName': 'string',
  'gpu.type': 'string',
  'ai.remoteOllamaUrl': 'string',
  'ai.ollamaFlashAttention': 'boolean',
} as const

type KVTagToType<T extends string> = T extends 'boolean' ? boolean : string
@@ -44,16 +44,3 @@ export type OllamaChatResponse = {
  }
  done: boolean
}

export type NomadInstalledModel = {
  name: string
  size: number
  digest?: string
  details?: Record<string, any>
}

export type NomadChatResponse = {
  message: { content: string; thinking?: string }
  done: boolean
  model: string
}