mirror of
https://github.com/Crosstalk-Solutions/project-nomad.git
synced 2026-03-29 04:59:26 +02:00
Add MiniMaxService with OpenAI-compatible API integration
- Route MiniMax models (MiniMax-M2.7, MiniMax-M2.7-highspeed) through existing chat controller alongside Ollama
- Cloud models appear in model selector when MINIMAX_API_KEY is set
- Add MINIMAX_API_KEY env var support
- Add 8 unit tests + 3 integration tests
- Update README with MiniMax mention
21 lines
636 B
Plaintext
PORT=8080
HOST=localhost
LOG_LEVEL=info
APP_KEY=some_random_key
NODE_ENV=development
SESSION_DRIVER=cookie
DB_HOST=localhost
DB_PORT=3306
DB_USER=root
DB_DATABASE=nomad
DB_PASSWORD=password
DB_SSL=false
REDIS_HOST=localhost
REDIS_PORT=6379

# Storage path for NOMAD content (ZIM files, maps, etc.)
# On Windows dev, use an absolute path like: C:/nomad-storage
# On Linux production, use: /opt/project-nomad/storage
NOMAD_STORAGE_PATH=/opt/project-nomad/storage

# Optional: MiniMax cloud LLM API key (enables cloud models alongside local Ollama models)
# Get your API key at https://platform.minimax.io
# MINIMAX_API_KEY=your_api_key_here
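The commit message above describes routing MiniMax models through the chat controller via an OpenAI-compatible API whenever MINIMAX_API_KEY is set. A minimal sketch of that idea is below; the function name `buildMiniMaxRequest`, the base URL, and the shapes are illustrative assumptions, not the project's actual MiniMaxService code.

```typescript
// Hypothetical sketch of building an OpenAI-compatible chat request for
// MiniMax. All names here are assumptions for illustration; the real
// MiniMaxService in the repo may differ.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Assumed endpoint; check the MiniMax platform docs for the real base URL.
const MINIMAX_BASE_URL = "https://api.minimax.io/v1";

function buildMiniMaxRequest(model: string, messages: ChatMessage[]) {
  // Cloud models are only offered when the key is present, mirroring the
  // "appear in model selector when MINIMAX_API_KEY is set" behavior above.
  const apiKey = process.env.MINIMAX_API_KEY;
  if (!apiKey) {
    throw new Error("MINIMAX_API_KEY is not set");
  }
  return {
    url: `${MINIMAX_BASE_URL}/chat/completions`,
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    // OpenAI-compatible request body: model id plus message history.
    body: JSON.stringify({ model, messages }),
  };
}
```

Because the request body follows the OpenAI chat-completions shape, the same controller path that serves local Ollama models can dispatch to this builder when the selected model id is a MiniMax one (e.g. `MiniMax-M2.7`).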