project-nomad/admin/.env.example
PR Bot ae856b268e feat: add MiniMax as optional cloud LLM provider
- Add MiniMaxService with OpenAI-compatible API integration
- Route MiniMax models (MiniMax-M2.7, MiniMax-M2.7-highspeed) through
  existing chat controller alongside Ollama
- Cloud models appear in model selector when MINIMAX_API_KEY is set
- Add MINIMAX_API_KEY env var support
- Add 8 unit tests + 3 integration tests
- Update README with MiniMax mention
2026-03-26 12:29:14 +08:00
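The bullets above describe an OpenAI-compatible integration; a rough sketch of what building such a request could look like is below. This is a hypothetical illustration, not the project's actual MiniMaxService: the endpoint URL, type names, and function names are all assumptions.

```typescript
// Hypothetical sketch of an OpenAI-compatible chat request builder for MiniMax.
// Endpoint URL and all identifiers are assumptions, not the project's real code.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

interface ChatRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

function buildMiniMaxRequest(
  apiKey: string,
  model: string,
  messages: ChatMessage[]
): ChatRequest {
  return {
    // Assumed OpenAI-compatible endpoint; consult MiniMax's docs for the real path.
    url: "https://api.minimax.io/v1/chat/completions",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages }),
  };
}
```

A controller could then dispatch it with `fetch(req.url, { method: "POST", headers: req.headers, body: req.body })`, mirroring how requests to a local Ollama instance are sent.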


PORT=8080
HOST=localhost
LOG_LEVEL=info
APP_KEY=some_random_key
NODE_ENV=development
SESSION_DRIVER=cookie
DB_HOST=localhost
DB_PORT=3306
DB_USER=root
DB_DATABASE=nomad
DB_PASSWORD=password
DB_SSL=false
REDIS_HOST=localhost
REDIS_PORT=6379
# Storage path for NOMAD content (ZIM files, maps, etc.)
# On Windows dev, use an absolute path like: C:/nomad-storage
# On Linux production, use: /opt/project-nomad/storage
NOMAD_STORAGE_PATH=/opt/project-nomad/storage
# Optional: MiniMax cloud LLM API key (enables cloud models alongside local Ollama models)
# Get your API key at https://platform.minimax.io
# MINIMAX_API_KEY=your_api_key_here
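Per the commit message, cloud models only appear in the model selector when `MINIMAX_API_KEY` is set. A minimal sketch of that gating logic is below; the function and model-list names are illustrative assumptions, not the project's actual code (the MiniMax model IDs are taken from the commit message, the local model name is a placeholder).

```typescript
// Hypothetical sketch: expose MiniMax cloud models only when the API key is
// present, alongside whatever local Ollama models are available.

const LOCAL_MODELS = ["llama3.1:8b"]; // placeholder Ollama model name
const MINIMAX_MODELS = ["MiniMax-M2.7", "MiniMax-M2.7-highspeed"];

function availableModels(env: Record<string, string | undefined>): string[] {
  // Cloud models are listed only when the key is set and non-empty.
  return env.MINIMAX_API_KEY
    ? [...LOCAL_MODELS, ...MINIMAX_MODELS]
    : [...LOCAL_MODELS];
}
```

With the key commented out (as in this example file), only local models are returned; uncommenting and setting it adds the two MiniMax entries.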