## Open Terminal
Lightweight remote shell and file-management API for AI agents — works with Open WebUI out of the box.
```shell
harbor up openterminal
```

## harbor models
You can now manage your llama.cpp, Hugging Face, and Ollama models in a single concise CLI.
```shell
$ harbor models ls
SOURCE    MODEL                                          SIZE     DETAILS
ollama    qwen3.5:35b                                    23.9 GB  qwen35moe 36.0B Q4_K_M
hf        hexgrad/Kokoro-82M                             358 MB
hf        Systran/faster-distil-whisper-large-v3         1.5 GB
llamacpp  unsloth/Qwen3-Next-80B-A3B-Instruct-GGUF:Q4_0  45.3 GB  Q4_0
```
```shell
# Use programmatically with jq and other tools
harbor models ls --json
```
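For example, the `--json` output can be filtered with `jq`. The release notes don't show the schema, so the field names below (`source`, `model`) and the sample records are assumptions; check your own `harbor models ls --json` output first. A minimal sketch:

```shell
# Sample records mimicking a guessed --json shape; in practice you would
# pipe `harbor models ls --json` instead of echoing this string.
models='[{"source":"ollama","model":"qwen3:8b"},{"source":"hf","model":"hexgrad/Kokoro-82M"}]'

# Keep only Ollama-managed models; -r prints raw strings without quotes
echo "$models" | jq -r '.[] | select(.source=="ollama") | .model'
# prints: qwen3:8b
```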
```shell
# Pull Ollama models or HF repos
harbor models pull qwen3:8b
harbor models pull bartowski/Llama-3.2-1B-Instruct-GGUF
```
```shell
# Use the same ID shown by `ls` to remove a model
harbor models rm qwen3:8b
```

## Misc
- Add integration tests and mock OpenAI service for CI.
- Fix workspace paths in default.env to point to services directory.
- Fix JSON merger output path in Open WebUI start script.
- Fabric docs updates and CLI fixes.
**Full Changelog**: v0.4.3...v0.4.4