github av/harbor v0.1.0



Harbor is a toolkit that helps you run LLMs and related projects on your own hardware. At its core, Harbor is a CLI that manages a set of pre-configured Docker Compose configurations.
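Under the hood, this amounts to stacking Compose files. A minimal sketch of the idea (illustrative only — the `compose.*.yml` file names are assumed, not Harbor's actual layout):

```shell
# Illustrative sketch, not Harbor's real implementation: each service
# ships its own Compose file, and "harbor up searxng tts" conceptually
# stacks them onto the default file via repeated -f flags.
compose_cmd="docker compose -f compose.yml"
for svc in searxng tts; do
  compose_cmd="$compose_cmd -f compose.$svc.yml"
done
# Echoed instead of executed so the sketch stays self-contained:
echo "$compose_cmd up -d"
# → docker compose -f compose.yml -f compose.searxng.yml -f compose.tts.yml up -d
```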

Services

UIs

Open WebUI ⦁︎ LibreChat ⦁︎ Hollama ⦁︎ parllama ⦁︎ BionicGPT

Backends

Ollama ⦁︎ llama.cpp ⦁︎ vLLM ⦁︎ TabbyAPI ⦁︎ Aphrodite Engine ⦁︎ mistral.rs ⦁︎ openedai-speech ⦁︎ text-generation-inference ⦁︎ LMDeploy

Satellites

SearXNG ⦁︎ Dify ⦁︎ Plandex ⦁︎ LiteLLM ⦁︎ LangFuse ⦁︎ Open Interpreter ⦁︎ cloudflared ⦁︎ cmdh

CLI

# Start default services
harbor up

# Start more services that are configured to work together
harbor up searxng tts

# Run additional/alternative LLM Inference backends
# Open WebUI is automatically connected to them.
harbor up llamacpp tgi litellm vllm tabbyapi aphrodite mistralrs

# Run different Frontends
harbor up librechat bionicgpt hollama

# Shortcut to search the HF Hub for models
harbor hf find gguf gemma-2
# Use HFDownloader and official HF CLI to download models
harbor hf dl -m google/gemma-2-2b-it -c 10 -s ./hf
harbor hf download google/gemma-2-2b-it

# Where possible, cache is shared between the services
harbor tgi model google/gemma-2-2b-it
harbor vllm model google/gemma-2-2b-it
harbor aphrodite model google/gemma-2-2b-it
harbor tabbyapi model google/gemma-2-2b-it-exl2
harbor mistralrs model google/gemma-2-2b-it
harbor opint model google/gemma-2-2b-it

# Convenience tools for the Docker setup
harbor shell vllm
harbor exec webui curl $(harbor url -i ollama)

# Tell your shell exactly what you think about it
harbor opint # Open Interpreter CLI in current folder

# Access service CLIs without installing them
harbor hf scan-cache
harbor ollama list

# Open services from the CLI
harbor open webui
harbor open llamacpp
# Print yourself a QR to quickly open the
# service on your phone
harbor qr
# Feeling adventurous? Expose your harbor
# to the internet
harbor tunnel

# Config management
harbor config list
harbor config set webui.host.port 8080

# Eject from Harbor into a standalone Docker Compose setup
# Will export related services and variables into a standalone file.
harbor eject searxng llamacpp > docker-compose.harbor.yml

# Gimmick/Fun Area

# Argument scrambling: each command below is equivalent to the one
# in its comment. Harbor doesn't care if it's "vllm model" or
# "model vllm", it'll figure it out.
harbor vllm model            # harbor model vllm
harbor config get webui.name # harbor get config webui_name
harbor tabbyapi shell        # harbor shell tabbyapi

# 50% gimmick, 50% useful
# Ask Harbor about itself (note: no quotes needed)
harbor how to ping ollama container from the webui?

Documentation

  • Harbor CLI Reference

    Read more about Harbor CLI commands and options.
  • Harbor Services

    Read about supported services and the ways to configure them.
  • Harbor Compose Setup

    Read about the way Harbor uses Docker Compose to manage services.
  • Compatibility

    Known compatibility issues between the services and models as well as possible workarounds.

v0.1.0 changes

  • Argument scrambling support
  • OpenAI-compatible proxy for Dify, so it can integrate with the rest of the services
  • main is now considered (somewhat) stable and will no longer be updated with bleeding-edge changes
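The argument-scrambling feature can be pictured as matching the user's tokens against known multi-word commands as unordered sets. A minimal Python sketch of the idea (illustrative only — Harbor itself is a shell script, and the command table here is assumed):

```python
# Illustrative sketch of order-insensitive command matching, not
# Harbor's actual implementation. Known multi-word commands are
# matched against the user's tokens regardless of order.
KNOWN_COMMANDS = [
    ("vllm", "model"),
    ("config", "get"),
    ("tabbyapi", "shell"),
]

def unscramble(tokens):
    """Return the canonical command whose words all appear in `tokens`,
    followed by the remaining tokens in their original order."""
    for command in KNOWN_COMMANDS:
        if set(command) <= set(tokens):
            rest = [t for t in tokens if t not in command]
            return list(command) + rest
    return tokens

# "get config webui_name" resolves to the canonical "config get webui_name"
print(unscramble(["get", "config", "webui_name"]))
# → ['config', 'get', 'webui_name']
```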

Full Changelog: v0.0.21...v0.1.0
