OptiLLM
optillm is an optimizing LLM proxy, similar to Harbor Boost, with a number of advanced reasoning/planning workflows.
```bash
# Will build and start the service
# [--tail] is optional to automatically follow service logs after start
harbor up optillm --tail
```

optillm is connected to all inference backends in Harbor out of the box (but hasn't been tested with all of them). See the compatibility guide on making it work with Open WebUI.
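Since optillm serves an OpenAI-compatible API, requests to it look like ordinary chat-completion calls. A minimal sketch, assuming the proxy is reachable on localhost port 8000 and that the optimization approach is selected via a model-name prefix (e.g. `moa-` for mixture-of-agents); both the port and the prefix here are illustrative assumptions, not guaranteed Harbor defaults:

```python
import json

# Assumption: optillm exposes an OpenAI-compatible /v1/chat/completions
# endpoint; the port below is a placeholder, not a Harbor default.
url = "http://localhost:8000/v1/chat/completions"

payload = {
    # optillm-style approach selection via model-name prefix
    # ("moa-" = mixture of agents wrapping the underlying model);
    # the underlying model name is an example.
    "model": "moa-llama3.1:8b",
    "messages": [
        {"role": "user", "content": "Plan a 3-step approach to debug a flaky test."}
    ],
}

# Serialize the request body; this part runs without the service.
body = json.dumps(payload)
print(body)

# To actually send the request (requires the service to be running):
# import urllib.request
# req = urllib.request.Request(
#     url, data=body.encode(), headers={"Content-Type": "application/json"}
# )
# print(urllib.request.urlopen(req).read().decode())
```

Because the proxy speaks the same protocol as the backends it wraps, existing OpenAI-compatible clients can point at it by changing only the base URL.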
Misc
langfuse was updated to v3