bentoml/OpenLLM v0.2.25

What changed?

The long-awaited ClojureScript UI is now GA. Try it out with docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.2.25. Thanks to @GutZuFusss for the contribution!

Added vLLM support for Falcon, along with general CQA work.
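
To try Falcon on the vLLM backend, a minimal sketch is shown below. Note that the "openllm[vllm]" extra and the --backend vllm flag are assumptions about this release series and are not confirmed against the 0.2.25 CLI; vLLM may instead be picked up automatically once it is installed.

# Hedged sketch: install the vLLM extra, then start Falcon on it.
# The "[vllm]" extra name and the --backend flag are assumptions.
pip install "openllm[vllm]==0.2.25"
openllm start falcon --backend vllm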

Installation

pip install openllm==0.2.25

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.2.25
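
To confirm the upgrade took effect, a quick check is:

pip show openllm | grep Version   # should print "Version: 0.2.25"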

Usage

To list all available models: openllm models

To start an LLM: python -m openllm start opt
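
Once the server is up, you can send it a prompt over HTTP. The example below is a hedged sketch: the localhost:3000 address and the /v1/generate route are assumptions about the defaults in this release series, and the request schema may require additional fields (for example an llm_config object), so check the server's startup log and API docs for the actual values.

# Hedged example: query a locally started OpenLLM server.
# Port 3000 and the /v1/generate route are assumptions for this release.
curl -X POST http://localhost:3000/v1/generate \
  -H "Content-Type: application/json" \
  -d '{"prompt": "What does OpenLLM do?"}'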

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.2.25 start opt
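
Note that -P publishes the container's exposed ports on random host ports. A hedged alternative is to map the service port explicitly so the endpoint address is predictable; port 3000 as the in-container service port is an assumption for this release.

# Map the service port explicitly instead of relying on -P.
# Port 3000 is an assumption about the in-container service port.
docker run --gpus all -it -p 3000:3000 ghcr.io/bentoml/openllm:0.2.25 start opt

Otherwise, docker ps shows which host port -P picked.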

To run the OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.2.25

Find more information about this release in the CHANGELOG.md

What's Changed

  • chore: upload nightly wheels to test.pypi.org by @aarnphm in #215
  • feat(contrib): ClojureScript UI by @GutZuFusss in #89
  • fix(ci): remove broken build hooks by @aarnphm in #216
  • chore(ci): add dependabot and fix vllm release container by @aarnphm in #217
  • feat(models): add vLLM support for Falcon by @aarnphm in #223
  • chore(readme): update nightly badge [skip ci] by @aarnphm in #224

New Contributors

  • @GutZuFusss made their first contribution in #89

Full Changelog: v0.2.24...v0.2.25
