github bentoml/OpenLLM v0.2.27


Installation

pip install openllm==0.2.27

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.2.27

Usage

All available models: openllm models

To start an LLM: python -m openllm start opt
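Once the server is up (port 3000 by default), you can talk to it over plain HTTP. The sketch below builds a generation request with only the standard library; the `/v1/generate` endpoint and the `prompt`/`llm_config` payload shape are assumptions based on this release line — check your running server's OpenAPI docs for the exact schema.

```python
import json
from urllib import request


def build_generate_request(base_url: str, prompt: str, **config) -> request.Request:
    # Assumed endpoint and payload shape; verify against your server's /docs page.
    payload = json.dumps({"prompt": prompt, "llm_config": config}).encode("utf-8")
    return request.Request(
        url=f"{base_url.rstrip('/')}/v1/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    req = build_generate_request("http://localhost:3000", "Hello,", max_new_tokens=32)
    # Requires a running `openllm start` server on localhost:3000.
    with request.urlopen(req) as resp:
        print(json.load(resp))
```

Separating request construction from sending keeps the payload shape easy to inspect and test without a live server.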

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.2.27 start opt

To run OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.2.27

Find more information about this release in CHANGELOG.md.

What's Changed

  • feat: token streaming and SSE support by @aarnphm in #240
  • chore(deps): bump @mui/x-data-grid from 6.11.1 to 6.11.2 by @dependabot in #242
  • chore(deps): bump peter-evans/create-pull-request from 4.2.4 to 5.0.2 by @dependabot in #244
  • chore(deps): bump taiki-e/install-action from 2.15.4 to 2.16.0 by @dependabot in #245
  • chore(deps): bump @mui/x-date-pickers from 6.0.0 to 6.11.2 by @dependabot in #243
  • refactor: packages by @aarnphm in #249
  • ci: pre-commit autoupdate [pre-commit.ci] by @pre-commit-ci in #246
  • feat(embeddings): Using self-hosted CPU EC2 runner by @aarnphm in #250
  • refactor(contrib): similar namespace [clojure-ui build] by @aarnphm in #251
  • chore: ignore peft and fix adapter loading issue by @aarnphm in #255
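The headline change in this release is token streaming over Server-Sent Events (#240). As a generic illustration of the client side, the sketch below extracts the payload of each `data:` line from an SSE stream per the SSE framing rules; the `[DONE]` end-of-stream sentinel is a common convention in LLM streaming APIs, not part of the SSE spec, and is an assumption here.

```python
from typing import Iterable, Iterator


def iter_sse_data(lines: Iterable[str]) -> Iterator[str]:
    """Yield the payload of each `data:` line in a Server-Sent Events stream."""
    for raw in lines:
        line = raw.rstrip("\n")
        if line.startswith("data:"):
            data = line[len("data:"):]
            # Per SSE framing, a single leading space after the colon is stripped.
            if data.startswith(" "):
                data = data[1:]
            if data == "[DONE]":  # assumed end-of-stream sentinel
                return
            yield data


# Example: feed it the raw lines of a streamed response.
tokens = list(iter_sse_data(["data: Hello", "data:  world", "data: [DONE]"]))
```

In practice you would iterate over the lines of a streaming HTTP response and print each yielded token as it arrives.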

Full Changelog: v0.2.26...v0.2.27
