github bentoml/OpenLLM v0.4.26


Installation

pip install openllm==0.4.26

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.26

Usage

To list all available models: openllm models

To start an LLM: python -m openllm start HuggingFaceH4/zephyr-7b-beta

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.26 start HuggingFaceH4/zephyr-7b-beta
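Since this release adds OpenAI-style chat handling (#725), a started server can be queried like an OpenAI-compatible endpoint. A minimal sketch of the request body such a client would POST; the /v1/chat/completions path and default port 3000 are assumptions, not confirmed by these notes:

```python
import json

# Hedged sketch: build the JSON body an OpenAI-compatible client would POST
# to http://localhost:3000/v1/chat/completions on a server launched with
# `openllm start`. Path and port are assumptions for illustration.
payload = {
    "model": "HuggingFaceH4/zephyr-7b-beta",
    "messages": [{"role": "user", "content": "What is BentoML?"}],
    "max_tokens": 128,
}

# Serialized body, ready to send with any HTTP client
body = json.dumps(payload)
```

For example, the serialized body can be sent from the shell with curl -s http://localhost:3000/v1/chat/completions -H "Content-Type: application/json" -d "$body" (again assuming the default port).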

Find more information about this release in the CHANGELOG.md

What's Changed

  • fix(infra): setup higher timer for building container images by @aarnphm in #723
  • fix(client): correct schemas parser from correct response output by @aarnphm in #724
  • feat(openai): chat templates and complete control of prompt generation by @aarnphm in #725

Full Changelog: v0.4.25...v0.4.26
