github bentoml/OpenLLM v0.1.1


🎉 Hello world, OpenLLM

OpenLLM version 0.1.1 brings initial support for SOTA LLMs (more to come!!):

| Model | CPU | GPU | Installation | Model Ids |
| --- | --- | --- | --- | --- |
| flan-t5 | | | `pip install "openllm[flan-t5]"` | `google/flan-t5-small`<br>`google/flan-t5-base`<br>`google/flan-t5-large`<br>`google/flan-t5-xl`<br>`google/flan-t5-xxl` |
| dolly-v2 | | | `pip install openllm` | `databricks/dolly-v2-3b`<br>`databricks/dolly-v2-7b`<br>`databricks/dolly-v2-12b` |
| chatglm | | | `pip install "openllm[chatglm]"` | `thudm/chatglm-6b`<br>`thudm/chatglm-6b-int8`<br>`thudm/chatglm-6b-int4` |
| starcoder | | | `pip install "openllm[starcoder]"` | `bigcode/starcoder`<br>`bigcode/starcoderbase` |
| falcon | | | `pip install "openllm[falcon]"` | `tiiuae/falcon-7b`<br>`tiiuae/falcon-40b`<br>`tiiuae/falcon-7b-instruct`<br>`tiiuae/falcon-40b-instruct` |
| stablelm | | | `pip install openllm` | `stabilityai/stablelm-tuned-alpha-3b`<br>`stabilityai/stablelm-tuned-alpha-7b`<br>`stabilityai/stablelm-base-alpha-3b`<br>`stabilityai/stablelm-base-alpha-7b` |
Quickly start falcon locally:

    openllm start falcon

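Once the server is running (BentoML's default port is 3000), you can query it over HTTP. A minimal sketch, assuming the service exposes a `/v1/generate` endpoint accepting a JSON `prompt` field; the endpoint path and payload shape here are assumptions, so check the running server's OpenAPI page for the exact schema:

```python
import json
from urllib.request import Request, urlopen

# Assumed endpoint and payload shape; verify against the running
# server's OpenAPI docs before relying on them.
payload = {"prompt": "Write a haiku about open-source LLMs."}
req = Request(
    "http://localhost:3000/v1/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once `openllm start falcon` is up:
# with urlopen(req) as resp:
#     print(json.load(resp))
```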
Easily bundle this LLM into a Bento, a portable format that can be deployed anywhere:

    openllm build falcon

Refer to the README.md for more details.

Installation

    pip install openllm==0.1.1

To upgrade from a previous version, use the following command:

    pip install --upgrade openllm==0.1.1

Usage

All available models: `python -m openllm.models`

To start an LLM: `python -m openllm start dolly-v2`

Full Changelog: v0.1.0...v0.1.1
