bentoml/OpenLLM v0.1.7


Features

OpenLLM now integrates seamlessly with HuggingFace Agents. Simply point the HfAgent endpoint at a running OpenLLM server.

import transformers

agent = transformers.HfAgent("http://localhost:3000/hf/agent")  # URL that runs the OpenLLM server

agent.run("Is the following `text` positive or negative?", text="I don't like how this models is generate inputs")

Note
Only starcoder is currently supported for the agent feature.
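
To serve the agent endpoint, first start a starcoder server so that the URL in the example above (http://localhost:3000/hf/agent) has something to talk to. A minimal sketch, following the same start command shown under Usage below:

python -m openllm start starcoder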

To use it from openllm.client, do:

import openllm

client = openllm.client.HTTPClient("http://123.23.21.1:3000/")  # address of a running OpenLLM server

client.ask_agent(
    task="Is the following `text` positive or negative?",
    text="What are you thinking about?",
)

Installation

pip install openllm==0.1.7

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.7

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2
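
Once a server is running, it can be queried from Python over HTTP. A minimal sketch, assuming the server started above is listening on port 3000 as in the agent example, and using the client's query method for plain text generation:

import openllm

# Connect to the server started with `python -m openllm start dolly-v2`
client = openllm.client.HTTPClient("http://localhost:3000")

# Send a prompt; the response holds the generated output
response = client.query("What is the weather like today?")
print(response)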

Find more information about this release in CHANGELOG.md.

What's Changed

Full Changelog: v0.1.6...v0.1.7
