github simonw/llm 0.26a0

Pre-release · 7 months ago

This is the first alpha to introduce support for tools! Models with tool capability (which includes the default OpenAI model family) can now be granted access to execute Python functions as part of responding to a prompt.

Tools are supported by the command-line interface:

llm --functions '
def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y
' 'what is 34234 * 213345'

And in the Python API, using a new model.chain() method for executing multiple prompts in a sequence:

import llm

def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y

model = llm.get_model("gpt-4.1-mini")
response = model.chain(
    "What is 34234 * 213345?",
    tools=[multiply]
)
print(response.text())
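
Conceptually, a chain() style call loops between the model and the local tool functions until the model produces a final answer. Here is a simplified, self-contained sketch of that loop — the FakeModel and chain() below are hypothetical stand-ins for illustration, not llm's actual internals:

```python
# Simplified sketch of a tool-calling loop. FakeModel is a stand-in
# that pretends to be an LLM: it first requests a tool call, then
# answers once the tool result has been fed back to it.
from dataclasses import dataclass

@dataclass
class ToolCall:
    name: str
    arguments: dict

class FakeModel:
    def __init__(self):
        self.turn = 0

    def prompt(self, messages):
        self.turn += 1
        if self.turn == 1:
            # First turn: the model decides it needs the multiply tool
            return ToolCall("multiply", {"x": 34234, "y": 213345})
        # Second turn: the tool result is the last message in the list
        return f"The answer is {messages[-1]}"

def chain(model, prompt, tools):
    """Run the prompt, executing tool calls until a final text reply."""
    registry = {fn.__name__: fn for fn in tools}
    messages = [prompt]
    while True:
        reply = model.prompt(messages)
        if isinstance(reply, ToolCall):
            # Execute the requested function and feed the result back
            result = registry[reply.name](**reply.arguments)
            messages.append(result)
        else:
            return reply

def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y

print(chain(FakeModel(), "What is 34234 * 213345?", [multiply]))
```

The real implementation also has to handle argument serialization, multiple tool calls per turn, and error reporting, but the shape of the loop is the same: prompt, execute any requested tools, append the results, and repeat until the model replies with text.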

Plugins can also define new tools using the register_tools() plugin hook. Those tools can then be called by name from the command-line like this:

llm -T multiply 'What is 34234 * 213345?'

Tool support is currently under active development. Consult this milestone for the latest status.
