See also LLM 0.13: The annotated release notes.
- Added support for new OpenAI embedding models: `3-small` and `3-large`, plus three variants of those with different dimension sizes: `3-small-512`, `3-large-256` and `3-large-1024`. See OpenAI embedding models for details. #394
- The default `gpt-4-turbo` model alias now points to `gpt-4-turbo-preview`, which uses the most recent OpenAI GPT-4 turbo model (currently `gpt-4-0125-preview`). #396
- New OpenAI model aliases `gpt-4-1106-preview` and `gpt-4-0125-preview`.
- OpenAI models now support a `-o json_object 1` option, which causes their output to be returned as a valid JSON object. #373
- New plugins since the last release include llm-mistral, llm-gemini, llm-ollama and llm-bedrock-meta.
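Because JSON object mode guarantees a parseable reply, the output can be fed straight into JSON-aware tooling. A minimal sketch of that pipeline: the `llm` invocation is shown commented out since it needs an OpenAI API key, and the canned JSON standing in for a model response is invented for illustration.

```shell
# With an OpenAI key configured, a real invocation would look like:
#   llm -m gpt-4-turbo -o json_object 1 'Three primary colors as a JSON object'
# The option makes the output a valid JSON object, so it pipes cleanly into
# a JSON consumer. Simulated here with a canned response:
echo '{"colors": ["red", "yellow", "blue"]}' \
  | python3 -c 'import json, sys; print(len(json.load(sys.stdin)["colors"]))'
```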
- The `keys.json` file for storing API keys is now created with `600` file permissions. #351
- Documented a pattern for installing plugins that depend on PyTorch using the Homebrew version of LLM, despite Homebrew using Python 3.12, for which PyTorch has not yet released a stable package. #397
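Mode `600` means the file holding API keys is readable and writable only by its owner, with no access for group or other users. A quick sketch of what that looks like on disk, using a throwaway file rather than the real `keys.json` (whose location varies by platform):

```shell
# Create a throwaway file and give it the same permissions LLM now uses:
touch demo-keys.json
chmod 600 demo-keys.json
# Owner read/write only; group and others get nothing.
# (GNU stat shown; on macOS use `stat -f %Lp` instead.)
stat -c '%a' demo-keys.json    # prints 600
rm demo-keys.json
```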
- The underlying OpenAI Python library has been upgraded to `>1.0`. It is possible this could cause compatibility issues with LLM plugins that also depend on that library. #325
- Arrow keys now work inside the `llm chat` command. #376
- The `LLM_OPENAI_SHOW_RESPONSES=1` environment variable now outputs much more detailed information about the HTTP request and response made to OpenAI (and OpenAI-compatible) APIs. #404
- Dropped support for Python 3.7.
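A debugging variable like this is handy to enable for just one invocation using the shell's env-prefix form. The `llm` call below is illustrative and needs an API key, so `printenv` stands in to show the mechanics:

```shell
# Real usage, with an OpenAI key configured:
#   LLM_OPENAI_SHOW_RESPONSES=1 llm -m gpt-4-turbo-preview 'Say hi'
# The prefix sets the variable only for that single command:
LLM_OPENAI_SHOW_RESPONSES=1 printenv LLM_OPENAI_SHOW_RESPONSES   # prints 1
```

Because the assignment is scoped to that one command, the variable stays unset in the surrounding shell, so ordinary `llm` runs remain quiet.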