- Models hosted on Replicate can now be accessed using the `llm-replicate` plugin, including the new Llama 2 model from Meta AI. More details here: Accessing Llama 2 from the command-line with the llm-replicate plugin.
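  As a sketch of the workflow, after installing the plugin a Replicate-hosted Llama 2 chat model can be registered under an alias and then prompted directly (the model path and alias below are illustrative; check the plugin documentation for current model IDs):

  ```
  $ llm install llm-replicate
  $ llm keys set replicate
  $ llm replicate add a16z-infra/llama13b-v2-chat --chat --alias llama2
  $ llm -m llama2 "Three facts about pelicans"
  ```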
- Model providers that expose an API compatible with the OpenAI API format, including self-hosted model servers such as LocalAI, can now be accessed using additional configuration for the default OpenAI plugin. #106
- OpenAI models that are not yet supported by LLM can also be configured using the new `extra-openai-models.yaml` configuration file. #107
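  As a minimal sketch of what such a configuration might look like, the following writes an example `extra-openai-models.yaml` entry pointing at a self-hosted OpenAI-compatible server; the `model_id`, `model_name`, and `api_base` values are placeholders, and the file belongs in LLM's configuration directory:

  ```shell
  # Illustrative extra-openai-models.yaml entry for an OpenAI-compatible
  # server such as LocalAI. All values below are placeholders; move the
  # file into LLM's configuration directory once edited.
  cat > extra-openai-models.yaml <<'EOF'
  - model_id: localai-llama
    model_name: llama-2-7b-chat
    api_base: "http://localhost:8080"
  EOF
  ```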
- The `llm logs` command now accepts a `-m model_id` option to filter logs to a specific model. Aliases can be used here in addition to model IDs. #108
- Logs now have a SQLite full-text search index against their prompts and responses, and the `llm logs -q SEARCH` option can be used to return logs that match a search term. #109
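  Together these make past conversations easier to dig through; for example (the alias and search term here are illustrative):

  ```
  $ llm logs -m llama2
  $ llm logs -q 'pelican'
  ```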