letta-ai/letta v0.5.0


This release introduces major changes to how model providers are configured with Letta, as well as many bugfixes.

🧰 Dynamic model listing and multiple providers (#1814)

Model providers (e.g. OpenAI, Ollama, vLLM) are now enabled via environment variables, and multiple providers can be enabled at the same time. When a provider is enabled, all of its supported LLM and embedding models are listed as selectable options in a dropdown in both the CLI and the ADE.

For example, with OpenAI you can get started with just an API key:

> export OPENAI_API_KEY=...
> letta run 
   ? Select LLM model: (Use arrow keys)
   » letta-free [type=openai] [ip=https://inference.memgpt.ai]
      gpt-4o-mini-2024-07-18 [type=openai] [ip=https://api.openai.com/v1]
      gpt-4o-mini [type=openai] [ip=https://api.openai.com/v1]
      gpt-4o-2024-08-06 [type=openai] [ip=https://api.openai.com/v1]
      gpt-4o-2024-05-13 [type=openai] [ip=https://api.openai.com/v1]
      gpt-4o [type=openai] [ip=https://api.openai.com/v1]
      gpt-4-turbo-preview [type=openai] [ip=https://api.openai.com/v1]
      gpt-4-turbo-2024-04-09 [type=openai] [ip=https://api.openai.com/v1]
      gpt-4-turbo [type=openai] [ip=https://api.openai.com/v1]
      gpt-4-1106-preview [type=openai] [ip=https://api.openai.com/v1]
      gpt-4-0613 [type=openai] [ip=https://api.openai.com/v1]
     ... 

Similarly, if you are using the ADE with a running letta server, you can select the model to use from the model dropdown. Enable the providers you want before starting the server:

# include models from OpenAI 
> export OPENAI_API_KEY=...

# include models from Anthropic 
> export ANTHROPIC_API_KEY=... 

# include models served by Ollama 
> export OLLAMA_BASE_URL=...

> letta server
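
Once the server is running, the ADE's model dropdown is populated from the server's model listing, which you can also query over the REST API. A rough sketch (assuming the default server port 8283 and the /v1/models/ listing routes; adjust for your deployment):

# list the LLM models available across all enabled providers
> curl http://localhost:8283/v1/models/

# list the available embedding models
> curl http://localhost:8283/v1/models/embedding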

We are deprecating the letta configure and letta quickstart commands, as well as the use of ~/.letta/config for specifying the default LLMConfig and EmbeddingConfig. A single static config prevents one letta server from running agents with different model configurations concurrently, and from changing an agent's model configuration without restarting the server. It also required users to manually specify the model name, provider, and context window size via letta configure.
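
For reference, the deprecated workflow pinned one global model in ~/.letta/config, roughly like the illustrative sketch below (section and field names are approximate and may differ between installs), which is why a single server could not host agents on different models at once:

# the legacy, now-deprecated global config (illustrative contents)
> cat ~/.letta/config
   [model]
   model = gpt-4
   model_endpoint = https://api.openai.com/v1
   model_endpoint_type = openai
   context_window = 8192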

🧠 Integration testing for model providers

We added integration tests (including tests of MemGPT memory-management tool use) for the supported model providers, and fixed many bugs in the process.

📊 Database migrations

We now support automated database migrations via alembic, implemented in #1867. You can expect future releases to migrate your database automatically even when there are schema changes.
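
If you ever need to apply migrations by hand, alembic's standard CLI works from a checkout of the repository (a sketch, assuming you run it from the directory containing the repo's alembic.ini):

# apply any pending schema migrations to the configured database
> alembic upgrade head

# show which schema revision the database is currently on
> alembic current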

What's Changed

New Contributors

Full Changelog: 0.4.1...0.5.0
