BerriAI/litellm v1.15.0


What's Changed

LiteLLM Proxy now maps exceptions for 100+ LLMs to the OpenAI format (client-side sketch after this list) https://docs.litellm.ai/docs/proxy/quick_start
🧨 Log all LLM Input/Output to @dynamodb: set litellm.success_callback = ["dynamodb"] (sketch below) https://docs.litellm.ai/docs/proxy/logging#logging-proxy-inputoutput---dynamodb
⭐️ Support for the @MistralAI API and Gemini Pro (sketch below)
🔎 Set aliases for model groups on the LiteLLM Proxy (sketch below)
🔎 Exception mapping for openai.NotFoundError is live now, and tests for proxy exception mapping were added to LiteLLM CI/CD (sketch below) https://docs.litellm.ai/docs/exception_mapping
⚙️ Fixes for async + streaming caching (sketch below) https://docs.litellm.ai/docs/proxy/caching
👉 Async logging with @langfuse is now live on the proxy (sketch below)
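A minimal client-side sketch of what the proxy's exception mapping means in practice: the stock OpenAI Python SDK pointed at a locally running proxy. The base URL, port, API key, and model name below are placeholder assumptions, not values from these notes. Errors from whichever provider backs the model surface as standard OpenAI exception types:

```python
import openai

# Placeholder values: adjust to your proxy's address and configured model
client = openai.OpenAI(
    api_key="anything",              # placeholder; depends on your proxy auth config
    base_url="http://0.0.0.0:8000",  # assumed local LiteLLM Proxy address
)

try:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "hi"}],
    )
    print(response.choices[0].message.content)
except openai.NotFoundError as e:
    print(f"Model/deployment not found (OpenAI format): {e}")
except openai.APIError as e:
    # Any other provider error, mapped into the OpenAI exception hierarchy
    print(f"OpenAI-format error: {e}")
```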
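The DynamoDB logging hook is the one-liner quoted in the item above. A minimal sketch, assuming AWS credentials and the target table are already configured in your environment (see the linked logging docs for the table settings):

```python
import litellm

litellm.success_callback = ["dynamodb"]  # the setting named in the release notes

# Every successful call is now logged to DynamoDB by the callback
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)
```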
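For the new providers, a sketch using LiteLLM's usual provider-prefix routing; the exact model strings and environment variable names below are assumptions, so check the provider docs:

```python
import litellm

# Assumes MISTRAL_API_KEY is set in the environment
resp = litellm.completion(
    model="mistral/mistral-tiny",
    messages=[{"role": "user", "content": "hello"}],
)

# Assumes GEMINI_API_KEY is set in the environment
resp = litellm.completion(
    model="gemini/gemini-pro",
    messages=[{"role": "user", "content": "hello"}],
)
```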
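For model group aliases, a sketch at the Router level (the proxy is built on the Router). The model_group_alias parameter and all names used here are assumptions for illustration, not confirmed by these notes:

```python
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "openai-gpt-3.5",  # the real model group (hypothetical name)
            "litellm_params": {"model": "gpt-3.5-turbo"},
        }
    ],
    # Assumed parameter: requests for "gpt-3.5-turbo" route to "openai-gpt-3.5"
    model_group_alias={"gpt-3.5-turbo": "openai-gpt-3.5"},
)

resp = router.completion(
    model="gpt-3.5-turbo",  # resolved via the alias
    messages=[{"role": "user", "content": "hi"}],
)
```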
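Because LiteLLM's mapped exceptions subclass the OpenAI SDK's types, openai.NotFoundError can be caught directly. A sketch with a deliberately invalid model name (the model string is just an example):

```python
import openai
import litellm

try:
    litellm.completion(
        model="gpt-3.5-turbo-does-not-exist",  # deliberately nonexistent model
        messages=[{"role": "user", "content": "hi"}],
    )
except openai.NotFoundError as e:
    # One except clause, regardless of the underlying provider
    print(f"Mapped NotFoundError: {e}")
```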
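For the async + streaming caching fixes, a sketch using the in-memory cache (the linked caching docs also cover Redis); the Cache import and per-call caching flag follow LiteLLM's documented pattern:

```python
import asyncio
import litellm
from litellm.caching import Cache

litellm.cache = Cache()  # in-memory cache for the sketch

async def main():
    # The first call hits the provider; the identical second call should be
    # served from cache, streaming included
    for _ in range(2):
        stream = await litellm.acompletion(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "hello"}],
            stream=True,
            caching=True,
        )
        async for chunk in stream:
            print(chunk.choices[0].delta.content or "", end="")
        print()

asyncio.run(main())
```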
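Langfuse logging uses the same callback mechanism as DynamoDB above. A sketch; the environment variable names follow Langfuse's convention and the key values are placeholders:

```python
import os
import litellm

os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-..."  # placeholder
os.environ["LANGFUSE_SECRET_KEY"] = "sk-..."  # placeholder

litellm.success_callback = ["langfuse"]  # logging callback from the release notes

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)
```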

Full Changelog: v1.11.1...v1.15.0
