What's Changed
- LiteLLM Proxy now maps exceptions for 100+ LLMs to the OpenAI format: https://docs.litellm.ai/docs/proxy/quick_start
- Log all LLM input/output to @dynamodb: set `litellm.success_callback = ["dynamodb"]` https://docs.litellm.ai/docs/proxy/logging#logging-proxy-inputoutput---dynamodb
- Support for the @MistralAI API and Gemini Pro
- Set aliases for model groups on the LiteLLM Proxy
- Exception mapping for `openai.NotFoundError` is live, and tests for exception mapping on the proxy were added to LiteLLM CI/CD: https://docs.litellm.ai/docs/exception_mapping
- Fixes for async + streaming caching: https://docs.litellm.ai/docs/proxy/caching
- Async logging with @langfuse is live on the proxy
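The exception-mapping items above can be illustrated with a small, self-contained sketch. This is a hypothetical illustration of the idea (normalizing provider-specific errors onto OpenAI-style exception types), not LiteLLM's actual internal code; the names `map_provider_error` and `_STATUS_TO_ERROR` are invented for this example:

```python
# Hypothetical sketch: provider errors are normalized to OpenAI-style
# exception types, so callers can catch one consistent set of exceptions
# regardless of which of the 100+ underlying LLM providers failed.

class NotFoundError(Exception):
    """OpenAI-format error: the model or deployment does not exist."""

class RateLimitError(Exception):
    """OpenAI-format error: the provider returned a rate-limit response."""

class APIError(Exception):
    """OpenAI-format catch-all for other provider failures."""

# Illustrative status-code -> OpenAI-style exception table
_STATUS_TO_ERROR = {
    404: NotFoundError,
    429: RateLimitError,
}

def map_provider_error(provider: str, status_code: int, message: str) -> Exception:
    """Return an OpenAI-style exception for a raw provider error."""
    error_cls = _STATUS_TO_ERROR.get(status_code, APIError)
    return error_cls(f"{provider}: {message}")
```

With LiteLLM itself, the equivalent is simply catching `openai.NotFoundError` (or the other OpenAI exception types) around completion calls, per the exception-mapping docs linked above.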
AI Generated Release Notes
- Enable setting default `model` value for `LiteLLM`, `Chat`, `Completions` by @estill01 in #985
- fix replicate system prompt: forgot to add `**optional_params` to input data by @nbaldwin98 in #1080
- Update factory.py to fix issue when calling from write-the -> langchain -> litellm served ollama by @James4Ever0 in #1054
- Update Dockerfile to preinstall Prisma CLI by @Manouchehri in #1039
- build(deps): bump aiohttp from 3.8.6 to 3.9.0 by @dependabot in #937
- multistage docker build by @wallies in #995
- fix: traceloop links by @nirga in #1123
- refactor: add CustomStreamWrapper return type for completion by @Undertone0809 in #1112
- fix langfuse tests by @maxdeichmann in #1097
- Fix #1119, no content when streaming. by @emsi in #1122
- docs(projects): add Docq to 'projects built on..' section by @janaka in #1142
- docs(projects): add Docq.AI to sidebar nav by @janaka in #1143
New Contributors
- @James4Ever0 made their first contribution in #1054
- @wallies made their first contribution in #995
- @maxdeichmann made their first contribution in #1097
- @emsi made their first contribution in #1122
- @janaka made their first contribution in #1142
Full Changelog: v1.11.1...v1.15.0