What's Changed
- Fix redis cluster mode for routers by @ogunoz in #9010
- [Feat] - Display `thinking` tokens on OpenWebUI (Bedrock, Anthropic, Deepseek) by @ishaan-jaff in #9029
- (AWS Secret Manager) - Using K/V pairs in 1 AWS Secret by @ishaan-jaff in #9039
- (Docs) connect litellm to open web ui by @ishaan-jaff in #9040
- Added PDL project by @vazirim in #8925
- (UI) - Allow adding EU OpenAI models by @ishaan-jaff in #9042
- fix(team_endpoints.py): ensure 404 raised when team not found + fix setting tags on keys by @krrishdholakia in #9038
- build(model_prices_and_context_window.json): update azure o1 mini pri… by @krrishdholakia in #9046
- Support master key rotations by @krrishdholakia in #9041
- (Feat) - add pricing for eu.amazon.nova models by @ishaan-jaff in #9056
- docs: Add project page for pgai by @Askir in #8576
- Mark several Claude models as being able to accept PDF inputs by @minhduc0711 in #9054
- (UI) - Keys Page - Show 100 Keys Per Page, Use full height, increase width of key alias by @ishaan-jaff in #9064
- (UI) Logs Page - Keep expanded log in focus on LiteLLM UI by @ishaan-jaff in #9061
- (Docs) OpenWeb x LiteLLM Docker compose + Instructions on spend tracking + logging by @ishaan-jaff in #9059
- (UI) - Allow adding Cerebras, Sambanova, Perplexity, Fireworks, Openrouter, TogetherAI Models on Admin UI by @ishaan-jaff in #9069
- UI - new API Playground for testing LiteLLM translation by @krrishdholakia in #9073
- Bug fix - String `data:` stripped from entire content in streamed Gemini responses by @ishaan-jaff in #9070
- (UI) - Minor improvements to logs page by @ishaan-jaff in #9076
- Bug fix: support bytes.IO when handling audio files for transcription by @tvishwanadha in #9071
- Fix batches api cost tracking + Log batch models in spend logs / standard logging payload by @krrishdholakia in #9077
- (UI) - Fix, Allow Filter Keys by Team Alias, Key Alias and Org by @ishaan-jaff in #9083
- (Clean up) - Allow switching off storing Error Logs in DB by @ishaan-jaff in #9084
- (UI) - Fix show correct count of internal user keys on Users Page by @ishaan-jaff in #9082
- New stable release notes by @krrishdholakia in #9085
- Litellm dev 03 08 2025 p3 by @krrishdholakia in #9089
- feat: prioritize api_key over tenant_id for more Azure AD token provi… by @krrishdholakia in #8701
- Fix incorrect streaming response by @5aaee9 in #9081
- Support openrouter `reasoning_content` on streaming by @krrishdholakia in #9094
- add support for Amazon Nova Canvas model by @omrishiv in #7838
- pricing for jamba new models by @themrzmaster in #9032
- build(deps): bump jinja2 from 3.1.4 to 3.1.6 by @dependabot in #9014
- (docs) add section for contributing to litellm by @ishaan-jaff in #9107
- build: Add Makefile for LiteLLM project with test targets by @colesmcintosh in #8948
- (Docs) - Contributing to litellm by @ishaan-jaff in #9110
- Added tags, user_feedback and model_options to additional_keys which can be sent to athina by @vivek-athina in #8845
- fix missing comma by @niinpatel in #8746
- Update model_prices_and_context_window.json by @mounta11n in #8757
- Fix triton streaming completions bug by @minwhoo in #8386
- (docs) Update vertex.md old code example by @santibreo in #7736
- (Feat) - Allow adding Text-Completion OpenAI models through UI by @ishaan-jaff in #9102
- docs(pr-template): update unit test command in checklist by @colesmcintosh in #9119
- [UI SSO Bug fix] - Correctly use `PROXY_LOGOUT_URL` when set by @ishaan-jaff in #9117
- Validate `model_prices_and_context_window.json` with a test, clarify possible `mode` values + ensure consistent use of `mode` by @utkashd in #8956
- JWT Auth Fix - [Bug]: JWT access with Groups not working when team is assigned All Proxy Models access by @ishaan-jaff in #8934
- fix(base_invoke_transformation.py): support extra_headers on bedrock … by @krrishdholakia in #9113
- feat(handle_jwt.py): support multiple jwt url's by @krrishdholakia in #9047
- Return `code`, `param` and `type` on openai bad request error by @krrishdholakia in #9109
- feature: Handle ManagedIdentityCredential in Azure AD token provider by @you-n-g in #9135
- Adding/Update of models by @emerzon in #9120
- Update bedrock.md for variable consistency by @superpoussin22 in #8185
- ci: add helm unittest by @mknet3 in #9068
- [UI Fixes RBAC] - for Internal User Viewer Permissions by @ishaan-jaff in #9148
- Delegate router azure client init logic to azure provider by @krrishdholakia in #9140
- feat: add bedrock deepseek r1 model pricing by @kearnsw in #9108
- fix(internal_user_endpoints.py): allow internal user to query their o… by @krrishdholakia in #9162
- add support for Amazon Nova Canvas model (#7838) by @krrishdholakia in #9101
- Fix bedrock chunk parsing + azure whisper cost tracking by @krrishdholakia in #9166
- Bing Search Pass Thru by @sfarthin in #8019
- [Feat] Add OpenAI Responses API to litellm python SDK by @ishaan-jaff in #9155 (usage sketch after this list)
- Support credential management on Proxy - via CRUD endpoints - `credentials/*` by @krrishdholakia in #9124 (sketch after this list)
- Bump @babel/runtime-corejs3 from 7.26.0 to 7.26.10 in /docs/my-website by @dependabot in #9167
- Bump @babel/helpers from 7.26.0 to 7.26.10 in /docs/my-website by @dependabot in #9168
- fix(azure): Patch for Function Calling Bug & Update Default API Version to `2025-02-01-preview` by @colesmcintosh in #9191
- [Feat] - Add Responses API on LiteLLM Proxy by @ishaan-jaff in #9183
- gemini price updates: gemma 3, flash 2 thinking update, learnlm by @yigitkonur in #9190
- Mark Cohere Embedding 3 models as Multimodal by @emerzon in #9176
- Fix Metadata not updating in Team UI by @lucasra1 in #9180
- feat: initial commit adding support for credentials on proxy ui by @krrishdholakia in #9186
- Fix azure ai services url + add azure data zone pricing by @krrishdholakia in #9185
- (gemini)Handle HTTP 201 status code in Vertex AI response by @youngchannelforyou in #9193
- feat/postgres-volumes by @xucailiang in #8741
- [FEAT] Support for Snowflake REST API LLMs #7979 by @SunnyWan59 in #8950
- fix(azure.py): track azure llm api latency metric by @krrishdholakia in #9217
- Support bedrock converse cache token tracking by @krrishdholakia in #9221
- Emit audit logs on All user + model Create/Update/Delete endpoints by @krrishdholakia in #9223
- (UI Usage) - Allow clicking into Top Keys when showing users Top API Key by @ishaan-jaff in #9225
- [Feat] Add Snowflake Cortex to LiteLLM by @ishaan-jaff in #9222
- [Fixes] Responses API - allow /responses and subpaths as LLM API route + Add exception mapping for responses API by @ishaan-jaff in #9220
- docs: Add centralized credential management docs by @bexelbie in #9254
- Docs: Update configs.md by @bexelbie in #9263
- Support reusing existing model credentials by @krrishdholakia in #9267
- LiteLLM UI Fixes by @krrishdholakia in #9269
- Fix "system" role has become unacceptable in ollama by @briandevvn in #9261
- Litellm rc 03 14 2025 patch 1 by @krrishdholakia in #9271
- [Feat] UI - Add Test Connection by @ishaan-jaff in #9272
- [UI] Fix 1 - instantly show newly create keys on Admin UI (don't require refresh) by @ishaan-jaff in #9257
- (UI) Fix model edit + delete - instantly show edit + deletes to models by @ishaan-jaff in #9258
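A minimal sketch of the new Responses API support in the litellm python SDK (#9155). The `litellm.responses()` call and the `model`/`input` parameter names below are assumed to mirror OpenAI's Responses API shape; check the LiteLLM docs for the exact signature.

```python
# Hypothetical usage of the Responses API via the LiteLLM SDK (#9155).
# Parameter names are assumed to mirror OpenAI's Responses API.
import os

import litellm

os.environ.setdefault("OPENAI_API_KEY", "sk-...")  # placeholder key

response = litellm.responses(
    model="openai/gpt-4o",  # any model name LiteLLM can route
    input="Summarize the v1.63.11-stable release in one sentence.",
)
print(response)
```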
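A hedged sketch of exercising the new credential CRUD endpoints on the proxy (#9124). The `/credentials` path comes from the entry above; the payload fields (`credential_name`, `credential_values`), base URL, and key are assumptions for illustration only.

```python
# Hypothetical calls to the proxy credential endpoints added in #9124.
# Payload shape, base URL, and key below are assumptions, not the documented schema.
import requests

PROXY_BASE = "http://localhost:4000"            # assumed local proxy address
HEADERS = {"Authorization": "Bearer sk-1234"}   # proxy admin / master key

# store a reusable credential (assumed payload shape)
requests.post(
    f"{PROXY_BASE}/credentials",
    headers=HEADERS,
    json={
        "credential_name": "my-azure-credential",
        "credential_values": {
            "api_key": "azure-api-key",
            "api_base": "https://my-resource.openai.azure.com",
        },
    },
    timeout=30,
)

# list stored credentials
print(requests.get(f"{PROXY_BASE}/credentials", headers=HEADERS, timeout=30).json())
```

Stored credentials can then be reused when adding models (#9267) instead of re-entering raw keys.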
New Contributors
- @ogunoz made their first contribution in #9010
- @Askir made their first contribution in #8576
- @tvishwanadha made their first contribution in #9071
- @5aaee9 made their first contribution in #9081
- @mounta11n made their first contribution in #8757
- @minwhoo made their first contribution in #8386
- @santibreo made their first contribution in #7736
- @utkashd made their first contribution in #8956
- @kearnsw made their first contribution in #9108
- @sfarthin made their first contribution in #8019
- @lucasra1 made their first contribution in #9180
- @youngchannelforyou made their first contribution in #9193
- @xucailiang made their first contribution in #8741
- @SunnyWan59 made their first contribution in #8950
- @bexelbie made their first contribution in #9254
- @briandevvn made their first contribution in #9261
Full Changelog: v1.63.2-stable...v1.63.11-stable