BerriAI/litellm v1.71.1-nightly

What's Changed

  • Logfire - fix(opentelemetry.py): fix OTel proxy server initialization + return abbreviated key in "key not found" errors (easier client-side debugging) + ignore invalid deployments on router load by @krrishdholakia in #11091
  • feat(handle_jwt.py): map user to team when added via jwt auth by @krrishdholakia in #11108
  • fix(ui_sso.py): maintain backwards compatibility for older user ID formats + fix trailing-whitespace check on existing user emails + ensure default_internal_user_settings runs on all new-user calls by @krrishdholakia in #11106
  • fix(route_llm_request.py): map team model from list in route llm request by @krrishdholakia in #11111
  • Remove + Check for unsafe enterprise/ folder imports by @krrishdholakia in #11107
  • Fix: Add Claude Sonnet 4 and Opus 4 support for the reasoning_effort parameter by @keykbd in #11114 (see the first request sketch after this list)
  • fix(session): correctly place litellm_session_id at root level instead of metadata by @dalssoft in #11088
  • fix(model_management_endpoints): clear cache and reload models after update by @jtong99 in #10853
  • [Feat] Add /image/edits on LiteLLM by @ishaan-jaff in #11123 (see the second request sketch after this list)
  • Correctly delete team model alias when a team-only model is deleted by @krrishdholakia in #11121
  • fix: detect and return status codes in streaming responses by @aholmberg in #10962
  • Fix passing standard optional params by @krrishdholakia in #11124
  • UI QA fix: team viewers should not see the Create Team option by @ishaan-jaff in #11127
  • [Chore] Feature-flag aiohttp transport - users must opt in to the aiohttp transport by @ishaan-jaff in #11132 (see the Docker example below)
  • v1.71.1-stable - notes by @ishaan-jaff in #11133
  • Revert Redis changes by @krrishdholakia in #11135
  • Fix multi-instance checks on teams by @krrishdholakia in #11137
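
For the reasoning_effort change above (#11114), here is a minimal request sketch against a locally running proxy. The model alias claude-sonnet-4 and the sk-1234 key are placeholders, not values from this release; substitute whatever is configured in your deployment:

# Assumes a proxy (see the Docker command below) listening on port 4000;
# the model alias and key are deployment-specific placeholders
curl http://localhost:4000/v1/chat/completions \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{
        "model": "claude-sonnet-4",
        "messages": [{"role": "user", "content": "Summarize this design doc in three bullets."}],
        "reasoning_effort": "low"
      }'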
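Similarly, for the new /image/edits route (#11123), a sketch of an OpenAI-style multipart request; the exact path, the model name (gpt-image-1), and the input file are assumptions, so check your proxy's model list and the PR before copying:

# Image-edit request against the proxy; cat.png is any local image you supply
curl http://localhost:4000/v1/images/edits \
  -H 'Authorization: Bearer sk-1234' \
  -F model="gpt-image-1" \
  -F image="@cat.png" \
  -F prompt="Add a red wizard hat to the subject"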

New Contributors

Full Changelog: v1.71.0-nightly...v1.71.1-nightly

Docker Run LiteLLM Proxy

docker run \
    -e STORE_MODEL_IN_DB=True \
    -p 4000:4000 \
    ghcr.io/berriai/litellm:main-v1.71.1-nightly
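
To opt in to the feature-flagged aiohttp transport from #11132, a sketch assuming the flag is exposed as the USE_AIOHTTP_TRANSPORT environment variable (verify the exact name in the PR and docs before relying on it):

# Same run command as above, plus the (assumed) opt-in flag for the aiohttp transport
docker run \
    -e STORE_MODEL_IN_DB=True \
    -e USE_AIOHTTP_TRANSPORT=True \
    -p 4000:4000 \
    ghcr.io/berriai/litellm:main-v1.71.1-nightly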

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 250.0 | 271.62 | 6.12 | 0.0 | 1832 | 0 | 215.75 | 1968.65 |
| Aggregated | Passed ✅ | 250.0 | 271.62 | 6.12 | 0.0 | 1832 | 0 | 215.75 | 1968.65 |
