BerriAI/litellm v1.74.9-stable


What's Changed

  • Litellm release notes 07 27 2025 p1 by @krrishdholakia in #13027
  • VertexAI - camelCase optional params for image generation + Anthropic - streaming: always ensure the assistant role is set only on the first chunk by @krrishdholakia in #12889
  • Bulk User Edit - additional improvements - edit all users + set 'no-default-models' on all users by @krrishdholakia in #12925
  • add X-Initiator header for GitHub Copilot to reduce premium requests by @ckoehler in #13016
  • docs - OpenWebUI: show how to include reasoning content for Gemini models (request sketch below) by @ishaan-jaff in #13060
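
For the Gemini reasoning item above, here is a minimal request sketch, assuming a LiteLLM proxy running on localhost:4000 with a Gemini model configured; the model name and the virtual key sk-1234 are placeholders. Supported models return the thinking text on choices[0].message.reasoning_content:

# assumes a Gemini model is configured on the proxy; sk-1234 is a placeholder key
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gemini/gemini-2.5-flash",
    "messages": [{"role": "user", "content": "What is 17 * 24?"}],
    "reasoning_effort": "low"
  }'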

New Contributors

Full Changelog: v1.74.9.rc-draft...v1.74.9-stable

Docker Run LiteLLM Proxy

docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.74.9-stable
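
Once the container is up, you can check that the proxy is alive and send a test request through it. A minimal sketch, assuming default settings and a model already added via the UI or config; the model name gpt-4o is a placeholder, and you would add an Authorization header if you have set a master key:

# liveliness probe
curl http://localhost:4000/health/liveliness

# test chat completion through the proxy (model name is a placeholder)
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from LiteLLM"}]
  }'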

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 110.0 | 151.03 | 6.45 | 0.0 | 1930 | 0 | 81.54 | 1408.09 |
| Aggregated | Passed ✅ | 110.0 | 151.03 | 6.45 | 0.0 | 1930 | 0 | 81.54 | 1408.09 |
