github BerriAI/litellm v1.35.18


🚨 Known issue: the 'simple-shuffle' routing strategy's async migration is missing the random + TPM-weighted shuffle. This was added back in v1.35.20.
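For context, the missing behavior concerns picking a deployment at random, optionally weighted by each deployment's TPM (tokens-per-minute) limit. A minimal self-contained sketch of that idea (the function and field names here are illustrative, not litellm's internals):

```python
import random

def weighted_shuffle_pick(deployments, rng=random):
    """Pick one deployment at random, weighted by its 'tpm' limit.

    deployments: list of dicts, each optionally carrying a 'tpm' key.
    Falls back to a uniform random pick when no TPM weights are set.
    """
    weights = [d.get("tpm", 0) for d in deployments]
    if sum(weights) == 0:
        # No TPM limits configured: plain random shuffle.
        return rng.choice(deployments)
    # TPM limits configured: deployments with larger budgets
    # are proportionally more likely to be selected.
    return rng.choices(deployments, weights=weights, k=1)[0]

# Example: a pool where one deployment has 3x the TPM budget.
pool = [
    {"name": "azure-eastus", "tpm": 30000},
    {"name": "azure-westus", "tpm": 10000},
]
picked = weighted_shuffle_pick(pool)
```

Over many requests, `azure-eastus` would be chosen roughly three times as often as `azure-westus`.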

What's Changed

New Contributors

Full Changelog: v1.35.17...v1.35.18

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 81 | 84.52 | 1.54 | 0.0 | 462 | 0 | 75.14 | 512.15 |
| /health/liveliness | Passed ✅ | 66 | 67.80 | 15.09 | 0.0 | 4516 | 0 | 63.42 | 575.86 |
| /health/readiness | Passed ✅ | 66 | 68.59 | 15.57 | 0.0 | 4662 | 0 | 63.41 | 1172.49 |
| Aggregated | Passed ✅ | 66 | 68.98 | 32.20 | 0.0 | 9640 | 0 | 63.41 | 1172.49 |
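As a sanity check, the Aggregated row is the sum of the per-endpoint rows. A quick verification of the request counts and throughput figures from the table:

```python
# Per-endpoint figures copied from the load test table above.
rows = {
    "/chat/completions": {"requests": 462, "rps": 1.5432420439247365},
    "/health/liveliness": {"requests": 4516, "rps": 15.085023961827078},
    "/health/readiness": {"requests": 4662, "rps": 15.57271517051325},
}

total_requests = sum(r["requests"] for r in rows.values())
total_rps = sum(r["rps"] for r in rows.values())

print(total_requests)        # 9640, matching the Aggregated request count
print(round(total_rps, 2))   # 32.2, matching the Aggregated requests/s
```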
