BerriAI/litellm v1.35.21-stable


What's Changed

Full Changelog: v1.35.21...v1.35.21-stable

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 79 | 85.25828748870606 | 1.6266446092244486 | 0.0 | 487 | 0 | 71.95533700001988 | 603.0785590000107 |
| /health/liveliness | Passed ✅ | 62 | 64.19436629612446 | 15.170882577202148 | 0.0033401326678120097 | 4542 | 1 | 59.702757999986034 | 1532.6318220000132 |
| /health/readiness | Passed ✅ | 62 | 65.19207891768694 | 15.541637303329281 | 0.0033401326678120097 | 4653 | 1 | 59.73732999999015 | 1305.427782999999 |
| Aggregated | Passed ✅ | 62 | 65.73335477463316 | 32.33916448975588 | 0.006680265335624019 | 9682 | 2 | 59.702757999986034 | 1532.6318220000132 |
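
The load test above exercises the proxy's OpenAI-compatible `/chat/completions` route and its two health probes. As a minimal sketch of hitting the same endpoints yourself (assuming a proxy running locally on port 4000 with a placeholder virtual key and an illustrative model name, none of which come from this release):

```python
# Minimal sketch: call the same endpoints measured in the load test table.
# Assumes a LiteLLM proxy at http://localhost:4000; key and model are placeholders.
import requests
from openai import OpenAI

PROXY_URL = "http://localhost:4000"  # assumed local proxy address
API_KEY = "sk-1234"                  # placeholder virtual key

# /chat/completions via the OpenAI-compatible client pointed at the proxy
client = OpenAI(base_url=PROXY_URL, api_key=API_KEY)
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # whichever model is configured on the proxy
    messages=[{"role": "user", "content": "Hello from the proxy"}],
)
print(response.choices[0].message.content)

# Health probes from the table above
print(requests.get(f"{PROXY_URL}/health/liveliness").status_code)
print(requests.get(f"{PROXY_URL}/health/readiness").status_code)
```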
