BerriAI/litellm v1.35.35


What's Changed

Full Changelog: v1.35.34...v1.35.35

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 41 | 51.26542280364234 | 1.6500892298075267 | 0.0 | 494 | 0 | 35.884482000028584 | 1109.5307290000278 |
| /health/liveliness | Passed ✅ | 25 | 28.873856892782293 | 15.919686779883952 | 0.0 | 4766 | 0 | 23.275566999984676 | 1375.641074000015 |
| /health/readiness | Passed ✅ | 26 | 28.245912157308112 | 15.288377337710626 | 0.003340261598800661 | 4577 | 1 | 23.47498500000711 | 1363.4175379999647 |
| Aggregated | Passed ✅ | 26 | 29.706156425739458 | 32.8581533474021 | 0.003340261598800661 | 9837 | 1 | 23.275566999984676 | 1375.641074000015 |
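
The columns above match a Locust-style stats report, so a script like the one below is one way to run a similar load test against your own deployment. This is a minimal sketch, not the exact harness used for these numbers: the proxy URL (http://localhost:4000), the virtual key (sk-1234), the model alias (gpt-3.5-turbo), and the task weights are all assumptions you should replace with your own values.

```python
# Minimal Locust sketch for load testing a LiteLLM proxy.
# Assumptions (not from the release notes): proxy on http://localhost:4000,
# a virtual key "sk-1234", and a model alias "gpt-3.5-turbo" in the proxy config.
from locust import HttpUser, task, between


class LiteLLMProxyUser(HttpUser):
    wait_time = between(0.5, 1.5)  # pause between requests per simulated user

    @task(1)
    def chat_completion(self):
        # OpenAI-compatible chat endpoint, routed through the proxy
        self.client.post(
            "/chat/completions",
            json={
                "model": "gpt-3.5-turbo",
                "messages": [{"role": "user", "content": "ping"}],
            },
            headers={"Authorization": "Bearer sk-1234"},
        )

    @task(10)
    def liveliness(self):
        # Lightweight liveness probe
        self.client.get("/health/liveliness")

    @task(10)
    def readiness(self):
        # Readiness probe: checks the proxy is ready to serve traffic
        self.client.get("/health/readiness")
```

Run it headless with something like `locust -f locustfile.py --host http://localhost:4000 --headless -u 10 -r 2 -t 1m`; the health probes are weighted higher than chat completions to roughly mirror the request mix in the table above.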
