github BerriAI/litellm v1.36.0

4 months ago

What's Changed

New Contributors

Full Changelog: v1.35.38-stable...v1.36.0

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 81 | 88.20979605555442 | 1.623255922227879 | 0.0 | 486 | 0 | 75.38953300002049 | 1264.5359969999959 |
| /health/liveliness | Passed ✅ | 65 | 68.12845653229724 | 15.253929623075564 | 0.0 | 4567 | 0 | 63.39287800000193 | 1385.0202130000184 |
| /health/readiness | Passed ✅ | 65 | 68.59345058785526 | 15.511112145733067 | 0.0033400327617857596 | 4644 | 1 | 63.46367399999053 | 1491.452105999997 |
| Aggregated | Passed ✅ | 65 | 69.35759579210092 | 32.38829769103651 | 0.0033400327617857596 | 9697 | 1 | 63.39287800000193 | 1491.452105999997 |
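The Aggregated row combines the three per-endpoint rows: request counts, failure counts, requests/s, and failures/s are sums, while the min and max response times are the overall min and max. A quick sketch checking that against the figures in the table (the dict layout and variable names here are our own, not part of the load-test tooling):

```python
# Per-endpoint figures copied from the load-test table above.
endpoints = {
    "/chat/completions": dict(rps=1.623255922227879, requests=486, failures=0,
                              min_ms=75.38953300002049, max_ms=1264.5359969999959),
    "/health/liveliness": dict(rps=15.253929623075564, requests=4567, failures=0,
                               min_ms=63.39287800000193, max_ms=1385.0202130000184),
    "/health/readiness": dict(rps=15.511112145733067, requests=4644, failures=1,
                              min_ms=63.46367399999053, max_ms=1491.452105999997),
}

# Counts and rates sum across endpoints; response-time bounds take min/max.
total_requests = sum(e["requests"] for e in endpoints.values())  # 9697
total_failures = sum(e["failures"] for e in endpoints.values())  # 1
total_rps = sum(e["rps"] for e in endpoints.values())            # ~32.388
overall_min = min(e["min_ms"] for e in endpoints.values())       # 63.39287800000193
overall_max = max(e["max_ms"] for e in endpoints.values())       # 1491.452105999997
```

These reproduce the Aggregated row exactly (note the median is not derivable this way, since medians don't combine by simple arithmetic).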
