BerriAI/litellm v1.35.23


What's Changed

Full Changelog: v1.35.21...v1.35.23

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 95 | 104.34841019318016 | 1.469716999607524 | 0.0 | 440 | 0 | 86.94174600003635 | 1376.3201009999761 |
| /health/liveliness | Passed ✅ | 78 | 80.76396643555898 | 15.161466957314891 | 0.0066805318163978365 | 4539 | 2 | 73.81250000003092 | 1444.6542679999652 |
| /health/readiness | Passed ✅ | 78 | 80.15375961581503 | 15.50217407995118 | 0.0 | 4641 | 0 | 74.04167700002517 | 931.0259220000034 |
| Aggregated | Passed ✅ | 78 | 81.54828924251524 | 32.133358036873595 | 0.0066805318163978365 | 9620 | 2 | 73.81250000003092 | 1444.6542679999652 |
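As a sanity check on the table, the Aggregated row is simply the per-endpoint rows summed (request count, failure count, and requests/s). A minimal sketch verifying that, using the figures from the table above:

```python
# Per-endpoint figures copied from the load-test table.
endpoints = {
    "/chat/completions": {"rps": 1.469716999607524, "requests": 440, "failures": 0},
    "/health/liveliness": {"rps": 15.161466957314891, "requests": 4539, "failures": 2},
    "/health/readiness": {"rps": 15.50217407995118, "requests": 4641, "failures": 0},
}

# Sum each column across endpoints; these should match the Aggregated row.
total_requests = sum(e["requests"] for e in endpoints.values())
total_failures = sum(e["failures"] for e in endpoints.values())
total_rps = sum(e["rps"] for e in endpoints.values())

print(total_requests)          # 9620
print(total_failures)          # 2
print(round(total_rps, 4))     # 32.1334
```

The health-check endpoints dominate the request volume, so the aggregate median (78 ms) tracks them rather than /chat/completions.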
