BerriAI/litellm v1.35.29


Full Changelog: v1.35.28.dev1...v1.35.29

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 42 | 49.43 | 1.55 | 0.0 | 463 | 0 | 35.05 | 750.67 |
| /health/liveliness | Passed ✅ | 25 | 28.81 | 15.81 | 0.0 | 4734 | 0 | 23.20 | 1141.20 |
| /health/readiness | Passed ✅ | 26 | 28.34 | 15.31 | 0.0 | 4584 | 0 | 23.21 | 1207.46 |
| Aggregated | Passed ✅ | 26 | 29.56 | 32.67 | 0.0 | 9781 | 0 | 23.20 | 1207.46 |
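For reference, the per-endpoint columns above (median, average, requests/s, failures/s, count, min, max) can be derived from raw per-request latency samples. Below is a minimal, hypothetical sketch of that aggregation; the `summarize` helper and the sample latencies are illustrative, not part of LiteLLM's load-test tooling.

```python
from statistics import mean, median

def summarize(name, latencies_ms, duration_s, failures=0):
    """Aggregate per-request latency samples (ms) into load-test summary columns."""
    n = len(latencies_ms)
    return {
        "name": name,
        "median_ms": median(latencies_ms),   # Median Response Time (ms)
        "avg_ms": mean(latencies_ms),        # Average Response Time (ms)
        "rps": n / duration_s,               # Requests/s
        "failures_per_s": failures / duration_s,
        "count": n,                          # Request Count
        "failures": failures,                # Failure Count
        "min_ms": min(latencies_ms),
        "max_ms": max(latencies_ms),
    }

# Hypothetical samples; a real run records one latency per request.
stats = summarize("/chat/completions", [35.1, 42.0, 49.5, 61.2, 750.7],
                  duration_s=300)
print(stats["median_ms"], stats["count"])
```

A real harness would collect thousands of samples per endpoint over the test window and emit one such row per endpoint plus an aggregated row.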
