github BerriAI/litellm v1.36.1

4 months ago

🚨 Known Issue with Slack Alerting + Redis Cache on this Version

Last stable version: v1.36.0
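
For reference, the affected combination is a proxy configured with both Slack alerting and a Redis cache. A minimal sketch of that setup, assuming the standard `config.yaml` layout from the LiteLLM proxy docs (field names may differ by version):

```yaml
# Hypothetical config.yaml illustrating the combination hit by this issue.
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo

litellm_settings:
  cache: true
  cache_params:
    type: "redis"          # Redis cache — part of the affected combination

general_settings:
  alerting: ["slack"]      # Slack alerting — the other part
```

If your deployment uses both settings, stay on v1.36.0 until this is resolved.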

What's Changed

Full Changelog: v1.36.0...v1.36.1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 77 | 82.18 | 1.49 | 0.0 | 446 | 0 | 71.08 | 1285.61 |
| /health/liveliness | Passed ✅ | 61 | 65.20 | 15.38 | 0.0067 | 4607 | 2 | 59.00 | 1581.64 |
| /health/readiness | Passed ✅ | 62 | 64.48 | 15.51 | 0.0 | 4646 | 0 | 58.92 | 1383.03 |
| Aggregated | Passed ✅ | 62 | 65.63 | 32.39 | 0.0067 | 9699 | 2 | 58.92 | 1581.64 |
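
The summary columns above (median, average, requests/s, failures/s, min, max) can all be derived from raw per-request timings. A minimal sketch of that aggregation, assuming timings in milliseconds and a known test duration (the function name and shape are illustrative, not LiteLLM's actual tooling):

```python
import statistics


def summarize(timings_ms, duration_s, failures=0):
    """Compute load-test summary stats from per-request timings (ms).

    timings_ms: response times for all completed requests, in milliseconds.
    duration_s: total wall-clock duration of the test, in seconds.
    failures:   number of failed requests observed during the run.
    """
    return {
        "median_ms": statistics.median(timings_ms),
        "average_ms": statistics.fmean(timings_ms),
        "requests_per_s": len(timings_ms) / duration_s,
        "failures_per_s": failures / duration_s,
        "min_ms": min(timings_ms),
        "max_ms": max(timings_ms),
    }


# Example: three requests completed over a 3-second window.
stats = summarize([60, 62, 80], duration_s=3)
```

With the sample input above, the median is 62 ms, the average is ~67.33 ms, and the throughput is 1.0 requests/s.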
