github BerriAI/litellm v1.36.2-stable


Full Changelog: v1.36.2...v1.36.2-stable

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 82 | 86.14350377301878 | 1.5598228029353671 | 0.0 | 467 | 0 | 75.38395299997092 | 742.260956999985 |
| /health/liveliness | Passed ✅ | 66 | 68.56318463091652 | 15.38446216342677 | 0.006680183310215706 | 4606 | 2 | 46.13641599999596 | 1770.3959170000303 |
| /health/readiness | Passed ✅ | 66 | 68.15400066320011 | 15.093874189432386 | 0.0 | 4519 | 0 | 63.42066399997748 | 1238.310568000088 |
| Aggregated | Passed ✅ | 66 | 69.2263317002717 | 32.038159155794524 | 0.006680183310215706 | 9592 | 2 | 46.13641599999596 | 1770.3959170000303 |
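The Aggregated row sums the per-endpoint throughput, request, and failure figures. A quick sanity check in Python, with the values copied verbatim from the table above:

```python
# Per-endpoint load-test figures, copied from the results table.
endpoints = {
    "/chat/completions":  {"rps": 1.5598228029353671, "requests": 467,  "failures": 0},
    "/health/liveliness": {"rps": 15.38446216342677,  "requests": 4606, "failures": 2},
    "/health/readiness":  {"rps": 15.093874189432386, "requests": 4519, "failures": 0},
}

# Summing across endpoints reproduces the Aggregated row.
total_rps = sum(e["rps"] for e in endpoints.values())
total_requests = sum(e["requests"] for e in endpoints.values())
total_failures = sum(e["failures"] for e in endpoints.values())

print(total_rps)       # ~32.0382, the aggregated Requests/s
print(total_requests)  # 9592
print(total_failures)  # 2
```

Note that the aggregated median and min/max columns are not sums: the min/max are taken over all endpoints, and the median is computed over the pooled samples.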
