# BerriAI/litellm stable


Full Changelog: v.1.32.34-stable...stable

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 56 | 64.83296719425923 | 1.5130259395707801 | 0.0 | 453 | 0 | 51.172277999967264 | 1483.2017450000023 |
| /health/liveliness | Passed ✅ | 40 | 42.34952875849499 | 15.627921349341458 | 0.0 | 4679 | 0 | 38.26479700001073 | 999.4194509999943 |
| /health/readiness | Passed ✅ | 41 | 43.801943172063055 | 15.78156195247668 | 0.0033400131116352763 | 4725 | 1 | 38.43308799997658 | 1401.527327999986 |
| Aggregated | Passed ✅ | 41 | 44.07902614263921 | 32.92250924138892 | 0.0033400131116352763 | 9857 | 1 | 38.26479700001073 | 1483.2017450000023 |
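The columns above (median/average latency, requests per second, failures per second, per-endpoint counts) match the shape of standard Locust statistics. A minimal sketch of how a comparable load test could be driven against a locally running LiteLLM proxy is shown below; the host/port, model name, API key, and task weights are illustrative assumptions, not details taken from this release.

```python
# Hypothetical Locust file for load-testing a LiteLLM proxy.
# Endpoint paths mirror the table above; the model name, API key,
# and task weights are placeholders chosen for illustration only.
from locust import HttpUser, task, between


class LiteLLMProxyUser(HttpUser):
    # Simulated users pause 0.5-1.5 s between requests
    wait_time = between(0.5, 1.5)

    @task(1)
    def chat_completions(self):
        # Minimal OpenAI-compatible chat completion request
        self.client.post(
            "/chat/completions",
            json={
                "model": "gpt-3.5-turbo",  # placeholder model name
                "messages": [{"role": "user", "content": "ping"}],
            },
            headers={"Authorization": "Bearer sk-placeholder"},  # placeholder key
        )

    @task(10)
    def health_liveliness(self):
        # Lightweight liveness probe, hit far more often than chat
        self.client.get("/health/liveliness")

    @task(10)
    def health_readiness(self):
        # Readiness probe, same weight as liveness
        self.client.get("/health/readiness")
```

Running something like `locust -f locustfile.py --host http://localhost:4000 --headless -u 50 -r 10 -t 5m` (standard Locust CLI flags; the port and user counts are assumptions) would emit per-endpoint statistics in the same format as the results table.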
