BerriAI/litellm v1.35.0

Full Changelog: v1.34.42...v1.35.0

fix(caching.py): fix async batch redis get request by @krrishdholakia in 76bd667
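For context, a batched async Redis get typically collapses many per-key round trips into a single `MGET` call. The sketch below illustrates the pattern using `redis.asyncio` from redis-py; the function name, connection parameters, and key names are illustrative assumptions, not LiteLLM's actual caching implementation.

```python
# Minimal sketch of an async batched Redis GET using redis-py's asyncio client.
# This is an illustration of the general pattern, not LiteLLM's caching.py code.
import asyncio

import redis.asyncio as redis


async def async_batch_get(keys: list[str]) -> dict[str, str | None]:
    """Fetch many cache keys in one MGET round trip instead of one GET per key."""
    client = redis.Redis(host="localhost", port=6379, decode_responses=True)
    try:
        # MGET returns values in the same order as the requested keys;
        # missing keys come back as None.
        values = await client.mget(keys)
        return dict(zip(keys, values))
    finally:
        await client.aclose()


if __name__ == "__main__":
    # Hypothetical cache keys for demonstration.
    print(asyncio.run(async_batch_get(["cache:a", "cache:b"])))
```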

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 93 | 97.65 | 1.51 | 0.0 | 451 | 0 | 86.17 | 1209.68 |
| /health/liveliness | Passed ✅ | 79 | 80.39 | 15.31 | 0.0 | 4581 | 0 | 74.10 | 759.20 |
| /health/readiness | Passed ✅ | 79 | 80.52 | 15.50 | 0.0 | 4640 | 0 | 74.03 | 967.68 |
| Aggregated | Passed ✅ | 79 | 81.26 | 32.31 | 0.0 | 9672 | 0 | 74.03 | 1209.68 |