github BerriAI/litellm v1.32.4


Full Changelog: v1.32.3...v1.32.4

🚨 Nightly Build - We noticed testing was flaky on this release

fix(proxy/utils.py): move to batch writing db updates by @krrishdholakia in #2561
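The idea behind this fix can be sketched roughly as follows: rather than issuing one database write per request, updates are buffered in memory and flushed in batches, cutting per-request DB load. This is a minimal illustrative sketch, not LiteLLM's actual implementation; the `BatchDBWriter` class and its methods are hypothetical, and the list of flushed batches stands in for real bulk DB writes.

```python
# Illustrative sketch of batched DB writes (hypothetical class, not
# LiteLLM's actual proxy/utils.py code).
import threading


class BatchDBWriter:
    def __init__(self, flush_size=100):
        self.flush_size = flush_size
        self._buffer = []
        self._lock = threading.Lock()
        self.flushed_batches = []  # stands in for real bulk INSERT/UPDATEs

    def add_update(self, update):
        # Buffer the update; flush once the batch is full.
        with self._lock:
            self._buffer.append(update)
            if len(self._buffer) >= self.flush_size:
                self._flush_locked()

    def _flush_locked(self):
        # Caller must hold self._lock. One batch == one DB round trip.
        if self._buffer:
            self.flushed_batches.append(list(self._buffer))
            self._buffer.clear()

    def flush(self):
        # Force out any partially filled batch (e.g. on a timer or shutdown).
        with self._lock:
            self._flush_locked()


writer = BatchDBWriter(flush_size=3)
for i in range(7):
    writer.add_update({"key": f"user-{i}", "spend": 0.01})
writer.flush()
print([len(b) for b in writer.flushed_batches])  # 7 updates -> batches of 3, 3, 1
```

In a real proxy, `flush()` would typically also run on a background timer so a quiet period cannot strand buffered updates indefinitely.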

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 88 | 94.56 | 1.60 | 0.0 | 480 | 0 | 83.97 | 1119.28 |
| /health/liveliness | Passed ✅ | 66 | 68.92 | 14.87 | 0.0 | 4452 | 0 | 63.54 | 1227.71 |
| /health/readiness | Passed ✅ | 66 | 68.07 | 15.44 | 0.0 | 4622 | 0 | 63.49 | 1315.27 |
| Aggregated | Passed ✅ | 66 | 69.80 | 31.91 | 0.0 | 9554 | 0 | 63.49 | 1315.27 |
