What's Changed
Full Changelog: v1.37.13...v1.37.14
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.37.14
```
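Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. Below is a minimal sketch of a request against it; the model name (`gpt-3.5-turbo`) and the `Authorization` key are placeholders and depend on the models and keys configured on your proxy.

```shell
# Minimal sketch: call the locally running proxy's OpenAI-compatible
# /chat/completions endpoint. Model name and API key are placeholders --
# substitute whatever you have configured on your proxy.
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello from the LiteLLM proxy"}]
  }'
```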
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms)
---|---|---|---|---|---|---|---|---|---
/chat/completions | Failed ❌ | 9 | 11.89 | 1.63 | 1.63 | 488 | 488 | 7.55 | 178.09
/health/liveliness | Failed ❌ | 8 | 10.90 | 15.53 | 15.53 | 4650 | 4650 | 6.33 | 907.12
/health/readiness | Failed ❌ | 8 | 11.16 | 15.69 | 15.69 | 4697 | 4697 | 6.46 | 1189.81
Aggregated | Failed ❌ | 8 | 11.07 | 32.84 | 32.84 | 9835 | 9835 | 6.33 | 1189.81