BerriAI/litellm v1.37.2


What's Changed

New Contributors

Full Changelog: v1.37.0.dev2_completion_cost...v1.37.2

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.37.2
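
Once the container is up, the proxy serves an OpenAI-compatible /chat/completions endpoint on port 4000 (the same route exercised in the load test below). A minimal request sketch follows; the model name and the sk-1234 master key are placeholders and must match whatever you have configured on your proxy:

# model name and master key below are placeholders; adjust to your proxy config
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello from the LiteLLM proxy"}]
  }'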

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 24 | 28.59593037362605 | 1.5197959088318929 | 1.5197959088318929 | 455 | 455 | 22.671621000029063 | 184.80915000003506 |
| /health/liveliness | Failed ❌ | 23 | 27.673046850246536 | 15.568722485858137 | 15.568722485858137 | 4661 | 4661 | 21.451024999976198 | 1771.8764150000084 |
| /health/readiness | Failed ❌ | 23 | 28.361425038412307 | 15.652227755574176 | 15.652227755574176 | 4686 | 4686 | 21.433796999986043 | 1998.6570389999656 |
| Aggregated | Failed ❌ | 23 | 28.044976272087183 | 32.74074615026421 | 32.74074615026421 | 9802 | 9802 | 21.433796999986043 | 1998.6570389999656 |
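
The /health/liveliness and /health/readiness routes hit in the load test can also be checked directly against a local deployment; a quick sketch, assuming the proxy started with the docker command above is listening on localhost:4000:

# liveliness and readiness probes (port assumed from the docker run command above)
curl http://localhost:4000/health/liveliness
curl http://localhost:4000/health/readiness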
