BerriAI/litellm v1.40.8


What's Changed

Client Side Fallbacks: https://docs.litellm.ai/docs/proxy/reliability#test---client-side-fallbacks

[Screenshot in original release: fallbacks.py client-side fallback example]
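With client-side fallbacks, the caller lists backup models in the request itself, and the proxy retries against them if the primary model errors. A minimal sketch following the linked docs, using the OpenAI Python SDK pointed at a local proxy; "zephyr-beta" and "gpt-3.5-turbo" are placeholders for models configured on your proxy:

```python
import openai

# Point the standard OpenAI client at the LiteLLM proxy.
client = openai.OpenAI(
    api_key="anything",  # replace with your proxy key if auth is enabled
    base_url="http://0.0.0.0:4000",
)

response = client.chat.completions.create(
    model="zephyr-beta",  # primary model (placeholder name)
    messages=[{"role": "user", "content": "this is a test request, write a short poem"}],
    # Client-side fallbacks: if "zephyr-beta" errors, the proxy retries the
    # same request against "gpt-3.5-turbo".
    extra_body={"fallbacks": ["gpt-3.5-turbo"]},
)
print(response)
```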

Full Changelog: v1.40.7...v1.40.8

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.40.8
```
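Once the container is up, you can smoke-test it with any OpenAI-compatible client. A minimal sketch, assuming the proxy listens on localhost:4000, "sk-1234" stands in for your master key, and "gpt-3.5-turbo" is a model you have configured:

```python
from openai import OpenAI

# Talk to the local LiteLLM proxy via its OpenAI-compatible API.
client = OpenAI(api_key="sk-1234", base_url="http://localhost:4000")

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder: any model configured on the proxy
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```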

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat


Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 140.0 | 169.11120714803027 | 6.281005310183787 | 0.0 | 1878 | 0 | 114.50119100004486 | 1457.4686270000257 |
| Aggregated | Passed ✅ | 140.0 | 169.11120714803027 | 6.281005310183787 | 0.0 | 1878 | 0 | 114.50119100004486 | 1457.4686270000257 |
