github BerriAI/litellm v1.38.4


Full Changelog: v1.38.3...v1.38.4

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.38.4
```
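Once the container is up, the proxy speaks the OpenAI-compatible API on the mapped port. A minimal stdlib-only sketch of a request, assuming the proxy is reachable at `localhost:4000` and that `gpt-3.5-turbo` is one of the models you have configured (both are assumptions, not part of this release):

```python
import json
import urllib.request

# Assumption: the proxy from the docker command above is listening here.
PROXY_URL = "http://localhost:4000/chat/completions"


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload for the proxy."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def send(payload: dict) -> dict:
    """POST the payload to the running proxy and return the parsed JSON reply."""
    req = urllib.request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


payload = build_chat_request("gpt-3.5-turbo", "Hello from the LiteLLM proxy!")
# send(payload)  # uncomment once the container above is running
```

Any OpenAI-compatible client (e.g. the official `openai` SDK pointed at `base_url="http://localhost:4000"`) works the same way.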

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 8 | 10.442527492812866 | 1.6264559424753455 | 1.6264559424753455 | 487 | 487 | 6.631490000017948 | 205.98868900000866 |
| /health/liveliness | Failed ❌ | 8 | 10.048330618204664 | 15.666745022899066 | 15.666745022899066 | 4691 | 4691 | 6.471762000018089 | 410.1658490000091 |
| /health/readiness | Failed ❌ | 8 | 10.116838345250969 | 15.506437250334761 | 15.506437250334761 | 4643 | 4643 | 6.338718999984394 | 276.63078300000166 |
| Aggregated | Failed ❌ | 8 | 10.100265783117646 | 32.79963821570917 | 32.79963821570917 | 9821 | 9821 | 6.338718999984394 | 410.1658490000091 |
