# BerriAI/litellm v1.61.2-nightly


## What's Changed

Full Changelog: v1.61.1...v1.61.2-nightly

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.2-nightly
```
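
Once the container is up, you can sanity-check the proxy by hitting its OpenAI-compatible `/chat/completions` endpoint on port 4000. A minimal sketch; the model name `gpt-4o` and the key `sk-1234` are placeholders you would replace with a model and virtual key configured on your own proxy:

```
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```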

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 180.0 | 216.33586769555853 | 6.245273580245063 | 6.245273580245063 | 1869 | 1869 | 145.7912179999994 | 3665.8740830000056 |
| Aggregated | Failed ❌ | 180.0 | 216.33586769555853 | 6.245273580245063 | 6.245273580245063 | 1869 | 1869 | 145.7912179999994 | 3665.8740830000056 |
