# BerriAI/litellm v1.61.3-stable

Full Changelog: v1.61.3...v1.61.3-stable

## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.61.3-stable
```
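Once the container is up, you can sanity-check the proxy with a quick request. A minimal sketch, assuming the port mapping above and a model already configured on the proxy (the `gpt-3.5-turbo` model name here is illustrative, not from this release; substitute whatever your proxy config exposes):

```shell
# Hit the proxy's OpenAI-compatible chat completions endpoint on localhost:4000
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello from LiteLLM Proxy"}]
  }'
```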

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 150.0 | 178.81 | 6.24 | 6.24 | 1867 | 1867 | 130.88 | 2746.11 |
| Aggregated | Failed ❌ | 150.0 | 178.81 | 6.24 | 6.24 | 1867 | 1867 | 130.88 | 2746.11 |
