## What's Changed
- fix(utils.py): set `max_retries` to `num_retries` when provided, by @krrishdholakia in #5143 (see the sketch below)
- fix(litellm_logging.py): fix success callback invocation when `stream_options` is set, by @krrishdholakia in #5145
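For context, here is a minimal sketch of where the two parameters touched by these fixes surface in the Python SDK; the model name and the callback body are illustrative assumptions, not taken from the release:

```python
import litellm

def log_success(kwargs, completion_response, start_time, end_time):
    # Custom success callback: per #5145 this should now also fire for
    # streamed calls that set stream_options.
    print("logged:", kwargs.get("model"))

litellm.success_callback = [log_success]

response = litellm.completion(
    model="gpt-3.5-turbo",  # illustrative model name (assumption)
    messages=[{"role": "user", "content": "Hello"}],
    num_retries=2,          # per #5143, used as the retry count when provided
    stream=True,
    stream_options={"include_usage": True},
)
for chunk in response:      # drain the stream so the success callback can fire
    pass
```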
**Full Changelog**: v1.43.5...v1.43.6-stable
## Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.43.6-stable
```
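Once the container is up, the proxy exposes an OpenAI-compatible endpoint on port 4000. A minimal sketch using the `openai` Python client follows; the API key and model name are placeholder assumptions, and note that `STORE_MODEL_IN_DB=True` generally assumes a database (`DATABASE_URL`) is also configured:

```python
import openai

# Point the standard OpenAI client at the local LiteLLM proxy.
client = openai.OpenAI(
    api_key="sk-1234",                 # placeholder; use your proxy key
    base_url="http://localhost:4000",
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative; must match a model configured on the proxy
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```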
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 110.0 | 130.11 | 6.42 | 0.0 | 1919 | 0 | 97.57 | 934.88 |
| Aggregated | Passed ✅ | 110.0 | 130.11 | 6.42 | 0.0 | 1919 | 0 | 97.57 | 934.88 |