# BerriAI/litellm v1.61.13-nightly

## What's Changed

  • LiteLLM Contributor PRs (02/18/2025) by @krrishdholakia in #8643
  • fix(utils.py): handle token counter error when invalid message passed in by @krrishdholakia in #8670 (see the sketch after this list)
  • (Bug fix) - Cache Health not working when configured with prometheus service logger by @ishaan-jaff in #8687
  • (Redis fix) - use mget_non_atomic by @ishaan-jaff in #8682
  • (Observability) - Add more detailed dd tracing on Proxy Auth, Bedrock Auth by @ishaan-jaff in #8693
  • (Infra/DB) - Allow running older litellm version when out of sync with current state of DB by @ishaan-jaff in #8695
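
To illustrate the token-counter fix (#8670), here is a minimal sketch of the call path it hardens, using litellm's public `token_counter` helper. The message payloads and model name below are illustrative assumptions, not taken from the PR.

```
# Minimal sketch, assuming `pip install litellm`. Per #8670, the token
# counter previously errored out when an invalid message was passed in;
# the fix handles that error path (exact behavior is defined by the PR).
import litellm

# Well-formed messages: counting works as expected.
valid = [{"role": "user", "content": "Hello, how are you?"}]
print(litellm.token_counter(model="gpt-3.5-turbo", messages=valid))

# Malformed message (content is not a string or content-block list):
# this is the kind of input that used to raise inside the counter.
invalid = [{"role": "user", "content": {"oops": "not text"}}]
try:
    print(litellm.token_counter(model="gpt-3.5-turbo", messages=invalid))
except Exception as exc:
    print(f"token_counter rejected the message: {exc}")
```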

Full Changelog: v1.61.11-nightly...v1.61.13-nightly

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.13-nightly
```
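
Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000. Below is a minimal sketch of calling it with the OpenAI Python SDK; the model name and `api_key` are assumptions (use whatever model and master key your proxy is actually configured with):

```
# Minimal sketch, assuming the proxy runs on localhost:4000 and a model
# named "gpt-3.5-turbo" is configured on it. "sk-1234" is a placeholder
# for your proxy's master or virtual key, not a shipped default.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)
```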

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Failed ❌ | 150.0 | 177.70 | 6.35 | 6.35 | 1900 | 1900 | 131.08 | 3605.69 |
| Aggregated | Failed ❌ | 150.0 | 177.70 | 6.35 | 6.35 | 1900 | 1900 | 131.08 | 3605.69 |
