# BerriAI/litellm v1.61.13.rc

## What's Changed

  • LiteLLM Contributor PRs (02/18/2025) by @krrishdholakia in #8643
  • fix(utils.py): handle token counter error when invalid message passed in by @krrishdholakia in #8670 (see the sketch after this list)
  • (Bug fix) - Cache Health not working when configured with prometheus service logger by @ishaan-jaff in #8687
  • (Redis fix) - use mget_non_atomic by @ishaan-jaff in #8682
  • (Observability) - Add more detailed dd tracing on Proxy Auth, Bedrock Auth by @ishaan-jaff in #8693
  • (Infra/DB) - Allow running older litellm version when out of sync with current state of DB by @ishaan-jaff in #8695
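
The token counter fix above hardens `litellm.token_counter`, LiteLLM's helper for estimating how many prompt tokens a set of messages will consume. A minimal sketch of normal usage follows; the model name is an illustrative assumption, and exact counts depend on the model's tokenizer:

```python
# Minimal sketch of litellm.token_counter usage; the model name is an
# illustrative assumption, and exact counts vary by tokenizer.
import litellm

messages = [{"role": "user", "content": "Hello, how are you?"}]

# Estimate the prompt tokens these messages would consume for this model.
# v1.61.13.rc fixes the error raised when a malformed message is passed in.
num_tokens = litellm.token_counter(model="gpt-3.5-turbo", messages=messages)
print(f"Estimated prompt tokens: {num_tokens}")
```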

Full Changelog: v1.61.11-nightly...v1.61.13.rc

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.13.rc
```
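
Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000, so any OpenAI client can point at it. A minimal sketch, assuming a model named `gpt-3.5-turbo` is configured on the proxy and using a placeholder API key:

```python
# Minimal sketch of calling the proxy with the OpenAI SDK; the model name
# and API key are placeholder assumptions for a locally running proxy.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # the proxy from the docker run above
    api_key="sk-1234",                 # placeholder; use your proxy's key
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)
```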

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 150.0 | 182.1689693668056 | 6.422359014184563 | 6.422359014184563 | 1922 | 1922 | 131.7662689999679 | 3100.3508269999998 |
| Aggregated | Failed ❌ | 150.0 | 182.1689693668056 | 6.422359014184563 | 6.422359014184563 | 1922 | 1922 | 131.7662689999679 | 3100.3508269999998 |
