github BerriAI/litellm v1.53.4


What's Changed

  • (QOL fix) - remove duplicate code from datadog logger by @ishaan-jaff in #7013
  • (UI) Sub 1s Internal User Tab load time by @ishaan-jaff in #7007
  • (fix) allow gracefully handling DB connection errors on proxy by @ishaan-jaff in #7017
  • (refactor) - migrate router.deployment_callback_on_success to use StandardLoggingPayload by @ishaan-jaff in #7015
  • (fix) 'utf-8' codec can't encode characters error on OpenAI by @ishaan-jaff in #7018

Full Changelog: v1.53.3...v1.53.4

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.4
```
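Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal sketch of a `/chat/completions` request, using only the standard library — the API key and model name below are hypothetical placeholders, and the commented-out call assumes a model has already been configured on the proxy:

```python
import json
import urllib.request

# Hypothetical placeholders: substitute your proxy URL, key, and a
# model_name actually configured on your proxy.
PROXY_URL = "http://localhost:4000/chat/completions"
API_KEY = "sk-1234"
MODEL = "gpt-3.5-turbo"

# OpenAI-compatible request body.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Hello from LiteLLM Proxy"}],
}

request = urllib.request.Request(
    PROXY_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)

# Sending the request requires a running proxy; uncomment to try it:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response)["choices"][0]["message"]["content"])
```

Because the endpoint mirrors the OpenAI schema, any OpenAI SDK pointed at `http://localhost:4000` should work the same way.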

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 291.34812296252045 | 6.153959693113714 | 0.0 | 1841 | 0 | 223.70142199997645 | 2984.8669300000097 |
| Aggregated | Passed ✅ | 250.0 | 291.34812296252045 | 6.153959693113714 | 0.0 | 1841 | 0 | 223.70142199997645 | 2984.8669300000097 |
