BerriAI/litellm v1.55.1

What's Changed

  • (feat) add response_time to StandardLoggingPayload - logged on datadog, gcs_bucket, s3_bucket etc by @ishaan-jaff in #7199
  • build(deps): bump nanoid from 3.3.7 to 3.3.8 in /ui by @dependabot in #7198
  • (Feat) DataDog Logger - Add HOSTNAME and POD_NAME to DataDog logs by @ishaan-jaff in #7189 (see the sketch after this list)
  • (feat) add error_code, error_class, llm_provider to StandardLoggingPayload by @ishaan-jaff in #7200
  • (docs) Document StandardLoggingPayload Spec by @ishaan-jaff in #7201
  • fix: Support WebP image format and avoid token calculation error by @ishaan-jaff in #7182
  • (feat) UI - Disable Usage Tab once SpendLogs is 1M+ Rows by @ishaan-jaff in #7208
  • (minor fix proxy) Clarify Proxy Rate limit errors are showing hash of litellm virtual key by @ishaan-jaff in #7210
  • (fix) latency fix - revert prompt caching check on litellm router by @ishaan-jaff in #7211
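
For the DataDog change in #7189, a minimal sketch of passing the new fields through the container environment. This is an illustration, not an official recipe: it assumes DataDog logging is already enabled in your proxy config (e.g. success_callback: ["datadog"]), that DD_API_KEY and DD_SITE hold your DataDog credentials, and the POD_NAME value shown is a placeholder.

# HOSTNAME and POD_NAME are the fields #7189 attaches to DataDog logs;
# DD_API_KEY / DD_SITE are the usual DataDog logger credentials.
docker run \
  -e DD_API_KEY=<your-datadog-api-key> \
  -e DD_SITE=datadoghq.com \
  -e HOSTNAME=$(hostname) \
  -e POD_NAME=litellm-proxy-0 \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.55.1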

Full Changelog: v1.55.0...v1.55.1

Docker Run LiteLLM Proxy

docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.55.1
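
Once the container is up, the proxy serves an OpenAI-compatible API on port 4000 (the /chat/completions endpoint is the one exercised in the load test below). A minimal request sketch; the model name and the sk-1234 key are placeholders, not part of this release:

# Send a test completion request to the locally running proxy.
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello"}]}'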

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 250.0 | 274.18 | 6.17 | 0.0 | 1846 | 0 | 212.15 | 2203.36 |
| Aggregated | Passed ✅ | 250.0 | 274.18 | 6.17 | 0.0 | 1846 | 0 | 212.15 | 2203.36 |
