BerriAI/litellm v1.55.0.dev2


What's Changed

  • (feat) add response_time to StandardLoggingPayload - logged on datadog, gcs_bucket, s3_bucket etc by @ishaan-jaff in #7199
  • build(deps): bump nanoid from 3.3.7 to 3.3.8 in /ui by @dependabot in #7198
  • (Feat) DataDog Logger - Add HOSTNAME and POD_NAME to DataDog logs by @ishaan-jaff in #7189
  • (feat) add error_code, error_class, llm_provider to StandardLoggingPayload by @ishaan-jaff in #7200 (see the payload sketch after the changelog)
  • (docs) Document StandardLoggingPayload Spec by @ishaan-jaff in #7201
  • fix: Support WebP image format and avoid token calculation error by @ishaan-jaff in #7182
  • (feat) UI - Disable Usage Tab once SpendLogs is 1M+ Rows by @ishaan-jaff in #7208

Full Changelog: v1.55.0...v1.55.0.dev2
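
For consumers of the new StandardLoggingPayload fields (#7199, #7200), here is a minimal sketch of a custom callback that reads them. The `standard_logging_object` kwarg and the `CustomLogger` hooks follow the LiteLLM docs; the exact field names (`response_time`, `error_code`, `error_class`, `llm_provider`) are assumptions taken from the PR titles above, so check the spec documented in #7201 for the authoritative schema.

```python
import litellm
from litellm.integrations.custom_logger import CustomLogger

class PayloadInspector(CustomLogger):
    async def async_log_success_event(self, kwargs, response_obj, start_time, end_time):
        # StandardLoggingPayload rides along in kwargs per the LiteLLM docs
        payload = kwargs.get("standard_logging_object") or {}
        print("response_time:", payload.get("response_time"))  # field from #7199

    async def async_log_failure_event(self, kwargs, response_obj, start_time, end_time):
        payload = kwargs.get("standard_logging_object") or {}
        # error fields from #7200 (names assumed from the PR title)
        print("error_code:", payload.get("error_code"))
        print("error_class:", payload.get("error_class"))
        print("llm_provider:", payload.get("llm_provider"))

# register the callback so it fires on every request through litellm
litellm.callbacks = [PayloadInspector()]
```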

Docker Run LiteLLM Proxy

# STORE_MODEL_IN_DB=True lets the proxy persist models added via the UI/API in its database
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.55.0.dev2
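
Once the container is up, a quick smoke test, assuming the standard `openai` Python client and the proxy's OpenAI-compatible route on port 4000; the model name and API key below are placeholders for whatever your proxy is configured with:

```python
import openai

# Point the stock OpenAI client at the local LiteLLM proxy
client = openai.OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder: any model configured on the proxy
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```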

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 210.0 | 236.69 | 6.13 | 0.0 | 1835 | 0 | 175.70 | 4096.70 |
| Aggregated | Passed ✅ | 210.0 | 236.69 | 6.13 | 0.0 | 1835 | 0 | 175.70 | 4096.70 |
