BerriAI/litellm v1.34.38


What's Changed

  • [New Models] Add Voyage 2 embedding models by @ishaan-jaff in #2913
  • Fix issue #2832: add protected_namespaces to the Config class in utils.py, router.py, and completion.py to suppress the pydantic warning, by @unclecode in #2893
  • build(ui/admin.tsx): allow adding admins + upgrading viewer to admin by @krrishdholakia in #2919
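The Voyage 2 bullet above adds new embedding models. As a hedged sketch of how one of them might be exposed through the LiteLLM proxy, the fragment below uses the documented `model_list` config shape and LiteLLM's `os.environ/` convention for secrets; the exact model id `voyage/voyage-2` and the key name `VOYAGE_API_KEY` are assumptions, so check the release's model docs for the canonical values:

```yaml
model_list:
  - model_name: voyage-2            # alias clients send in requests
    litellm_params:
      model: voyage/voyage-2        # assumed provider-prefixed model id
      api_key: os.environ/VOYAGE_API_KEY  # read the key from the environment
```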
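For context on the protected_namespaces fix above: pydantic v2 warns when a model declares a field whose name starts with `model_`, because that prefix is reserved for pydantic's own namespace, and setting `protected_namespaces = ()` silences it. A minimal illustration (using pydantic v2's `ConfigDict` style; the PR itself sets the option on an inner `Config` class, which pydantic also accepts):

```python
import warnings

from pydantic import BaseModel, ConfigDict

with warnings.catch_warnings():
    warnings.simplefilter("error")  # promote any warning to an error for the demo

    class ModelInfo(BaseModel):
        # Without protected_namespaces=(), declaring a field named
        # "model_name" emits a UserWarning at class-definition time
        # because it collides with pydantic's reserved "model_" prefix.
        model_config = ConfigDict(protected_namespaces=())

        model_name: str

info = ModelInfo(model_name="gpt-4")
print(info.model_name)
```

Clearing the protected namespaces is safe here because the field only shadows the warning prefix, not any actual pydantic method.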

New Contributors

Full Changelog: v1.34.37.dev1...v1.34.38

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 80 | 85.67 | 1.59 | 0.0033 | 476 | 1 | 74.69 | 1233.74 |
| /health/liveliness | Passed ✅ | 66 | 68.48 | 15.21 | 0.0033 | 4555 | 1 | 63.37 | 1288.73 |
| /health/readiness | Passed ✅ | 66 | 68.20 | 15.29 | 0.0100 | 4578 | 3 | 63.61 | 1516.84 |
| Aggregated | Passed ✅ | 66 | 69.20 | 32.09 | 0.0167 | 9609 | 5 | 63.37 | 1516.84 |
