BerriAI/litellm v1.35.32

What's Changed

  • feat(utils.py): unify common auth params across azure/vertex_ai/bedrock/watsonx by @krrishdholakia in #3331
  • protected_namespaces warning fixed for model_name & model_info by @CyanideByte in #3334
  • fix(utils.py): replicate now also has token based pricing for some models by @krrishdholakia in #3354
  • docs - update track cost with custom callbacks by @ishaan-jaff in #3359 (see the cost-tracking sketch below)
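
The last two items relate to per-request cost tracking. Below is a minimal sketch of logging the cost of each completion with a custom success callback, following the pattern in LiteLLM's cost-tracking docs; the callback name `track_cost_callback` and the `gpt-3.5-turbo` model are illustrative, and field names may vary between versions.

```python
import litellm

# Sketch: a success callback that logs the computed cost of each call.
# LiteLLM passes the per-request cost in kwargs["response_cost"].
def track_cost_callback(kwargs, completion_response, start_time, end_time):
    response_cost = kwargs.get("response_cost", 0)
    print(f"response_cost: {response_cost}")

# Register the callback so it runs after every successful completion.
litellm.success_callback = [track_cost_callback]

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
```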

New Contributors

Full Changelog: v1.35.31...v1.35.32

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 92 | 97.6487390778697 | 1.6300101877251756 | 0.0 | 488 | 0 | 85.23569400000497 | 1043.820304999997 |
| /health/liveliness | Passed ✅ | 77 | 81.78410904708275 | 15.394911793494536 | 0.010020554432736735 | 4609 | 3 | 73.86755400000311 | 1392.7269499999966 |
| /health/readiness | Passed ✅ | 78 | 79.75181014769441 | 14.994089616185068 | 0.003340184810912245 | 4489 | 1 | 74.02469199999473 | 1287.0916979999834 |
| Aggregated | Passed ✅ | 78 | 81.64003953901504 | 32.01901159740478 | 0.01336073924364898 | 9586 | 4 | 73.86755400000311 | 1392.7269499999966 |
