github BerriAI/litellm v1.35.17


What's Changed

  • [Fix] `completion(model="gemini/gemini-pro-1.5-latest")` raises Exception by @ishaan-jaff in #3186
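For context, the fixed call looks like the sketch below. This is an illustration, not code from the PR: the prompt text, the `build_request` helper, and the `GEMINI_API_KEY` guard are assumptions; only `litellm.completion()` and the `gemini/gemini-pro-1.5-latest` model alias come from the release notes.

```python
"""Sketch of the call affected by the fix in #3186 (illustrative only)."""
import os

# The Gemini model alias that previously raised an Exception.
MODEL = "gemini/gemini-pro-1.5-latest"

def build_request(prompt: str) -> dict:
    # litellm.completion() takes OpenAI-style keyword arguments:
    # a model string plus a list of chat messages.
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }

if __name__ == "__main__" and os.environ.get("GEMINI_API_KEY"):
    # Imported lazily so the sketch can be read/tested without the package.
    import litellm

    resp = litellm.completion(**build_request("Say hello in one word."))
    print(resp.choices[0].message.content)
```

On versions before v1.35.17 the `completion()` call above could raise an Exception for this model alias; with this release it should route normally.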

Full Changelog: v1.35.16...v1.35.17

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 93 | 101.54 | 1.49 | 0.0033 | 447 | 1 | 84.96 | 1145.11 |
| /health/liveliness | Passed ✅ | 78 | 80.17 | 15.04 | 0.0 | 4503 | 0 | 73.89 | 1396.90 |
| /health/readiness | Passed ✅ | 78 | 80.22 | 15.30 | 0.0 | 4580 | 0 | 73.93 | 1436.81 |
| Aggregated | Passed ✅ | 78 | 81.20 | 31.84 | 0.0033 | 9530 | 1 | 73.89 | 1436.81 |
