github BerriAI/litellm v1.35.15

7 months ago

What's Changed

  • usage based routing v2 improvements - unit testing + NEW async + sync 'pre_call_checks' by @krrishdholakia in #3153

Full Changelog: v1.35.14...v1.35.15

Don't want to maintain your internal proxy? Get in touch πŸŽ‰ Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed βœ… | 61 | 64.26716269503412 | 1.4128166144037744 | 0.0 | 423 | 0 | 55.82698000000619 | 518.0516049999824 |
| /health/liveliness | Passed βœ… | 45 | 47.47650433469414 | 15.547662742433971 | 0.0 | 4655 | 0 | 42.33338000000231 | 1026.14588900002 |
| /health/readiness | Passed βœ… | 45 | 47.47785031392411 | 15.544322750437745 | 0.0 | 4654 | 0 | 42.32649900001206 | 1229.5602290000716 |
| Aggregated | Passed βœ… | 45 | 48.206951588471156 | 32.50480210727549 | 0.0 | 9732 | 0 | 42.32649900001206 | 1229.5602290000716 |
