github BerriAI/litellm v1.35.36


🚨 🚨 Detected a performance issue with LiteLLM lowest-latency routing.

🚨 🚨 Last stable release: v1.35.32
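Until the lowest-latency routing issue is resolved, one workaround is to pin the proxy to a different routing strategy. A minimal sketch of a proxy `config.yaml`, assuming the documented `router_settings` schema (the model names, endpoint, and key reference below are illustrative):

```yaml
model_list:
  - model_name: gpt-4
    litellm_params:
      model: azure/gpt-4
      api_base: https://example-endpoint.openai.azure.com  # illustrative endpoint
      api_key: os.environ/AZURE_API_KEY

router_settings:
  # "latency-based-routing" is the strategy affected by the perf issue noted
  # above; "simple-shuffle" is litellm's default strategy.
  routing_strategy: simple-shuffle
```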

What's Changed

  • Fix route /openai/deployments/{model}/chat/completions not working properly by @msabramo in #3375
  • Litellm gh 3372 by @msabramo in #3402
  • Vision for Claude 3 Family + Info for Azure/GPT-4-0409 by @azohra in #3405
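The first fix above restores the Azure-style `/openai/deployments/{model}/chat/completions` route on the proxy. A small sketch of how a client might build that route, assuming a locally running proxy on port 4000 (`deployments_chat_url` is a hypothetical helper, not part of litellm):

```python
def deployments_chat_url(base_url: str, deployment: str) -> str:
    """Build the Azure-compatible chat completions route that PR #3375 fixed.

    `base_url` is the proxy's address; `deployment` is the model/deployment name.
    """
    return f"{base_url}/openai/deployments/{deployment}/chat/completions"


# Example: target a proxy at http://localhost:4000 with a deployment named "gpt-4".
url = deployments_chat_url("http://localhost:4000", "gpt-4")
print(url)  # http://localhost:4000/openai/deployments/gpt-4/chat/completions
```

A POST to this URL with an OpenAI-style chat payload should behave like the standard `/chat/completions` route.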

New Contributors

Full Changelog: v1.35.35.dev1...v1.35.36

Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat


Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 91 | 97.80587342499896 | 1.4697626815590719 | 0.0033403697308160727 | 440 | 1 | 85.02641000001177 | 1179.9364940000032 |
| /health/liveliness | Passed ✅ | 75 | 79.456162522843 | 15.135215250327624 | 0.0 | 4531 | 0 | 73.27406199999587 | 1342.852516000022 |
| /health/readiness | Passed ✅ | 75 | 78.28620318150432 | 15.49597518125576 | 0.0 | 4639 | 0 | 73.4411589999695 | 1271.3810550000062 |
| Aggregated | Passed ✅ | 75 | 79.73154560426654 | 32.10095311314246 | 0.0033403697308160727 | 9610 | 1 | 73.27406199999587 | 1342.852516000022 |
