BerriAI/litellm v1.35.24


🚨 UI Issue - INVESTIGATING. Use v1.35.23 for the last stable UI release.

Full Changelog: v1.35.24.dev1...v1.35.24

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

LiteLLM Proxy Load Test Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 57 | 62.57754433107982 | 1.482967464223559 | 0.0 | 444 | 0 | 50.958461999982774 | 817.4045920000026 |
| /health/liveliness | Passed ✅ | 40 | 43.222332060980996 | 15.664678845064172 | 0.003340016811314322 | 4690 | 1 | 38.411224000014954 | 999.1422629999818 |
| /health/readiness | Passed ✅ | 41 | 43.62012660392403 | 15.490997970875826 | 0.003340016811314322 | 4638 | 1 | 38.66341999997758 | 1356.7161670000019 |
| Aggregated | Passed ✅ | 41 | 44.29055610294732 | 32.638644280163554 | 0.006680033622628644 | 9772 | 2 | 38.411224000014954 | 1356.7161670000019 |
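As a quick sanity check on the numbers above, the "Aggregated" row for the additive columns (requests/s, failures/s, request count, failure count) should equal the sum of the three per-endpoint rows. A minimal sketch, with the values copied from the load-test results:

```python
# Per-endpoint load-test rows from the table above:
# (requests/s, failures/s, request count, failure count)
rows = {
    "/chat/completions": (1.482967464223559, 0.0, 444, 0),
    "/health/liveliness": (15.664678845064172, 0.003340016811314322, 4690, 1),
    "/health/readiness": (15.490997970875826, 0.003340016811314322, 4638, 1),
}

# Sum each additive column; the result should match the "Aggregated" row.
agg_rps = sum(r[0] for r in rows.values())       # ≈ 32.638644280163554
agg_fps = sum(r[1] for r in rows.values())       # ≈ 0.006680033622628644
agg_requests = sum(r[2] for r in rows.values())  # 9772
agg_failures = sum(r[3] for r in rows.values())  # 2

print(agg_rps, agg_fps, agg_requests, agg_failures)
```

Note that the median, min, and max response-time columns are not additive: the aggregated min/max are the min/max across all endpoints, and the aggregated median is computed over the pooled samples.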
