BerriAI/litellm v1.35.26


What's Changed

Full Changelog: v1.35.25...v1.35.26

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 81 | 88.94697354090745 | 1.469613483633916 | 0.0 | 440 | 0 | 75.7317819999912 | 1133.430020999981 |
| /health/liveliness | Passed ✅ | 65 | 68.23953488817554 | 15.083578391115374 | 0.0 | 4516 | 0 | 63.34921199999144 | 1308.4431269999754 |
| /health/readiness | Passed ✅ | 66 | 68.74088326275232 | 15.584582987808755 | 0.0 | 4666 | 0 | 63.6032900000032 | 1463.0823009999858 |
| Aggregated | Passed ✅ | 66 | 69.42957485107057 | 32.13777486255805 | 0.0 | 9622 | 0 | 63.34921199999144 | 1463.0823009999858 |
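
For context, the metric columns above (median/average response time, requests/s, failures/s) match Locust-style load test output. Below is a minimal, hypothetical sketch of a comparable test against a self-hosted proxy; the host, model alias, and API key are placeholder assumptions, not values from this release.

```python
# Hypothetical load test sketch against a locally running LiteLLM proxy.
# Endpoints (/chat/completions, /health/liveliness, /health/readiness) come
# from the results table above; host, model, and key are placeholders.
from locust import HttpUser, task, between


class LiteLLMProxyUser(HttpUser):
    host = "http://localhost:4000"  # assumed proxy address
    wait_time = between(0.5, 1.5)

    @task(1)
    def chat_completions(self):
        # OpenAI-compatible completion request routed through the proxy.
        self.client.post(
            "/chat/completions",
            json={
                "model": "gpt-3.5-turbo",  # placeholder model alias
                "messages": [{"role": "user", "content": "ping"}],
            },
            headers={"Authorization": "Bearer sk-placeholder"},
        )

    @task(10)
    def health_liveliness(self):
        self.client.get("/health/liveliness")

    @task(10)
    def health_readiness(self):
        self.client.get("/health/readiness")
```

Run headless with something like `locust -f loadtest.py --headless -u 50 -r 10 -t 5m` and compare the resulting stats against the table above.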
