BerriAI/litellm v1.35.12


What's Changed

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Full Changelog: v1.35.11...v1.35.12

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 81 | 90.18032295762929 | 1.5764885124209485 | 0.0 | 472 | 0 | 75.0108780000005 | 1097.2126149999895 |
| /health/liveliness | Passed ✅ | 66 | 68.85966111970596 | 15.457603465008791 | 0.0 | 4628 | 0 | 63.27401099997587 | 1008.0632059999743 |
| /health/readiness | Passed ✅ | 66 | 69.48201319830875 | 15.006701030312122 | 0.003340018034790145 | 4493 | 1 | 63.539215999981025 | 1354.8842850000256 |
| Aggregated | Passed ✅ | 66 | 70.20017819222355 | 32.04079300774186 | 0.003340018034790145 | 9593 | 1 | 63.27401099997587 | 1354.8842850000256 |
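
For context, the sketch below shows one way to exercise the same proxy endpoints measured above. It assumes a LiteLLM proxy running locally on port 4000 with a placeholder API key and a model alias named `gpt-3.5-turbo` configured on the proxy; adjust these to match your deployment.

```python
# Minimal sketch of hitting the load-tested endpoints of a locally running
# LiteLLM proxy. BASE_URL, the API key, and the model alias are assumptions,
# not values from this release.
import requests
from openai import OpenAI

BASE_URL = "http://localhost:4000"  # assumed proxy address

# /health/liveliness and /health/readiness are simple GET health checks.
for path in ("/health/liveliness", "/health/readiness"):
    r = requests.get(f"{BASE_URL}{path}")
    print(path, r.status_code, r.text)

# /chat/completions is OpenAI-compatible, so the OpenAI SDK works against it.
client = OpenAI(base_url=BASE_URL, api_key="sk-1234")  # placeholder key
resp = client.chat.completions.create(
    model="gpt-3.5-turbo",  # hypothetical model alias on the proxy
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```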
