github BerriAI/litellm v1.35.19


What's Changed

Proxy alerting docs: https://docs.litellm.ai/docs/proxy/alerting
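
The linked docs cover the proxy's Slack-based alerting (hanging requests, failures, budget alerts). As a rough illustration of the mechanism that flow relies on, the sketch below shows a generic POST to a Slack incoming webhook; it is not LiteLLM's internal code, and `SLACK_WEBHOOK_URL` is an assumed environment-variable name.

```python
# Illustrative only: a generic Slack incoming-webhook alert, the delivery
# mechanism the proxy alerting docs describe. SLACK_WEBHOOK_URL is assumed.
import os
import requests


def send_slack_alert(message: str) -> None:
    """POST a plain-text alert to a Slack incoming webhook."""
    webhook_url = os.environ["SLACK_WEBHOOK_URL"]
    resp = requests.post(webhook_url, json={"text": message}, timeout=10)
    resp.raise_for_status()


if __name__ == "__main__":
    send_slack_alert("LiteLLM proxy: request hanging longer than expected")
```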

Full Changelog: v1.35.18...v1.35.19

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 41 | 46.08613956783032 | 1.7233151155653574 | 0.0 | 516 | 0 | 36.1015739999857 | 419.30066000000465 |
| /health/liveliness | Passed ✅ | 25 | 28.66264168568902 | 15.215937338208855 | 0.0 | 4556 | 0 | 23.215592999974888 | 1240.9464259999936 |
| /health/readiness | Passed ✅ | 25 | 28.984080272844423 | 15.569951683654452 | 0.0 | 4662 | 0 | 23.317095999999538 | 1266.7914550000319 |
| Aggregated | Passed ✅ | 25 | 29.74021222200552 | 32.50920413742866 | 0.0 | 9734 | 0 | 23.215592999974888 | 1266.7914550000319 |
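
The endpoints in the table are the proxy's OpenAI-compatible chat route and its health probes. A minimal sketch of exercising them, assuming the proxy is running at http://localhost:4000 and `sk-1234` is a placeholder virtual key (neither value comes from this release):

```python
# Minimal sketch of hitting the load-tested endpoints; the base URL and the
# "sk-1234" key are placeholders, not values from this release.
import requests
from openai import OpenAI

BASE_URL = "http://localhost:4000"

# /chat/completions through the OpenAI-compatible client
client = OpenAI(base_url=BASE_URL, api_key="sk-1234")
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)

# /health/liveliness and /health/readiness are plain GET probes
for path in ("/health/liveliness", "/health/readiness"):
    print(path, requests.get(f"{BASE_URL}{path}", timeout=5).status_code)
```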
