BerriAI/litellm v1.36.4


What's Changed

💵 Excited to launch End-User Cost Tracking on LiteLLM v1.36.4. Start here: https://docs.litellm.ai/docs/proxy/users
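
End-user spend is attributed per request. As a minimal sketch (assuming a LiteLLM proxy running at `http://localhost:4000` with a hypothetical virtual key), you pass the end user's ID in the standard OpenAI `user` field:

```python
# Minimal sketch: send an end-user ID with each request so the proxy
# can attribute spend to that user. The base_url, api_key, and
# "my-customer-id" below are hypothetical placeholders.
import openai

client = openai.OpenAI(
    api_key="sk-1234",                 # hypothetical proxy virtual key
    base_url="http://localhost:4000",  # assumed local proxy address
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
    user="my-customer-id",  # end-user ID the proxy tracks cost against
)
print(response.choices[0].message.content)
```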

🔨 [Fix] Fixed a cost-calculation issue in litellm.completion_cost()
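
For reference, here is a minimal sketch of the affected helper (assuming litellm is installed and an `OPENAI_API_KEY` is set in the environment):

```python
# Compute the dollar cost of a completion response.
import litellm

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hey, how are you?"}],
)

# completion_cost() reads the model and token usage off the response
# and returns the spend in USD.
cost = litellm.completion_cost(completion_response=response)
print(f"cost = ${cost:.6f}")
```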

🖼️ [UI] Added display of end-user usage on the Usage tab

🚨 [Feat] Added an alert when a deployment is cooled down
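
For context, cooldowns are configured on the router, and the new alert fires when a deployment trips those thresholds and is temporarily taken out of rotation. A minimal sketch (the model list and threshold values here are hypothetical):

```python
# Hypothetical router setup showing the cooldown knobs the alert relates to.
import litellm

router = litellm.Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "gpt-3.5-turbo"},
        }
    ],
    allowed_fails=3,   # failures tolerated before cooling a deployment down
    cooldown_time=30,  # seconds the deployment stays out of rotation
)
```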

🛠️ PR - Added support for the OpenAI stream_options parameter
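
A minimal sketch of the new pass-through (assuming an `OPENAI_API_KEY` in the environment): with OpenAI's `stream_options`, the final streamed chunk can carry token usage stats.

```python
# Request per-stream token usage via stream_options, now passed
# through by litellm.completion().
import litellm

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
    stream_options={"include_usage": True},  # final chunk includes usage
)

for chunk in response:
    print(chunk)
```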

Full Changelog: v1.36.3...v1.36.4

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 41 | 46.21 | 1.41 | 0.0 | 423 | 0 | 35.30 | 483.35 |
| /health/liveliness | Passed ✅ | 25 | 27.79 | 15.73 | 0.0 | 4711 | 0 | 23.18 | 1106.79 |
| /health/readiness | Passed ✅ | 26 | 27.88 | 15.38 | 0.0033 | 4604 | 1 | 23.45 | 1503.12 |
| Aggregated | Passed ✅ | 26 | 28.64 | 32.52 | 0.0033 | 9738 | 1 | 23.18 | 1503.12 |
