BerriAI/litellm v1.35.5


Full Changelog: 1.35.5.dev2...v1.35.5

Call 100+ LLMs, run /health checks on the Admin UI
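
The proxy's /health endpoint pings every configured model and reports which deployments are up. A minimal sketch of calling it, assuming the proxy is running on its default port (4000) and your master key is in the LITELLM_MASTER_KEY environment variable:

```python
import os

import requests

# Assumes the LiteLLM proxy is running locally on its default port (4000)
# and LITELLM_MASTER_KEY holds your proxy master key.
PROXY_BASE_URL = "http://localhost:4000"
MASTER_KEY = os.environ["LITELLM_MASTER_KEY"]

# /health checks every model configured on the proxy and reports
# healthy vs. unhealthy deployments.
resp = requests.get(
    f"{PROXY_BASE_URL}/health",
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # e.g. {"healthy_endpoints": [...], "unhealthy_endpoints": [...]}
```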

👉 Edit + Test @langfuse and @slackhq configurations on the LiteLLM UI

🛠️ UI - fix adding Azure OpenAI on the Admin UI

⚡️ [Fix] Load proxy models when the proxy starts up
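
The models in question are the ones declared in the proxy's config.yaml. A minimal sketch of such a config, where the deployment names, api_base, and environment variables are illustrative placeholders, not values from this release:

```yaml
# config.yaml - the model_list here is what the proxy loads at startup.
# Deployment names, api_base, and env vars below are illustrative placeholders.
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: azure/my-azure-deployment
      api_base: https://my-endpoint.openai.azure.com/
      api_key: os.environ/AZURE_API_KEY
  - model_name: claude-3-haiku
    litellm_params:
      model: anthropic/claude-3-haiku-20240307
      api_key: os.environ/ANTHROPIC_API_KEY
```

With a config like this, `litellm --config config.yaml` loads both models when the proxy boots.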

✅ [LiteLLM UI] Show Error message for 10-20s (h/t Graham Neubig for this request)

😇 QA - Added Tests for /health endpoints on Proxy
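
For illustration, a minimal sketch of what tests against these endpoints might look like, using FastAPI's TestClient; the import path is an assumption about the repo layout, not necessarily the shape of the actual test suite:

```python
# test_health_endpoints.py - illustrative sketch, not the repo's actual tests.
from fastapi.testclient import TestClient

# Assumes the proxy's FastAPI app is importable as below
# (the import path is an assumption about the repo layout).
from litellm.proxy.proxy_server import app

client = TestClient(app)


def test_liveliness():
    # /health/liveliness should answer without touching any model backends.
    response = client.get("/health/liveliness")
    assert response.status_code == 200


def test_readiness():
    # /health/readiness reports whether the proxy finished startup.
    response = client.get("/health/readiness")
    assert response.status_code == 200
```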

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 92 | 97.29 | 1.50 | 0.0 | 450 | 0 | 86.18 | 926.68 |
| /health/liveliness | Passed ✅ | 78 | 80.18 | 15.30 | 0.0033 | 4580 | 1 | 74.22 | 1033.39 |
| /health/readiness | Passed ✅ | 78 | 80.95 | 15.34 | 0.0 | 4593 | 0 | 74.08 | 1307.74 |
| Aggregated | Passed ✅ | 78 | 81.35 | 32.15 | 0.0033 | 9623 | 1 | 74.08 | 1307.74 |
