github BerriAI/litellm v1.32.3


🚨 Nightly Build - We noticed testing was flaky on this release


LiteLLM v1.32.3 📈 Proxy 100+ LLMs in one format + Send logs to Datadog. Start here: https://docs.litellm.ai/docs/proxy/logging

👉 Admin UI: Bug Fix for viewing total spend on LiteLLM Proxy https://docs.litellm.ai/docs/proxy/ui

💵 New /global/spend endpoint -> returns total spend on the proxy and, if set, the total proxy budget
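
A minimal sketch of querying the new endpoint, assuming the proxy runs on the default port 4000 and `sk-1234` stands in for your admin key (both are placeholders, not values from this release):

```shell
# Query total proxy spend (and budget, if configured).
# localhost:4000 is LiteLLM's default proxy address; sk-1234 is a placeholder key.
curl -s http://localhost:4000/global/spend \
  -H "Authorization: Bearer sk-1234"
```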

📖 Docs fix - view Pre Call hooks https://docs.litellm.ai/docs/proxy/call_hooks

📖 Docs - Using LiteLLM Proxy + Datadog for sending LLM logs:
https://docs.litellm.ai/docs/proxy/logging#logging-proxy-inputoutput---datadog
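
A sketch of the proxy `config.yaml` change the linked Datadog docs describe — `success_callback` is LiteLLM's logging-callback convention; verify the exact key names against your version, and export `DD_API_KEY` (and `DD_SITE` if not using the default) in the proxy's environment:

```yaml
# config.yaml -- sketch based on the linked logging docs
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo

litellm_settings:
  # send logs for successful LLM calls to Datadog
  success_callback: ["datadog"]
```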

What's Changed


Full Changelog: v1.32.1...v1.32.3

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 87 | 95.22 | 1.48 | 0.0 | 442 | 0 | 81.41 | 1162.78 |
| /health/liveliness | Passed ✅ | 62 | 65.23 | 15.60 | 0.0067 | 4669 | 2 | 59.52 | 1170.44 |
| /health/readiness | Passed ✅ | 62 | 65.01 | 15.11 | 0.0 | 4523 | 0 | 59.88 | 1250.39 |
| Aggregated | Passed ✅ | 62 | 66.50 | 32.18 | 0.0067 | 9634 | 2 | 59.52 | 1250.39 |
