github BerriAI/litellm v1.41.8


🔥 Excited to launch support for Logging LLM I/O on 🔭 Galileo through LiteLLM (YC W23) Proxy https://docs.litellm.ai/docs/proxy/logging#logging-llm-io-to-galielo
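Per the linked docs, logging to Galileo is switched on through the proxy's callback settings. The fragment below is a minimal sketch only; the callback name and the environment variable names are assumptions taken from the docs page linked above — verify them there before use.

```yaml
# config.yaml for the LiteLLM proxy (sketch -- callback name assumed from the docs)
litellm_settings:
  success_callback: ["galileo"]

# Expected environment variables (assumed names, set before starting the proxy):
#   GALILEO_BASE_URL, GALILEO_PROJECT_ID, GALILEO_USERNAME, GALILEO_PASSWORD
```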

📈 [docs] New example Grafana Dashboards https://github.com/BerriAI/litellm/tree/main/cookbook/litellm_proxy_server/grafana_dashboard

🛡️ feat - control guardrails per api key https://docs.litellm.ai/docs/proxy/guardrails#switch-guardrails-onoff-per-api-key
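Per-key guardrail control works by attaching a guardrail toggle when a virtual key is generated. The sketch below only builds the request body; the `permissions` field name and the `prompt_injection` guardrail name are assumptions — check the linked docs for the exact schema your guardrail setup uses.

```python
# Sketch: build a /key/generate request body that switches a guardrail off
# for one API key. Field names are assumptions -- confirm against the docs.
import json

payload = {
    # turn the (hypothetical) "prompt_injection" guardrail off for this key
    "permissions": {"prompt_injection": False},
}
body = json.dumps(payload)
print(body)

# POST this body to the proxy with your master key, e.g.:
#   curl -X POST http://localhost:4000/key/generate \
#        -H "Authorization: Bearer sk-1234" -H "Content-Type: application/json" \
#        -d '{"permissions": {"prompt_injection": false}}'
```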

🛠️ fix - raise and report Anthropic streaming errors (thanks David Manouchehri)

✨ fix - add NVIDIA NIM parameter mapping based on the model passed



Full Changelog: v1.41.7...v1.41.8

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.8
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 148.49 | 6.38 | 0.0 | 1909 | 0 | 109.11 | 1689.41 |
| Aggregated | Passed ✅ | 120.0 | 148.49 | 6.38 | 0.0 | 1909 | 0 | 109.11 | 1689.41 |
