😇 LiteLLM v1.38.7 - New Activity Tab, Track LLM API Requests & Total Tokens 👉 Start here: https://github.com/BerriAI/litellm
🔥 [Fix] Set budget_duration on /team/new and /team/update
🔥 [Feat] Support for resetting team budgets on budget_reset_at https://docs.litellm.ai/docs/proxy/users
⚒️ [Feature]: Attach litellm exception in error string - ContentPolicyViolation, AuthenticationError
📧 [Docs] - Setting up email notifications https://docs.litellm.ai/docs/proxy/email
What's Changed
- [Feat] - Admin UI - New Activity Tab by @ishaan-jaff in #3836
- [Feat] Ui Enforce premium features on ui by @ishaan-jaff in #3840
- fix(proxy_server.py): fix model check for `/v1/models` and `/model/info` endpoints when team has restricted access by @krrishdholakia in #3839
- [Fix] Set budget_duration on `/team/new` and `/team/update` by @ishaan-jaff in #3842
- [Feat] Reset Team Budgets on `budget_reset_at` by @ishaan-jaff in #3843
- [Feature]: Attach litellm exception in error string by @ishaan-jaff in #3824
- docs- email notifs by @ishaan-jaff in #3845
Full Changelog: v1.38.5...v1.38.7
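As a quick illustration of the budget fixes above, here is a minimal sketch of the request body you would POST to the proxy's `/team/new` endpoint. The field names `max_budget` and `budget_duration` come from the LiteLLM proxy docs linked above; the team alias, budget amount, and proxy URL in the comments are hypothetical placeholders.

```python
import json

# Hypothetical /team/new payload: max_budget caps spend in USD,
# budget_duration ("30d" here) tells the proxy when to reset the
# budget via budget_reset_at (the reset behavior fixed in #3843).
payload = {
    "team_alias": "research-team",  # placeholder team name
    "max_budget": 50.0,             # placeholder USD cap
    "budget_duration": "30d",       # reset the budget every 30 days
}

body = json.dumps(payload)
# POST `body` to http://localhost:4000/team/new with an admin
# Authorization header (HTTP call omitted to keep the sketch offline).
print(body)
```

The same `budget_duration` field can be sent to `/team/update` to change the reset interval on an existing team.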
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.38.7
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 110.0 | 127.76 | 6.47 | 0.0 | 1934 | 0 | 97.92 | 1353.87 |
| Aggregated | Passed ✅ | 110.0 | 127.76 | 6.47 | 0.0 | 1934 | 0 | 97.92 | 1353.87 |