Excited to launch the ability for LiteLLM Gateway (Proxy) users to create virtual keys for 100+ LLMs and track their own usage. Start here: https://github.com/BerriAI/litellm/releases/tag/v1.44.19-stable
✨ [UI] Show when a virtual key expires, and surface expired virtual keys on the Admin UI
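Virtual keys are issued through the proxy's `/key/generate` endpoint. A minimal sketch, assuming the proxy runs locally on port 4000 with master key `sk-1234` (both placeholders), and that the `duration` and `models` parameters match your LiteLLM version:

```shell
# Create a virtual key that expires in 30 days, scoped to one model.
# sk-1234 is a placeholder master key; replace with your own.
curl -X POST 'http://localhost:4000/key/generate' \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{"duration": "30d", "models": ["gpt-3.5-turbo"]}'
```

The response contains the generated key along with its expiry, which the Admin UI now displays.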
What's Changed
- fix KeyError when calling the Deepseek API by @wolf-joe in #5530
- [UI] Show when a virtual key expires by @ishaan-jaff in #5541
- [Fix-Proxy] allow internal user and internal viewer to view usage by @ishaan-jaff in #5536
- LiteLLM Merged PR's by @krrishdholakia in #5538
- Update lago.py to accommodate API change (#5495) by @krrishdholakia in #5543
- LiteLLM Minor Fixes and Improvements by @krrishdholakia in #5537
- [Fix] transcription/atranscription file parameter should accept correct types by @ishaan-jaff in #5534
New Contributors
Full Changelog: v1.44.18...v1.44.19-stable
Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.44.19-stable
```
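Once the container is up, you can sanity-check the proxy. A minimal sketch, assuming the default port mapping above; the liveliness endpoint does not require a key, but verify the path against your deployed version:

```shell
# Basic liveness probe against the locally mapped proxy port
curl http://localhost:4000/health/liveliness
```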
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 84 | 98.93 | 6.44 | 0.0 | 1927 | 0 | 68.17 | 1581.87 |
| Aggregated | Passed ✅ | 84 | 98.93 | 6.44 | 0.0 | 1927 | 0 | 68.17 | 1581.87 |