## What's Changed
- fix(router.py): cooldown on 404 errors by @krrishdholakia in #3926
- [Feat] LiteLLM Proxy - use enums for user roles by @ishaan-jaff in #3927
- UI - View user role on admin ui by @ishaan-jaff in #3930
- [UI] edit user role admin UI by @ishaan-jaff in #3929
- fix: add missing seed parameter to ollama input #3923 by @devdev999 in #3924
- feat(main.py): support openai tts endpoint by @krrishdholakia in #3928 (usage sketch after this list)
- [Feat] UI - cleanup editing users by @ishaan-jaff in #3931
- [Feat - Admin UI] Show number of rate limit errors by deployment per day by @ishaan-jaff in #3932
## New Contributors
- @devdev999 made their first contribution in #3924
**Full Changelog**: v1.39.4...v1.39.5
## Docker Run LiteLLM Proxy
```
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.39.5
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 130.0 | 168.39 | 6.43 | 0.0 | 1923 | 0 | 109.15 | 1833.37 |
| Aggregated | Passed ✅ | 130.0 | 168.39 | 6.43 | 0.0 | 1923 | 0 | 109.15 | 1833.37 |