## What's Changed
- fix(team_endpoints.py): allow team member to view team info by @krrishdholakia in #8644
- build: build ui by @krrishdholakia in #8654
- (UI + Proxy) Cache Health Check Page - Cleanup/Improvements by @ishaan-jaff in #8665
- (Bug Fix Redis) - Fix running redis.mget operations with `None` keys (sketched below) by @ishaan-jaff in #8666
- (Bug fix) prometheus - safely set latency metrics by @ishaan-jaff in #8669
- extract `<think>..</think>` block for amazon deepseek r1 and put in `reasoning_content` (sketched below) by @krrishdholakia in #8664
- Add all `/key/generate` api params to UI + add metadata fields on team AND org add/update by @krrishdholakia in #8667
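The Redis fix above guards `redis.mget` against `None` entries in the key list. A minimal sketch of that kind of guard, not LiteLLM's actual cache code (the function name is illustrative):

```python
import redis.asyncio as redis


async def safe_mget(client: redis.Redis, keys: list) -> list:
    # redis-py rejects None keys, so drop them before calling mget.
    valid_keys = [k for k in keys if k is not None]
    if not valid_keys:
        return []
    return await client.mget(valid_keys)
```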
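The DeepSeek R1 change splits the model's raw text into a `<think>..</think>` reasoning block and the visible answer. A rough sketch of that extraction, with hypothetical function and variable names (only `reasoning_content` comes from the changelog entry):

```python
import re

THINK_BLOCK = re.compile(r"<think>(.*?)</think>", re.DOTALL)


def split_reasoning(raw_text: str) -> dict:
    # Pull the first <think>...</think> block out as reasoning_content
    # and return the remaining text as the normal content.
    match = THINK_BLOCK.search(raw_text)
    if not match:
        return {"content": raw_text, "reasoning_content": None}
    return {
        "content": THINK_BLOCK.sub("", raw_text, count=1).strip(),
        "reasoning_content": match.group(1).strip(),
    }
```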
Full Changelog: v1.61.9-nightly...v1.61.11-nightly
## Docker Run LiteLLM Proxy
```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.11-nightly
```
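Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000. A minimal sketch of a request against it, assuming a model has already been configured on the proxy (the model name and API key below are placeholders):

```python
import openai

# Point the standard OpenAI client at the local LiteLLM proxy.
client = openai.OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder: use a model configured on the proxy
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)
```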
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 120.0 | 146.33 | 6.46 | 6.46 | 1933 | 1933 | 97.36 | 4080.58 |
| Aggregated | Failed ❌ | 120.0 | 146.33 | 6.46 | 6.46 | 1933 | 1933 | 97.36 | 4080.58 |