## What's Changed
- Add comma-separated MCP server segregation support by @jugaldb in #12326
- Fix: Preserve Live Tail State on Log Pages by @NANDINI-star in #12335
- [Feat] JWT - Sync user roles and team memberships when JWT Auth is used by @ishaan-jaff in #11994
- Fix watsonx datetime conversion issue on Python 3.10 by @isaken in #12339
- Patch 1 by @isaken in #12338
- UI - Add Azure Content Safety Guardrails (improved UX) by @krrishdholakia in #12330
- UI - Azure Content Guardrails by @krrishdholakia in #12341
- feat(vertex_ai/): add new deepseek-ai api service by @krrishdholakia in #12312
- v1.74.0.rc docs by @ishaan-jaff in #12344
- [Docs] vertex deepseek by @ishaan-jaff in #12345
- docs - 1.74.0.rc by @ishaan-jaff in #12347
- [UI QA] 1.74.0.rc by @ishaan-jaff in #12348
- fix: add proper type annotations for embedding() function by @colesmcintosh in #12262
- Remove stream options from streaming + fix guardrail start time on log duration by @krrishdholakia in #12346
- Add all guardrails to the UI by @krrishdholakia in #12349
- New `/key/service-account/generate` API endpoint + team member permissions for creating service account keys by @krrishdholakia in #12350
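The new service-account key endpoint can presumably be called like the existing `/key/generate` route. A minimal sketch, assuming a proxy on `localhost:4000`; the `team_id` payload field and the auth header are assumptions modeled on the other key-generation endpoints, so verify against the docs before use:

```shell
# Hypothetical call shape for the new endpoint; the payload fields are
# guesses based on /key/generate and may not match the actual API.
BASE_URL="http://localhost:4000"
PAYLOAD='{"team_id": "my-team"}'

# Printed rather than executed, so the shape is visible without a running proxy:
echo curl -sS "$BASE_URL/key/service-account/generate" \
  -H "Authorization: Bearer \$LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD"
```

Replace the echoed command with a real `curl` invocation (and a valid admin key) once the proxy is running.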
## New Contributors
**Full Changelog**: v1.74.0-nightly...v1.74.0.rc
## Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.74.0.rc
```
Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 280.87 | 6.13 | 0.0 | 1833 | 0 | 218.11 | 1918.5 |
| Aggregated | Passed ✅ | 250.0 | 280.87 | 6.13 | 0.0 | 1833 | 0 | 218.11 | 1918.5 |