What's Changed
- Add simple OpenTelemetry tracer by @yujonglee in #3974
- [FEAT] Add native OTEL logging to LiteLLM by @ishaan-jaff in #4010
- [Docs] Use OTEL logging on LiteLLM Proxy by @ishaan-jaff in #4011 (a config sketch follows the Docker command below)
- fix(bedrock): raise nested error response by @pharindoko in #3989
- [Feat] Admin UI - Add, Edit all LiteLLM callbacks on UI by @ishaan-jaff in #4014
- feat(assistants/main.py): add assistants api streaming support by @krrishdholakia in #4012
- feat(utils.py): Support `stream_options` param across all providers by @krrishdholakia in #4015 (see the usage sketch below this list)
- fix(utils.py): fix cost calculation for openai-compatible streaming object by @krrishdholakia in #4009
- [Fix] Admin UI Internal Users by @ishaan-jaff in #4016
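A minimal sketch of the new `stream_options` usage, assuming the OpenAI-style `include_usage` field (the model name and message are placeholders; check the LiteLLM docs for the fields each provider supports):

```python
import litellm

# Stream a completion and request a usage block with the stream.
# stream_options follows the OpenAI convention; include_usage is an
# assumption based on that convention, not confirmed by these notes.
response = litellm.completion(
    model="gpt-3.5-turbo",  # any LiteLLM-supported provider/model
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
    stream_options={"include_usage": True},
)

for chunk in response:
    print(chunk)  # the final chunk should carry token usage when supported
```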
Full Changelog: v1.40.1...v1.40.1.dev4
Docker Run LiteLLM Proxy
```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.40.1.dev4
```
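To wire the new OTEL logging (#4010, #4011) into the proxy, a minimal `config.yaml` sketch; the `otel` callback name and the `OTEL_EXPORTER`/`OTEL_ENDPOINT` environment variables are assumptions drawn from OTEL conventions, so verify them against the LiteLLM Proxy docs:

```yaml
# config.yaml -- sketch only; callback name and env vars are assumptions
litellm_settings:
  callbacks: ["otel"]  # enable the native OpenTelemetry logger

# Point the exporter at your collector via environment variables, e.g.:
#   OTEL_EXPORTER="otlp_http"
#   OTEL_ENDPOINT="http://0.0.0.0:4317"
```

Mount the file into the container (e.g. `-v $(pwd)/config.yaml:/app/config.yaml`) and start the proxy with `--config /app/config.yaml`.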
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 110.0 | 130.498 | 6.432 | 0.0 | 1925 | 0 | 92.762 | 2155.112 |
| Aggregated | Passed ✅ | 110.0 | 130.498 | 6.432 | 0.0 | 1925 | 0 | 92.762 | 2155.112 |