## What's Changed
- feat(langsmith.py): support sampling langsmith traces by @krrishdholakia in #5577
- fix missing class object instantiation in custom_llm_server provider documentation's quick start by @pradhyumna85 in #5578
- litellm-helm: fix missing resource definitions in initContainer and missing DBname value for envVars in deployment.yaml by @Pit-Storm in #5562
- [Feat] Allow setting up Redis Cluster using .env vars by @ishaan-jaff in #5579 (see the .env sketch after this list)
- [Feat] Slack Alerting - Allow setting custom spend report frequency by @ishaan-jaff in #5581
- [Feat UI] allow setting input / output cost per M tokens by @ishaan-jaff in #5582
- [Docs] - Add Lifecycle of a request through LiteLLM Gateway by @ishaan-jaff in #5585
- Feat - Proxy add /key/list endpoint by @ishaan-jaff in #5586 (see the curl sketch after this list)
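Two quick sketches for the proxy features above. First, the Redis Cluster setup via .env vars (#5579): a minimal `.env` sketch, assuming the `REDIS_CLUSTER_NODES` variable name and the JSON node list shown below; confirm the exact name and format against the LiteLLM caching docs.

```shell
# .env sketch for Redis Cluster (assumed variable name and JSON shape; verify in the LiteLLM docs)
REDIS_CLUSTER_NODES='[{"host": "redis-node-1", "port": "7001"}, {"host": "redis-node-2", "port": "7002"}]'
```

Second, the new `/key/list` endpoint (#5586): a minimal sketch of listing keys on a locally running proxy. The host, port, and master key are placeholders, and the HTTP method and absence of query parameters are assumptions rather than a definitive spec.

```shell
# List keys on the proxy (localhost:4000 and sk-1234 are placeholders for your deployment)
curl -X GET 'http://localhost:4000/key/list' \
  -H 'Authorization: Bearer sk-1234'
```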
## New Contributors
- @pradhyumna85 made their first contribution in #5578
- @Pit-Storm made their first contribution in #5562
Full Changelog: v1.44.21...v1.44.22-stable
## Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.44.22-stable
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 170.76 | 6.33 | 0.0 | 1892 | 0 | 112.68 | 5122.23 |
| Aggregated | Passed ✅ | 140.0 | 170.76 | 6.33 | 0.0 | 1892 | 0 | 112.68 | 5122.23 |