## What's Changed
- [Fix - Proxy] Raise `type=ProxyErrorTypes.budget_exceeded` on exceeded-budget errors by @ishaan-jaff in #4606 (see the error-shape sketch after this list)
- feat(httpx): Send litellm user-agent version upstream by @Manouchehri in #4591
- fix(utils.py): change update to upsert by @andresrguzman in #4610
- [Proxy-Fix]: Add /assistants, /threads as OpenAI routes by @ishaan-jaff in #4611
- UI fixes - Send custom llm provider when adding a new model by @ishaan-jaff in #4609
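For the budget error change in #4606, here is a rough sketch of what a client might now see. The payload shape (an OpenAI-style `error` object whose `type` field is `budget_exceeded`) is an assumption inferred from the PR title, not something spelled out in these notes; the key, model, and message text are placeholders.

```shell
# Hypothetical: calling the proxy with a key whose budget is already spent.
# Assumption: the proxy returns an OpenAI-style error object whose `type`
# field is now "budget_exceeded" (per #4606). Key/model are placeholders.
curl -s http://localhost:4000/chat/completions \
  -H "Authorization: Bearer sk-key-over-budget" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hi"}]}'
# => {"error": {"message": "Budget has been exceeded! ...", "type": "budget_exceeded", ...}}
```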
## New Contributors
- @andresrguzman made their first contribution in #4610
Full Changelog: v1.41.12...v1.41.13
## Docker Run LiteLLM Proxy
```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.41.13
```
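Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000. A minimal smoke test might look like the sketch below; the master key `sk-1234` and the model name are placeholders for whatever you have configured, not values from this release.

```shell
# Minimal sketch of a test request against the proxy started above.
# Assumption: a model is configured and sk-1234 stands in for your master key.
curl -s http://localhost:4000/chat/completions \
  -H "Authorization: Bearer sk-1234" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "ping"}]}'
```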
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 140.0 | 155.6 | 6.41 | 0.0 | 1920 | 0 | 116.54 | 1700.94 |
| Aggregated | Passed ✅ | 140.0 | 155.6 | 6.41 | 0.0 | 1920 | 0 | 116.54 | 1700.94 |