What's Changed
- Added a guide for using LiteLLM with the AI/ML API by @waterstark in #7058
- Added compatibility guidance and related notes for the xAI Grok model by @zhaohan-dong in #8282
- (Security fix) - remove code block that inserts master key hash into DB by @ishaan-jaff in #8268
- (UI) - Add Assembly AI provider to UI by @ishaan-jaff in #8297
- (feat) - Add Assembly AI to model cost map by @ishaan-jaff in #8298
- Fixed issues #8126 and #8127 (#8275) by @ishaan-jaff in #8299
- (Refactor) - migrate bedrock invoke to `BaseLLMHTTPHandler` class by @ishaan-jaff in #8290
New Contributors
- @waterstark made their first contribution in #7058
Full Changelog: v1.60.4...v1.60.5
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.60.5
```
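With `STORE_MODEL_IN_DB=True`, models added through the UI are persisted in the database. Models can also be defined statically in a `config.yaml` mounted into the container. A minimal sketch (the model names and the `config.yaml` path are illustrative placeholders, not part of this release):

```yaml
# config.yaml - example LiteLLM proxy model list (illustrative)
model_list:
  - model_name: gpt-4o                  # alias clients request
    litellm_params:
      model: openai/gpt-4o              # provider/model routed to
      api_key: os.environ/OPENAI_API_KEY
```

Mount it and point the proxy at it, e.g. `docker run -v $(pwd)/config.yaml:/app/config.yaml ... --config /app/config.yaml`.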
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
---|---|---|---|---|---|---|---|---|---|
/chat/completions | Passed ✅ | 210.0 | 251.44 | 6.19 | 0.0 | 1854 | 0 | 167.35 | 4496.06 |
Aggregated | Passed ✅ | 210.0 | 251.44 | 6.19 | 0.0 | 1854 | 0 | 167.35 | 4496.06 |