What's Changed
- fix(get_litellm_params.py): handle no-log being passed in via kwargs by @krrishdholakia in #8830
- fix(o_series_transformation.py): fix optional param check for o-serie… by @krrishdholakia in #8787
- chore: set ttlSecondsAfterFinished on the migration job in the litellm-helm chart by @ashwin153 in #8593
- Litellm dev bedrock anthropic 3 7 v2 by @krrishdholakia in #8843
- Mark Claude Haiku 3.5 as vision-capable by @minhduc0711 in #8840
- feat: enhance migrations job with additional configurable properties by @mknet3 in #8636
- (UI + Backend) Fix Adding Azure, Azure AI Studio models on LiteLLM by @ishaan-jaff in #8856
- fix caching on main branch by @krrishdholakia in #8858
- [Bug]: Deepseek error on proxy after upgrading to 1.61.13-stable by @ishaan-jaff in #8860
New Contributors
- @ashwin153 made their first contribution in #8593
- @mknet3 made their first contribution in #8636
Full Changelog: v1.61.17-nightly...v1.61.19-nightly
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.61.19-nightly
```
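Once the container is up, you can send an OpenAI-style request to the proxy's `/chat/completions` endpoint. A minimal sketch — the model name `gpt-3.5-turbo` and the key `sk-1234` are placeholders, not part of this release; use a model configured on your proxy, and the Authorization header only matters if you've set a master key:

```shell
# Test request against the locally running proxy.
# "gpt-3.5-turbo" is a placeholder model name; "sk-1234" is a placeholder key,
# needed only if LITELLM_MASTER_KEY is set on the proxy.
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```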
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 190.0 | 197.97 | 6.19 | 0.0033 | 1851 | 1 | 53.49 | 967.14 |
| Aggregated | Passed ✅ | 190.0 | 197.97 | 6.19 | 0.0033 | 1851 | 1 | 53.49 | 967.14 |