What's Changed
- fix(vertex_httpx.py): support tool calling w/ streaming for vertex ai + gemini by @krrishdholakia in #4579
- fix(router.py): fix setting httpx mounts by @krrishdholakia in #4434
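To illustrate what the tool-calling-with-streaming fix above enables, here is a minimal sketch of calling a Vertex AI Gemini model through `litellm.completion` with `tools` and `stream=True`. The tool definition, model name, and credential setup are illustrative placeholders, not taken from the PR:

```python
# Minimal sketch: tool calling + streaming against Vertex AI Gemini via litellm.
# Assumes Vertex AI credentials are already configured
# (e.g. GOOGLE_APPLICATION_CREDENTIALS); model name and tool are placeholders.
import litellm

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",  # hypothetical tool for illustration
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"},
                },
                "required": ["location"],
            },
        },
    }
]

response = litellm.completion(
    model="vertex_ai/gemini-1.5-pro",  # placeholder Gemini model on Vertex AI
    messages=[{"role": "user", "content": "What's the weather in Boston?"}],
    tools=tools,
    stream=True,
)

# Tool-call fragments arrive incrementally on choices[0].delta.tool_calls
for chunk in response:
    delta = chunk.choices[0].delta
    if delta and getattr(delta, "tool_calls", None):
        print(delta.tool_calls)
```

The httpx mounts fix concerns how the router configures proxy routing on its HTTP clients. As a generic illustration of the httpx `mounts` mechanism itself (not LiteLLM's internal code), with a placeholder proxy URL:

```python
# Generic sketch of httpx "mounts": route outbound traffic through a proxy
# transport per URL scheme. The proxy URL below is a placeholder.
import httpx

mounts = {
    "http://": httpx.HTTPTransport(proxy="http://localhost:8030"),
    "https://": httpx.HTTPTransport(proxy="http://localhost:8030"),
}
client = httpx.Client(mounts=mounts)
```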
Full Changelog: v1.41.8.dev2...v1.41.11.dev1
Docker Run LiteLLM Proxy
```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.41.11.dev1
```
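Once the container is up, the proxy exposes an OpenAI-compatible `/chat/completions` endpoint on port 4000. A minimal sketch of querying it with the OpenAI Python SDK; the API key and model name are placeholders for whatever you configure on the proxy:

```python
# Minimal sketch: call the LiteLLM proxy started above via the OpenAI SDK.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")  # placeholder key

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder: any model configured on the proxy
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy"}],
)
print(response.choices[0].message.content)
```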
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 77 | 88.81311274450754 | 6.537049007999015 | 0.0 | 1957 | 0 | 67.48113100002229 | 1314.0082060000395 |
| Aggregated | Passed ✅ | 77 | 88.81311274450754 | 6.537049007999015 | 0.0 | 1957 | 0 | 67.48113100002229 | 1314.0082060000395 |