What's Changed
- (fix) get_response_headers for Azure OpenAI by @ishaan-jaff in #6344
- fix(litellm-helm): correctly use dbReadyImage and dbReadyTag values by @Hexoplon in #6336
- fix(proxy_server.py): add 'admin' user to db by @krrishdholakia in #6223
- refactor(redis_cache.py): use a default cache value when writing to r… by @krrishdholakia in #6358
- LiteLLM Minor Fixes & Improvements (10/21/2024) by @krrishdholakia in #6352
Full Changelog: v1.50.1...v1.50.1.dev1
Docker Run LiteLLM Proxy
```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.50.1.dev1
```
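Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000 (the `/chat/completions` route in the load test below). A minimal sketch of calling it with the OpenAI Python SDK, assuming the proxy is reachable at `http://localhost:4000`; the API key and model name (`sk-1234`, `gpt-4o`) are placeholders that depend on your own proxy configuration:

```python
# Minimal sketch: send a chat completion through the proxy started above.
# Assumes the proxy listens on localhost:4000; "sk-1234" and "gpt-4o" are
# placeholders for your virtual key and a model alias from your proxy config.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # LiteLLM proxy endpoint
    api_key="sk-1234",                 # placeholder virtual key
)

response = client.chat.completions.create(
    model="gpt-4o",  # hypothetical model alias configured on the proxy
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)
```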
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
---|---|---|---|---|---|---|---|---|---|
/chat/completions | Passed ✅ | 200.0 | 220.3880747854055 | 6.181213384368117 | 0.0 | 1850 | 0 | 179.4118180000055 | 2854.2284040000254 |
Aggregated | Passed ✅ | 200.0 | 220.3880747854055 | 6.181213384368117 | 0.0 | 1850 | 0 | 179.4118180000055 | 2854.2284040000254 |