What's Changed
- Add Llama 3.1 for Bedrock by @Manouchehri in #4848
- (test_embedding.py) - Re-enable embedding test with Azure OIDC. by @Manouchehri in #4857
- [Feat] - Support Logging tags on langsmith by @ishaan-jaff in #4853
- [Fix-litellm python] Raise correct error for UnsupportedParams Error by @ishaan-jaff in #4862
- Doc example using LiteLLM proxy with Groq by @ishaan-jaff in #4864
- feat: add support for friendliai dedicated endpoint by @pocca2048 in #4638
- [Feat] Add Groq/llama3.1 by @ishaan-jaff in #4871
Full Changelog: v1.42.0...v1.42.1
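Several items above add new model routes (Llama 3.1 on Bedrock, Groq/llama3.1). A minimal sketch of calling one of the new Groq Llama 3.1 routes through the litellm Python SDK; the model name `groq/llama-3.1-8b-instant` is illustrative, so check the LiteLLM model docs for the exact route:

```python
import os
import litellm

# Assumes a Groq API key is available; replace the placeholder with your own.
os.environ.setdefault("GROQ_API_KEY", "gsk-...")

# Model name below is an assumption for illustration, not confirmed by this release note.
response = litellm.completion(
    model="groq/llama-3.1-8b-instant",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```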
Docker Run LiteLLM Proxy
```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.42.1
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
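Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal sketch of sending a request through it with the `openai` Python client; the base URL, API key, and model name below are placeholders and depend on how the proxy is configured:

```python
from openai import OpenAI

# Placeholder base URL, key, and model name -- adjust to your proxy config.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```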
Load Test LiteLLM Proxy Results
Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
---|---|---|---|---|---|---|---|---|---|
/chat/completions | Passed ✅ | 110.0 | 131.39329430193612 | 6.3862073107191915 | 0.0 | 1911 | 0 | 96.32532500006619 | 1137.7997399999913 |
Aggregated | Passed ✅ | 110.0 | 131.39329430193612 | 6.3862073107191915 | 0.0 | 1911 | 0 | 96.32532500006619 | 1137.7997399999913 |