What's Changed
- Upgrade Traceloop to version 0.18.2 by @elisalimli in #3727
- usage-based-routing-ttl-on-cache by @sumanth13131 in #3412
- Revert "Revert "Logfire Integration"" by @elisalimli in #3756
- docs - add bedrock meta llama3 by @ishaan-jaff in #3763
- [Cohere] Add request source to request by @BeatrixCohere in #3759
- [Fix] Bump OpenAI version on Litellm PIP package [OpenAI>=1.27.0] by @ishaan-jaff in #3765
- Support anthropic 'tool_choice' param by @krrishdholakia in #3771
- [Feat] Proxy - Create Keys that can only access /spend routes on Admin UI by @ishaan-jaff in #3772
- feat(lowest_latency.py): route by time to first token, for streaming requests (if available) by @krrishdholakia in #3768
- feat(router.py): filter out deployments which don't support request params w/ 'pre_call_checks=True' by @krrishdholakia in #3770
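The 'pre_call_checks=True' entry above filters out deployments that can't handle a request's parameters before routing. As a minimal illustrative sketch of that idea (the names and data shapes below are hypothetical, not litellm's actual internals):

```python
# Hypothetical sketch: drop deployments whose supported params don't cover
# the params present in the request. Illustrative only -- litellm's real
# pre-call checks inspect model capabilities, context windows, etc.

def filter_deployments(deployments, request_params):
    """Keep only deployments whose supported params cover the request's params."""
    compatible = []
    for d in deployments:
        if set(request_params).issubset(d["supported_params"]):
            compatible.append(d)
    return compatible

deployments = [
    {"model": "model-a", "supported_params": {"temperature", "tools", "tool_choice"}},
    {"model": "model-b", "supported_params": {"temperature"}},
]

# A request using 'tool_choice' should route only to model-a.
print(filter_deployments(deployments, {"tool_choice", "temperature"}))
```

A request that uses a parameter like 'tool_choice' is then only ever sent to deployments that advertise support for it, instead of failing at call time.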
New Contributors
- @BeatrixCohere made their first contribution in #3759
Full Changelog: v1.37.19...v1.37.20
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.37.20
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat