BerriAI/litellm v1.37.3


BETA support for Triton Inference Server embeddings on LiteLLM 👉 Start here: https://docs.litellm.ai/docs/providers/triton-inference-server
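
A minimal sketch of calling a Triton-hosted embedding model through litellm; the model name and api_base below are placeholders for your own deployment, and the exact endpoint path depends on how your Triton server is set up (see the docs linked above):

import litellm

# Placeholder model name and endpoint; point api_base at your
# Triton Inference Server's embedding route.
response = litellm.embedding(
    model="triton/my-embedding-model",
    api_base="http://localhost:8000/triton/embeddings",
    input=["hello from litellm"],
)
print(response.data[0]["embedding"][:5])  # first few embedding values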

⚡️ [Feat] Use team-based callbacks for failure_callbacks: https://docs.litellm.ai/docs/proxy/team_based_routing#logging--caching
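
As a sketch only, registering a failure callback for a team might look like the call below; the endpoint path, payload fields, team id, and credentials are assumptions based on the team-based logging docs linked above, not verbatim from this release:

import requests

# Assumed endpoint and payload shape; replace the team id, proxy master
# key, and Langfuse credentials with your own values.
resp = requests.post(
    "http://localhost:4000/team/my-team-id/callback",
    headers={"Authorization": "Bearer sk-1234"},
    json={
        "callback_name": "langfuse",
        "callback_type": "failure",
        "callback_vars": {
            "langfuse_public_key": "pk-...",
            "langfuse_secret_key": "sk-...",
        },
    },
)
resp.raise_for_status()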

🛠️ [Test] Added a test to ensure the Proxy still uses the same cached OpenAI client after 1 minute, rather than re-creating it per request
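
The caching contract being tested, sketched with a hypothetical cache and helper (litellm's internal cache keys and client-construction code are not shown here):

import time

# Hypothetical sketch: a client created for a deployment should be served
# from cache on later requests, not rebuilt. A sentinel object stands in
# for a real openai.OpenAI client.
_client_cache = {}

def get_openai_client(deployment_id: str):
    if deployment_id not in _client_cache:
        _client_cache[deployment_id] = object()
    return _client_cache[deployment_id]

first = get_openai_client("azure-gpt-4")
time.sleep(1)  # stand-in for the 1-minute window in the real test
assert get_openai_client("azure-gpt-4") is first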

🛠️ [Fix] Fixed a deployment upsert bug on the LiteLLM Proxy

🔥 Improved LiteLLM-stable load tests: added testing for Azure OpenAI and for proxy servers running 50+ deployments

🚀 [Feat] Support stream_options on litellm.text_completion
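
A short sketch of the new parameter, assuming it follows the OpenAI stream_options shape ({"include_usage": True}); the model name is a placeholder:

import litellm

# include_usage asks for a final chunk carrying token usage once the
# stream completes, matching the OpenAI stream_options convention.
response = litellm.text_completion(
    model="gpt-3.5-turbo-instruct",
    prompt="Say hello",
    stream=True,
    stream_options={"include_usage": True},
)
for chunk in response:
    print(chunk)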


What's Changed

Full Changelog: v1.37.2...v1.37.3

Docker Run LiteLLM Proxy

docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.37.3

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
