BerriAI/litellm v1.44.9

Launching support for using the LiteLLM LLM Gateway with OAuth2 proxy authentication

🛠️ Security fix - clients can no longer set `api_base` or `base_url` on requests to the LiteLLM Proxy
🔥 Use ssml input for Vertex Text to Speech APIs: https://docs.litellm.ai/docs/providers/vertex#usage---ssml-as-input
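A minimal sketch of sending SSML to Vertex Text-to-Speech through LiteLLM. The model name, voice parameters, and the `litellm.speech` call shape are assumptions taken on faith from the linked docs, so the network call is guarded behind a credentials check; only the SSML construction runs unconditionally:

```python
import os

# Build an SSML document for Vertex AI Text-to-Speech.
# <speak> is the required SSML root; <break> and <say-as> are standard tags.
ssml = (
    "<speak>"
    'Your order ships on <say-as interpret-as="date" format="ymd">2024-09-01</say-as>.'
    '<break time="500ms"/>Thank you!'
    "</speak>"
)

def is_ssml(text: str) -> bool:
    """Heuristic used in this sketch: treat input wrapped in a <speak>
    root element as SSML rather than plain text."""
    stripped = text.strip()
    return stripped.startswith("<speak>") and stripped.endswith("</speak>")

assert is_ssml(ssml)

# Hypothetical invocation (parameters per the LiteLLM Vertex TTS docs);
# guarded so the sketch runs without Google Cloud credentials.
if os.environ.get("GOOGLE_APPLICATION_CREDENTIALS"):
    import litellm
    audio = litellm.speech(
        model="vertex_ai/test",  # placeholder model name
        input=ssml,              # SSML goes in the regular input field
        voice={"languageCode": "en-US", "name": "en-US-Studio-O"},
    )
    audio.stream_to_file("speech.mp3")
```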

⚡️ [UI] Allow viewing / editing budget duration

🛠️ [Minor Proxy fix] - Prometheus: use safe updates for start / end time
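The `api_base`/`base_url` security fix above can be illustrated with a sketch. This is not LiteLLM's actual implementation, just the general idea: the proxy strips client-supplied routing overrides so a request body cannot redirect the upstream call to an attacker-controlled endpoint:

```python
# Illustrative sketch (not LiteLLM's code): server-side sanitization of
# client request bodies before they reach the completion router.
BLOCKED_CLIENT_PARAMS = {"api_base", "base_url"}

def sanitize_request(body: dict) -> dict:
    """Return a copy of the request body with blocked params removed,
    so clients cannot override where the proxy sends the request."""
    return {k: v for k, v in body.items() if k not in BLOCKED_CLIENT_PARAMS}

request = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "hi"}],
    "api_base": "https://attacker.example.com",  # would hijack the upstream call
}
clean = sanitize_request(request)
assert "api_base" not in clean
```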

What's Changed

New Contributors

Full Changelog: v1.44.8...v1.44.9

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.44.9
```
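With `STORE_MODEL_IN_DB=True`, models can be added at runtime via the UI or API. A config file can be mounted instead; a minimal sketch (the model alias and key reference are illustrative, see the LiteLLM proxy docs for the full schema):

```yaml
# config.yaml - minimal LiteLLM proxy configuration (illustrative)
model_list:
  - model_name: gpt-4o                    # alias clients request
    litellm_params:
      model: openai/gpt-4o                # provider/model to route to
      api_key: os.environ/OPENAI_API_KEY  # read the key from the environment
```

Mount it into the container and point the proxy at it, e.g. `-v $(pwd)/config.yaml:/app/config.yaml` plus `--config /app/config.yaml` on the run command.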

Don't want to maintain your own internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 159.57 | 6.41 | 0.0 | 1918 | 0 | 110.58 | 1685.10 |
| Aggregated | Passed ✅ | 140.0 | 159.57 | 6.41 | 0.0 | 1918 | 0 | 110.58 | 1685.10 |
