github BerriAI/litellm v1.38.7


😇 LiteLLM v1.38.7 - New Activity Tab, Track LLM API Requests & Total Tokens 👉 Start here: https://github.com/BerriAI/litellm

🔥 [Fix] Set budget_duration on /team/new and /team/update
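As a minimal sketch of what a `/team/new` request body might look like with this fix, assuming a standard LiteLLM proxy deployment (the field values below are illustrative, not from the release):

```python
import json

# Hypothetical payload for POST /team/new on a LiteLLM proxy.
# Field names follow the LiteLLM docs; the values are illustrative.
payload = {
    "team_alias": "eng-team",   # illustrative team name
    "max_budget": 50.0,         # spend cap for the team (USD)
    "budget_duration": "30d",   # budget window; reset every 30 days
}

body = json.dumps(payload)
print(body)
```

The same `budget_duration` field can now also be sent on `/team/update` to change an existing team's budget window.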

🔥 [Feat] Support for resetting team budgets via budget_reset_at https://docs.litellm.ai/docs/proxy/users
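One way to picture the reset logic: when a team's budget window elapses, the next `budget_reset_at` timestamp can be computed as now plus `budget_duration`. This is a sketch under that assumption, with a hypothetical `parse_duration` helper (not LiteLLM's internal parser):

```python
from datetime import datetime, timedelta, timezone

def parse_duration(duration: str) -> timedelta:
    """Parse strings like '30s', '15m', '12h', '30d' into a timedelta.
    Illustrative helper only; LiteLLM's own parsing may differ."""
    units = {"s": "seconds", "m": "minutes", "h": "hours", "d": "days"}
    value, unit = int(duration[:-1]), duration[-1]
    return timedelta(**{units[unit]: value})

# Assumed behavior: schedule the next reset at now + budget_duration.
now = datetime.now(timezone.utc)
budget_reset_at = now + parse_duration("30d")
print(budget_reset_at.isoformat())
```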

⚒️ [Feat] Attach the litellm exception name to the error string - e.g. ContentPolicyViolation, AuthenticationError
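The effect of this change can be sketched as prefixing the exception class name onto the error message, so callers can distinguish, say, a content-policy violation from an authentication failure by inspecting the string. The stand-in exception class and `format_error` helper below are assumptions for illustration, not LiteLLM's actual code:

```python
class ContentPolicyViolationError(Exception):
    """Stand-in for litellm.exceptions.ContentPolicyViolationError."""

def format_error(exc: Exception) -> str:
    # Attach the exception class name so the error string identifies
    # the failure type (ContentPolicyViolation vs AuthenticationError).
    return f"litellm.{type(exc).__name__}: {exc}"

msg = format_error(ContentPolicyViolationError("flagged by provider"))
print(msg)
```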

📧 [Docs] Setting up email notifications https://docs.litellm.ai/docs/proxy/email


What's Changed

Full Changelog: v1.38.5...v1.38.7

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.7

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 110.0 | 127.76 | 6.47 | 0.0 | 1934 | 0 | 97.92 | 1353.87 |
| Aggregated | Passed ✅ | 110.0 | 127.76 | 6.47 | 0.0 | 1934 | 0 | 97.92 | 1353.87 |
