github BerriAI/litellm v1.39.6


We're launching team member invites (no SSO required) in v1.39.6 🔥 Invite team members to view LLM usage and spend per service: https://docs.litellm.ai/docs/proxy/ui

👍 [Fix] Cache Vertex AI clients - major perf improvement for Vertex AI models

✨ Feat - Send new users invite emails on creation (using 'send_invite_email' on /user/new)

💻 UI - allow users to sign in with email/password

🔓 [UI] Admin UI Invite Links for non SSO

✨ PR - [Feat] Perf improvements - litellm.completion / litellm.acompletion - cache OpenAI client
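The two client-caching items above share one idea: constructing an SDK client (TLS session, auth setup) per request is expensive, so a client is built once per credential set and reused. A minimal sketch of that general technique — the names here (`get_client`, `FakeClient`) are illustrative, not LiteLLM internals:

```python
from functools import lru_cache


class FakeClient:
    """Stand-in for an SDK client that is expensive to construct."""

    constructions = 0  # count how many times __init__ runs

    def __init__(self, api_key: str, base_url: str):
        FakeClient.constructions += 1
        self.api_key = api_key
        self.base_url = base_url


@lru_cache(maxsize=128)
def get_client(api_key: str, base_url: str) -> FakeClient:
    # Cache key is the (api_key, base_url) argument tuple:
    # identical credentials return the same client instance.
    return FakeClient(api_key, base_url)


a = get_client("sk-test", "https://api.example.com")
b = get_client("sk-test", "https://api.example.com")
assert a is b                          # same cached instance
assert FakeClient.constructions == 1   # constructed only once
```

The same effect can be had with a plain dict keyed on the credential tuple; `lru_cache` just adds bounded size and eviction for free.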

What's Changed

New Contributors

Full Changelog: v1.39.5...v1.39.6

Docker Run LiteLLM Proxy

```shell
docker run \
    -e STORE_MODEL_IN_DB=True \
    -p 4000:4000 \
    ghcr.io/berriai/litellm:main-v1.39.6
```
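With the proxy running, the new `send_invite_email` flag on `/user/new` (from the feature list above) could be exercised roughly like this. Only `send_invite_email` and the `/user/new` route come from these release notes; the other field names and the auth header value are assumptions for illustration:

```python
import json

# Hypothetical request body for: POST http://localhost:4000/user/new
# (sent with an "Authorization: Bearer <master key>" header).
payload = {
    "user_email": "new.teammate@example.com",  # assumed field name
    "send_invite_email": True,                 # flag from this release
}

body = json.dumps(payload)
print(body)
```

The JSON body would then be posted to the proxy with any HTTP client; see https://docs.litellm.ai/docs/proxy/ui for the documented flow.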

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 78 | 90.38 | 6.55 | 0.0 | 1958 | 0 | 65.34 | 961.40 |
| Aggregated | Passed ✅ | 78 | 90.38 | 6.55 | 0.0 | 1958 | 0 | 65.34 | 961.40 |
