github BerriAI/litellm v1.59.8

latest releases: v1.59.9, v1.59.8-dev1
2 days ago

What's Changed

  • refactor: cleanup dead codeblock by @krrishdholakia in #7936
  • add type annotation for litellm.api_base (#7980) by @krrishdholakia in #7994
  • (QA / testing) - Add unit testing for key model access checks by @ishaan-jaff in #7999
  • (Prometheus) - emit key budget metrics on startup by @ishaan-jaff in #8002
  • (Feat) set guardrails per team by @ishaan-jaff in #7993
  • Supported nested json schema on anthropic calls via proxy + fix langfuse sync sdk issues by @krrishdholakia in #8003
  • Bug fix - [Bug]: editing a key (originally created for a user with no team) to attach it to a team, while the user still isn't a member of that team, caused an unexpected error when the key was used by @ishaan-jaff in #8008
  • (QA / testing) - Add e2e tests for key model access auth checks by @ishaan-jaff in #8000
  • (Fix) langfuse - setting LANGFUSE_FLUSH_INTERVAL by @ishaan-jaff in #8007
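The nested JSON schema item (#8003) concerns OpenAI-style structured-output requests routed to Anthropic models through the proxy. A minimal sketch of such a request body, assuming the OpenAI `response_format`/`json_schema` convention and a placeholder model alias (both are assumptions, not taken from this changelog):

```python
import json

# Hypothetical body for POST /chat/completions on the proxy.
# The json_schema here contains a *nested* object ("buyer"),
# the shape that #8003 reports now works on Anthropic calls.
payload = {
    "model": "claude-3-5-sonnet",  # placeholder: your proxy's model alias
    "messages": [{"role": "user", "content": "Extract the invoice data."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "invoice",
            "schema": {
                "type": "object",
                "properties": {
                    "total": {"type": "number"},
                    "buyer": {  # nested object inside the schema
                        "type": "object",
                        "properties": {"name": {"type": "string"}},
                    },
                },
            },
        },
    },
}
# Round-trip through JSON to confirm the body serializes cleanly.
body = json.dumps(payload)
```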

Full Changelog: v1.59.7...v1.59.8

Docker Run LiteLLM Proxy

docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.59.8
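Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal sketch of building a request against it, assuming a placeholder virtual key (`sk-1234`) and model alias (both are assumptions you would replace with your proxy's configuration):

```python
import json
import urllib.request

# Placeholders: swap in the virtual key and model alias configured
# on your proxy instance.
url = "http://localhost:4000/chat/completions"
req = urllib.request.Request(
    url,
    data=json.dumps({
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "hello"}],
    }).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-1234",
    },
)
# urllib.request.urlopen(req) would send the request; it is omitted
# so this sketch runs without a live proxy.
```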

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

Name              | Status    | Median (ms) | Average (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min (ms) | Max (ms)
/chat/completions | Failed ❌ | 280.0       | 325.48       | 6.00       | 0.0        | 1796          | 0             | 234.57   | 3690.44
Aggregated        | Failed ❌ | 280.0       | 325.48       | 6.00       | 0.0        | 1796          | 0             | 234.57   | 3690.44
