BerriAI/litellm v1.52.14

What's Changed

  • (fix) passthrough - allow internal users to access /anthropic by @ishaan-jaff in #6843
  • LiteLLM Minor Fixes & Improvements (11/21/2024) by @krrishdholakia in #6837
  • fix latency issues on google ai studio by @ishaan-jaff in #6852
  • (fix) add linting check to ban creating AsyncHTTPHandler during LLM calling by @ishaan-jaff in #6855
  • (feat) Add usage tracking for streaming /anthropic passthrough routes by @ishaan-jaff in #6842
  • (Feat) Allow passing litellm_metadata to pass through endpoints + Add e2e tests for /anthropic/ usage tracking by @ishaan-jaff in #6864 (see the example request after this list)
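The /anthropic passthrough items above forward Anthropic Messages API requests through the proxy while tracking usage. A minimal sketch of such a request is below; the proxy address, virtual key, model name, and the litellm_metadata tags are placeholders (not values from this release), and the exact route/header details should be checked against the passthrough docs.

# Send an Anthropic Messages API request through the LiteLLM proxy passthrough route.
# localhost:4000 and sk-1234 are a placeholder proxy address and virtual key.
curl -X POST 'http://localhost:4000/anthropic/v1/messages' \
  -H 'content-type: application/json' \
  -H 'x-api-key: sk-1234' \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}],
    "litellm_metadata": {"tags": ["demo"]}
  }'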

Full Changelog: v1.52.12...v1.52.14

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.14
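
Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal sketch of a request against it, assuming a model has been configured and sk-1234 is a valid virtual key (both the model name and key below are placeholders):

# Call the proxy's OpenAI-compatible chat completions endpoint.
# Replace the model name and key with whatever you have configured.
curl -X POST 'http://localhost:4000/v1/chat/completions' \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer sk-1234' \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello, what model are you?"}]
  }'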

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 260.0 | 292.32742033908687 | 6.002121672811824 | 0.0 | 1796 | 0 | 222.04342999998516 | 2700.951708000048 |
| Aggregated | Passed ✅ | 260.0 | 292.32742033908687 | 6.002121672811824 | 0.0 | 1796 | 0 | 222.04342999998516 | 2700.951708000048 |
