BerriAI/litellm v1.77.6.dev.1

What's Changed

  • [Fix] Fix LiteLLM model name fallback in dashboard overview by @herve-ves in #14998
  • [Fix] Use the extra_query parameter for GET requests in Azure Batch by @eycjur in #14997
  • MCP - specify forwardable headers, specify allowed/disallowed tools for MCP servers by @krrishdholakia in #15002
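For the MCP change above, the proxy config gains per-server controls over which headers may be forwarded and which tools are exposed. The sketch below is illustrative only: the top-level mcp_servers block exists in the proxy config, but the key names forwardable_headers, allowed_tools, and disallowed_tools, as well as the server URL, are assumptions; check the MCP docs and #15002 for the exact schema.

# Hypothetical config sketch; key names below are assumptions, not the confirmed schema from #15002.
cat > litellm_config.yaml <<'EOF'
mcp_servers:
  docs_server:
    url: "https://mcp.example.com/mcp"      # illustrative URL
    forwardable_headers: ["x-tenant-id"]    # assumed key name
    allowed_tools: ["search_docs"]          # assumed key name
    disallowed_tools: ["delete_records"]    # assumed key name
EOF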

New Contributors

Full Changelog: v1.77.5-nightly...v1.77.6.dev.1

Docker Run LiteLLM Proxy

docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.77.6.dev.1
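
Once the container is up, you can sanity-check it with a request to the proxy's OpenAI-compatible chat endpoint. A minimal sketch, assuming a model named gpt-4o has been configured on the proxy and a master key of sk-1234 (both are placeholders for your own values):

# Illustrative request; "gpt-4o" and "sk-1234" are placeholders for your
# configured model name and master key.
curl http://localhost:4000/v1/chat/completions \
  -H "Authorization: Bearer sk-1234" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from the proxy"}]
  }'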

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Failed ❌ | 62 | 81.92432634257806 | 6.505195152344288 | 6.505195152344288 | 1947 | 1947 | 43.96903999997903 | 3188.437694000015 |
| Aggregated | Failed ❌ | 62 | 81.92432634257806 | 6.505195152344288 | 6.505195152344288 | 1947 | 1947 | 43.96903999997903 | 3188.437694000015 |
