BerriAI/litellm v1.76.3.dev1


What's Changed

  • Doc Updates 9-5-25 by @TeddyAmkie in #14299
  • docs(index.md): initial commit with release notes by @krrishdholakia in #13931
  • [Fix] Perf fix: Heavy RAM Usage over time when using Passthrough Routes by @ishaan-jaff in #14305
  • MINOR update: Add openrouter image generation support + refactor Gemini image output param to be images (openrouter compatible) by @krrishdholakia in #14160
  • docs: moved custom spend tags by @mubashir1osmani in #14308
  • Modify cryptography dependency to latest by @c3-AndrewDoan in #13947
  • feat(helm): Allow no DATABASE_URL to be set on migration job to keep the behaviour same as deployment by @edify42 in #13855
  • (Not fully tested, LLM-generated code) fix issue where vertex ai fails to use new credentials after token expiration plus gcloud auth login --update-adc by @ozzieba in #13092
  • honor OLLAMA_API_KEY for ollama_chat by @darashenka in #12984
  • Heroku llms by @tlowrimore-heroku in #12992
  • Fix 500 error in /customer/update endpoint when updating with budget_id by @jasonpnnl in #12438
  • fix bedrock embedding invocations with app inference profiles by @btemplep in #9902
  • Security fix - prevent proxy_admin_viewer from modifying other user's credentials + remove hardcoded sensitive keys from test repo by @krrishdholakia in #14161
  • [Performance] Use executors in post-logging hooks by @Bobronium in #14332
  • Fix markdown formatting issues in Docker quick start documentation by @zheng1 in #14322
  • Docs: Changed the model field in the response format to open AI. by @boopesh07 in #14354
  • [Security] Ensure LiteLLM Images have 0 Critical, High, Medium vulnerabilities with CVSS ≥ 4.0 by @ishaan-jaff in #14357
  • Revert "Modify cryptography dependency to latest" by @ishaan-jaff in #14358

New Contributors

Full Changelog: v1.76.3-nightly...v1.76.3.dev1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.76.3.dev1
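
Once the container is up, any OpenAI-compatible client can talk to the proxy on port 4000. A minimal sketch of what such a request looks like, using only the Python standard library (the base URL, the `sk-1234` key, and the model name are placeholders for your own deployment, not values from this release):

```python
import json

# Placeholders -- substitute your own proxy URL, master key, and model.
BASE_URL = "http://localhost:4000"
API_KEY = "sk-1234"

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble the URL, headers, and JSON body for one /chat/completions call."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("gpt-3.5-turbo", "Hello!")
# Send with any HTTP client, e.g.:
#   curl -X POST <url> -H "Authorization: Bearer <key>" -d <body>
```

Because the proxy mirrors the OpenAI API surface, the same payload works unchanged with the official `openai` SDK by pointing its `base_url` at the proxy.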

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median RT (ms) | Average RT (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min RT (ms) | Max RT (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 120.0 | 130.90 | 6.46 | 6.46 | 1930 | 1930 | 96.19 | 442.01 |
| Aggregated | Failed ❌ | 120.0 | 130.90 | 6.46 | 6.46 | 1930 | 1930 | 96.19 | 442.01 |
