BerriAI/litellm v1.37.19-stable

🚨 Starting with this release, SSO on the LiteLLM Proxy requires a valid license

What's Changed

  • [Fix] Only run check_request_disconnection logic for a maximum of 10 minutes by @ishaan-jaff in #3741
  • Add decoding of base64 image data for Gemini Pro 1.5 by @hmcp22 in #3711
  • [Feat] Enforce that the user has a valid license when using SSO on LiteLLM Proxy by @ishaan-jaff in #3742
  • [FEAT] Async VertexAI Image Generation by @ishaan-jaff in #3739 (see the sketch after this list)
  • [Feat] Router/Proxy - set cooldown_time based on Azure exception headers by @ishaan-jaff in #3716
  • Fix divide-by-zero bug in Slack alerting by @ishaan-jaff in #3745
  • Standardize Slack exception message format by @ishaan-jaff in #3747
  • Fix another "dictionary changed size during iteration" error by @phact in #3657
  • feat(proxy_server.py): allow admin to return rejected response as string to user by @krrishdholakia in #3740
  • [Fix] - Raise 404 from /team/info when the team does not exist by @ishaan-jaff in #3749
  • Webhook support for budget alerts by @krrishdholakia in #3748
  • [Fix] - Raise an exception when trying to update/delete a non-existent team by @ishaan-jaff in #3750
  • [FEAT] - Add litellm.Router - abatch_completion_one_model_multiple_requests by @ishaan-jaff in #3751 (see the sketch after this list)
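
For #3739, a minimal sketch of async image generation against Vertex AI, assuming litellm is installed and Google Cloud credentials are available. The model name, project, and location below are illustrative placeholders, not values taken from this release:

import asyncio
import litellm

# Assumptions: replace with your own GCP project ID and Vertex region
litellm.vertex_project = "my-gcp-project"
litellm.vertex_location = "us-central1"

async def main():
    # "vertex_ai/imagegeneration@006" is a placeholder Vertex AI image model
    response = await litellm.aimage_generation(
        prompt="An olympic-size swimming pool",
        model="vertex_ai/imagegeneration@006",
    )
    print(response)

asyncio.run(main())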
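
And for #3751, a minimal sketch of abatch_completion_one_model_multiple_requests, which fans several independent message lists out to a single model concurrently. The model entry and the OPENAI_API_KEY lookup are assumptions for illustration:

import asyncio
import os
from litellm import Router

# Assumption: one OpenAI deployment, keyed via the OPENAI_API_KEY env var
router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {
                "model": "gpt-3.5-turbo",
                "api_key": os.getenv("OPENAI_API_KEY"),
            },
        }
    ]
)

async def main():
    responses = await router.abatch_completion_one_model_multiple_requests(
        model="gpt-3.5-turbo",
        messages=[
            [{"role": "user", "content": "hey, how's it going?"}],
            [{"role": "user", "content": "what's up?"}],
        ],
    )
    print(responses)  # one response per message list, in order

asyncio.run(main())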

Full Changelog: v1.37.17...v1.37.19-stable

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.37.19-stable
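
Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000. A minimal sketch of calling it with the official openai Python client; the model name and key are placeholders, assuming a matching model is configured on the proxy:

import openai

# Assumptions: proxy running locally on port 4000; key is whatever the proxy expects
client = openai.OpenAI(api_key="sk-1234", base_url="http://0.0.0.0:4000")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumption: a model configured on the proxy
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy"}],
)
print(response.choices[0].message.content)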

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
