BerriAI/litellm v1.37.11

What's Changed

  • feat(proxy_server.py): Enabling Admin to control general settings on proxy ui by @krrishdholakia in #3660
  • [Fix] Mask API Keys from Predibase AuthenticationErrors by @ishaan-jaff in #3662
  • [FIX] raise alerts for exceptions on /completions endpoint by @ishaan-jaff in #3661
  • Updated Ollama cost models to include LLaMa3 and Mistral/Mixtral Instruct series by @kmheckel in #3543 (see the cost lookup sketch below)
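
A quick way to sanity-check the Ollama pricing change is litellm's cost table, a module-level dict keyed by model name. This is a minimal sketch; the "ollama/<model>" key names follow litellm's usual naming convention and are assumptions here, not taken from the PR:

import litellm

# litellm.model_cost maps model identifiers to pricing metadata
# (input/output cost per token, max tokens, provider, ...).
# The keys below are placeholders; the exact names added in #3543
# may differ, so check model_prices_and_context_window.json.
for model in ("ollama/llama3", "ollama/mixtral-8x7B-Instruct-v0.1"):
    entry = litellm.model_cost.get(model)
    if entry is None:
        print(f"{model}: not present in this litellm version")
    else:
        print(f"{model}: in={entry.get('input_cost_per_token')} "
              f"out={entry.get('output_cost_per_token')} per token")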

New Contributors

  • @kmheckel made their first contribution in #3543

Full Changelog: v1.37.10...v1.37.11

Docker Run LiteLLM Proxy

docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.37.11
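
Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal client sketch using the openai Python SDK (v1+); the key "sk-1234" and the model name are placeholders, substitute whatever you configured on the proxy (e.g. via the admin UI):

import openai

# Point the standard OpenAI client at the local LiteLLM proxy.
# "sk-1234" and "gpt-3.5-turbo" are placeholders; use the virtual
# key and model name configured on your proxy instance.
client = openai.OpenAI(api_key="sk-1234", base_url="http://localhost:4000")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)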

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
