github BerriAI/litellm v1.72.5.dev1


What's Changed

  • fix(internal_user_endpoints.py): support user with + in email on user info + handle empty string for arguments on gemini function calls by @krrishdholakia in #11601
  • Fix: passes api_base, api_key, litellm_params_dict to custom_llm embedding methods by @ElefHead in #11450
  • Add Admin-Initiated Password Reset Flow by @NANDINI-star in #11618
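The Gemini part of #11601 concerns tool calls that arrive with an empty `arguments` string, which `json.loads` rejects. A minimal sketch of that kind of guard (a hypothetical helper for illustration, not the code from the PR):

```python
import json

def parse_tool_call_arguments(raw_arguments: str) -> dict:
    """Parse a function-call arguments string, tolerating empty values.

    Gemini can emit a tool call whose arguments field is "" rather than "{}";
    treating empty or whitespace-only input as an empty dict avoids a
    JSONDecodeError.
    """
    if not raw_arguments or not raw_arguments.strip():
        return {}
    return json.loads(raw_arguments)
```

For example, `parse_tool_call_arguments("")` returns `{}`, while `parse_tool_call_arguments('{"city": "Paris"}')` parses normally.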

New Contributors

Full Changelog: v1.72.4-nightly...v1.72.5.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
    -e STORE_MODEL_IN_DB=True \
    -p 4000:4000 \
    ghcr.io/berriai/litellm:main-v1.72.5.dev1
```
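Once the container is up, the proxy serves an OpenAI-compatible API on the mapped port. A stdlib-only sketch of calling its `/chat/completions` endpoint (the model name and API key below are placeholders; the model must match one configured on your proxy):

```python
import json
import urllib.request

# Placeholder values: replace with a model configured on the proxy and a
# virtual key (or the master key) issued by the proxy.
payload = {
    "model": "my-model",
    "messages": [{"role": "user", "content": "Hello from LiteLLM"}],
}
req = urllib.request.Request(
    "http://localhost:4000/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-placeholder",
    },
    method="POST",
)
# Uncomment to send the request against a running proxy:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The request body follows the standard OpenAI chat-completions schema, so existing OpenAI client code can also be pointed at `http://localhost:4000` instead of hand-building requests.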

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 271.77 | 6.15 | 0.0 | 1841 | 0 | 218.69 | 1399.05 |
| Aggregated | Passed ✅ | 250.0 | 271.77 | 6.15 | 0.0 | 1841 | 0 | 218.69 | 1399.05 |
