BerriAI/litellm v1.73.0.debug_mem

Full Changelog: v1.73.0-stable...v1.73.0.debug_mem

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.73.0.debug_mem
```
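
Once the container is running, the proxy exposes an OpenAI-compatible API on port 4000. Below is a minimal sketch of calling it with the OpenAI Python SDK; the model name and the `sk-1234` key are placeholders, so substitute whatever models and keys your proxy is configured with.

```python
# Minimal sketch: call the LiteLLM proxy's OpenAI-compatible endpoint.
# Assumes the proxy is running locally on port 4000 (as in the docker run above).
# "gpt-3.5-turbo" and the api_key are placeholders for your own configuration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # the LiteLLM proxy, not api.openai.com
    api_key="sk-1234",                 # placeholder; use your proxy/master key
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)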

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 230.0 | 248.77890883802746 | 6.167721429610798 | 0.0 | 1846 | 0 | 196.06082799998603 | 1224.1804999999886 |
| Aggregated | Passed ✅ | 230.0 | 248.77890883802746 | 6.167721429610798 | 0.0 | 1846 | 0 | 196.06082799998603 | 1224.1804999999886 |
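
If you want to sanity-check /chat/completions latency against your own deployment, a rough sketch is shown below. The URL, API key, model name, and request/concurrency counts are assumptions for illustration, not the harness used to produce the numbers above.

```python
# Rough sketch: measure median/max latency of /chat/completions on a local proxy.
# The URL, key, and model below are placeholders; adjust to your deployment.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "http://localhost:4000/chat/completions"
HEADERS = {"Authorization": "Bearer sk-1234"}  # placeholder key
BODY = {"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "ping"}]}

def one_request(_):
    start = time.perf_counter()
    resp = requests.post(URL, json=BODY, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return (time.perf_counter() - start) * 1000  # latency in ms

with ThreadPoolExecutor(max_workers=10) as pool:
    latencies = list(pool.map(one_request, range(100)))

print(f"median: {statistics.median(latencies):.1f} ms, max: {max(latencies):.1f} ms")
```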
