BerriAI/litellm v1.68.1-nightly

What's Changed

  • Add bedrock llama4 pricing + handle llama4 templating on bedrock invoke route by @krrishdholakia in #10582
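
As a rough illustration of what the entry above enables, here is a minimal sketch of calling a Llama 4 model on Bedrock's invoke route through litellm. The model ID shown ("meta.llama4-scout-17b-instruct-v1:0") and the bedrock/invoke/ prefix for forcing the invoke route are assumptions based on LiteLLM's usual Bedrock naming conventions; substitute whatever ID is enabled in your AWS account.

```python
# Hypothetical sketch: Llama 4 on the Bedrock invoke route via litellm.
# Assumes AWS credentials are configured in the environment and that the
# model ID below is available in your account/region.
import litellm

response = litellm.completion(
    model="bedrock/invoke/meta.llama4-scout-17b-instruct-v1:0",  # assumed ID
    messages=[{"role": "user", "content": "Summarize LiteLLM in one line."}],
)
print(response.choices[0].message.content)
```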

Full Changelog: v1.68.1.dev2...v1.68.1-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.68.1-nightly
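
Once the container is up, the proxy speaks the OpenAI API on port 4000. Below is a minimal sketch of querying it with the official openai Python client; the api_key value and the "gpt-4o" model name are placeholders for whatever keys and models you have configured on your proxy.

```python
# Minimal sketch: query a locally running LiteLLM proxy with the OpenAI client.
import openai

client = openai.OpenAI(
    base_url="http://localhost:4000",  # proxy from the docker command above
    api_key="sk-1234",                 # placeholder; use your proxy's key
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any model configured on the proxy
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)
```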

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 210.0 | 234.26 | 6.16 | 0.0 | 1843 | 0 | 179.44 | 3332.67 |
| Aggregated | Passed ✅ | 210.0 | 234.26 | 6.16 | 0.0 | 1843 | 0 | 179.44 | 3332.67 |
