github BerriAI/litellm v1.52.16.dev1


What's Changed

  • LiteLLM Minor Fixes & Improvements (11/24/2024) by @krrishdholakia in #6890
  • (feat) pass through llm endpoints - add PATCH support (vertex context caching requires it for update ops) by @ishaan-jaff in #6924 (curl sketch after this list)
  • sonnet supports pdf, haiku does not by @paul-gauthier in #6928
  • (feat) DataDog Logger - Add Failure logging + use Standard Logging payload by @ishaan-jaff in #6929
  • (feat) log proxy auth errors on datadog by @ishaan-jaff in #6931
  • (feat) Allow using include to include external YAML files in a config.yaml by @ishaan-jaff in #6922 (config sketch after this list)
  • (feat) dd logger - set tags according to the values of the DataDog env vars by @ishaan-jaff in #6933 (docker sketch after this list)
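
The PATCH support in #6924 exists because Vertex AI context caching updates (for example, extending a cache's TTL) are PATCH operations on the cachedContents resource. A minimal curl sketch against a locally running proxy; the pass-through route, CACHE_ID, and key are illustrative assumptions patterned on the Vertex REST API, not confirmed paths:

curl -X PATCH 'http://localhost:4000/vertex-ai/cachedContents/CACHE_ID' \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{"ttl": "600s"}'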
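
For the include feature in #6922, a config.yaml can pull shared sections out of external YAML files. A minimal sketch, assuming include takes a list of file paths resolved relative to the main config; the file name model_config.yaml is illustrative:

# config.yaml
include:
  - model_config.yaml   # hypothetical external file holding a model_list section

general_settings:
  master_key: sk-1234   # placeholder key for illustration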
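
For the DataDog tagging change in #6933, the logger picks up tag values from the environment at startup. A hedged sketch of a proxy launch with those variables set; DD_ENV, DD_SERVICE, and DD_VERSION follow DataDog's unified service tagging convention and are assumptions here, while DD_API_KEY and DD_SITE are the usual DataDog credentials:

docker run \
  -e DD_API_KEY=<your-datadog-api-key> \
  -e DD_SITE=us5.datadoghq.com \
  -e DD_ENV=production \
  -e DD_SERVICE=litellm-proxy \
  -e DD_VERSION=v1.52.16.dev1 \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.16.dev1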

Full Changelog: v1.52.16...v1.52.16.dev1

Docker Run LiteLLM Proxy

docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.16.dev1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 285.0974372649336 | 6.039486955708498 | 0.0 | 1808 | 0 | 224.19419400000606 | 3263.23956899995 |
| Aggregated | Passed ✅ | 250.0 | 285.0974372649336 | 6.039486955708498 | 0.0 | 1808 | 0 | 224.19419400000606 | 3263.23956899995 |
