v1.59.5


What's Changed

  • Deepseek r1 support + watsonx qa improvements by @krrishdholakia in #7907
  • (Testing) - Add e2e testing for langfuse logging with tags by @ishaan-jaff in #7922
  • build(deps): bump undici from 6.21.0 to 6.21.1 in /docs/my-website by @dependabot in #7902
  • (test) add e2e test for proxy with fallbacks + custom fallback message by @krrishdholakia in #7933
  • (feat) - add deepseek/deepseek-reasoner to model cost map by @ishaan-jaff in #7935 (see the usage sketch below the changelog)
  • fix(utils.py): move adding custom logger callback to success event in… by @krrishdholakia in #7905
  • Add provider_specific_header param by @krrishdholakia in #7932
  • (Refactor) Langfuse - remove prepare_metadata, langfuse python SDK now handles non-json serializable objects by @ishaan-jaff in #7925

Full Changelog: v1.59.3...v1.59.5
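
The deepseek/deepseek-reasoner addition above registers the model in LiteLLM's cost map, so it can be called like any other provider/model pair through litellm.completion. A minimal sketch, assuming a DEEPSEEK_API_KEY is set in the environment (the prompt is only an illustration):

import litellm

# "deepseek/deepseek-reasoner" is the model string added to the cost map in #7935;
# the "deepseek/" prefix routes to the DeepSeek provider, which reads
# DEEPSEEK_API_KEY from the environment.
response = litellm.completion(
    model="deepseek/deepseek-reasoner",
    messages=[{"role": "user", "content": "Summarize why chain-of-thought helps."}],
)
print(response.choices[0].message.content)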

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.59.5
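
Once the container is running, the proxy serves an OpenAI-compatible API on port 4000 (the /chat/completions route exercised in the load test below). A minimal sketch using the openai Python client; the model name and API key are placeholders for whatever is configured on your proxy:

import openai

# Point the standard OpenAI client at the local LiteLLM proxy.
client = openai.OpenAI(base_url="http://localhost:4000", api_key="sk-1234")  # placeholder key

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder; use a model configured on the proxy
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy"}],
)
print(response.choices[0].message.content)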

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 210.0 | 227.08635060543418 | 6.150672112760015 | 0.0 | 1840 | 0 | 180.76872099999264 | 2652.4827009999967 |
| Aggregated | Passed ✅ | 210.0 | 227.08635060543418 | 6.150672112760015 | 0.0 | 1840 | 0 | 180.76872099999264 | 2652.4827009999967 |
