github BerriAI/litellm v1.73.1-nightly

latest releases: v1.76.2-nightly, v1.76.1.rc.2, v1.76.1.rc.1...
2 months ago

What's Changed

  • Fix SambaNova 'created' field validation error - handle float timestamps by @neubig in #11971
  • Docs - Add Recommended Machine Specifications by @ishaan-jaff in #11980
  • fix: make response api support Azure Authentication method by @hsuyuming in #11941
  • feat: add Last Success column to health check table by @colesmcintosh in #11903
  • Add GitHub Actions workflow for LLM translation testing artifacts by @colesmcintosh in #11780
  • Fix markdown table not rendering properly by @mukesh-dream11 in #11969
  • [Fix] - Check HTTP_PROXY vars in networking requests by @ishaan-jaff in #11947
  • Proxy UI MCP Auth passthrough by @wagnerjt in #11968
  • fix unrecognised parameter reasoning_effort by @Shankyg in #11838
  • Fixing watsonx error: 'model_id' or 'model' cannot be specified in the request body for models in a deployment space by @cbjuan in #11854
  • [Bug Fix] Perplexity - LiteLLM doesn't support 'web_search_options' for Perplexity's Sonar Pro model by @ishaan-jaff in #11983
  • feat: implement Perplexity citation tokens and search queries cost calculation by @colesmcintosh in #11938
  • [Feat] Enterprise - Allow dynamically disabling callbacks in request headers by @ishaan-jaff in #11985
  • Add Mistral 3.2 24B to model mapping by @colesmcintosh in #11926
  • [Feat] Add List Callbacks API Endpoint by @ishaan-jaff in #11987
  • fix: fix test_get_azure_ad_token_with_oidc_token testcase issue by @hsuyuming in #11988
  • [Bug Fix] Bedrock Guardrail - Don't raise exception on intervene action by @ishaan-jaff in #11875

Full Changelog: v1.73.0.rc.1...v1.73.1-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.73.1-nightly
```
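Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal sketch of a chat completion request (the model name `gpt-3.5-turbo` is an assumption; use whichever model is configured on your proxy):

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the proxy's OpenAI-compatible /chat/completions endpoint."""
    payload = {
        "model": model,  # assumption: any model name configured on the proxy
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:4000", "gpt-3.5-turbo", "Hello!")
# With the proxy running:
# resp = json.load(urllib.request.urlopen(req))
# print(resp["choices"][0]["message"]["content"])
```

Any OpenAI SDK pointed at `http://localhost:4000` works the same way, since the proxy mirrors that API surface.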

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat


Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 269.8 | 6.12 | 0.0 | 1829 | 0 | 217.69 | 1336.18 |
| Aggregated | Passed ✅ | 250.0 | 269.8 | 6.12 | 0.0 | 1829 | 0 | 217.69 | 1336.18 |
