# litellm v1.61.3


## What's Changed

  • Improved wildcard route handling on /models and /model_group/info by @krrishdholakia in #8473
  • (Bug fix) - Using include_usage for /completions requests + unit testing by @ishaan-jaff in #8484 (a usage sketch follows this list)
  • add sonar pricings by @themrzmaster in #8476
  • (bug fix) PerplexityChatConfig - track correct OpenAI compatible params by @ishaan-jaff in #8496
  • (fix #2) don't block proxy startup if license check fails & using prometheus by @ishaan-jaff in #8492
  • ci(config.yml): mark daily docker builds with -nightly by @krrishdholakia in #8499
  • (Redis Cluster) - Fixes for using redis cluster + pipeline by @ishaan-jaff in #8442
  • Litellm UI stable version 02 12 2025 by @krrishdholakia in #8497
  • fix: fix test by @krrishdholakia in #8501
  • enables no auth for SMTP by @krrishdholakia in #8494
  • UI Fixes p2 by @krrishdholakia in #8502
  • add phoenix docs for observability integration by @exiao in #8522
  • Added custom_attributes to additional_keys which can be sent to athina by @vivek-athina in #8518
  • (UI) fix log details page by @ishaan-jaff in #8524
  • Add UI Support for Admins to Call /cache/ping and View Cache Analytics (#8475) by @tahaali-dev in #8519
  • LiteLLM Improvements (02/13/2025) p1 by @krrishdholakia in #8523
  • fix(utils.py): fix vertex ai optional param handling by @krrishdholakia in #8477
  • Add 'prediction' param for Azure + Add gemini-2.0-pro-exp-02-05 vertex ai model to cost map + New bedrock/deepseek_r1/* route by @krrishdholakia in #8525
  • (UI) - Refactor View Key Table by @ishaan-jaff in #8526

Full Changelog: v1.61.1...v1.61.3
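
Among the fixes above, #8484 touches how `include_usage` is handled for `/completions` requests through the proxy. Below is a minimal sketch of exercising that path; it assumes the proxy is running locally on port 4000, that `gpt-3.5-turbo-instruct` is a model configured on it, that your installed `openai` SDK accepts `stream_options` on completions, and the API key is a placeholder.

```python
from openai import OpenAI

# Point the OpenAI client at a locally running LiteLLM proxy (assumed on port 4000).
client = OpenAI(base_url="http://localhost:4000", api_key="sk-anything")

# Stream a legacy /completions request and ask for a final usage chunk.
stream = client.completions.create(
    model="gpt-3.5-turbo-instruct",  # placeholder: any model configured on the proxy
    prompt="Say hello",
    stream=True,
    stream_options={"include_usage": True},
)

for chunk in stream:
    if chunk.usage is not None:
        # When include_usage is set, the final chunk carries token counts.
        print(chunk.usage)
    elif chunk.choices:
        print(chunk.choices[0].text, end="")
```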

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.3
```
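
Once the container is up, the proxy speaks the OpenAI API on port 4000. A minimal sketch of calling it with the OpenAI Python SDK is below; the model name and API key are placeholders for whatever you configure on the proxy.

```python
from openai import OpenAI

# The proxy started above listens on localhost:4000 and is OpenAI-compatible.
# The API key is a placeholder; use a key you configure on the proxy.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder: a model added to the proxy's config
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy"}],
)
print(response.choices[0].message.content)
```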

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Failed ❌ | 110.0 | 127.52 | 6.41 | 6.41 | 1917 | 1917 | 94.96 | 2825.28 |
| Aggregated | Failed ❌ | 110.0 | 127.52 | 6.41 | 6.41 | 1917 | 1917 | 94.96 | 2825.28 |
