github BerriAI/litellm v1.57.0


What's Changed

  • (Fix) make sure init custom loggers is non blocking by @ishaan-jaff in #7554
  • (Feat) Hashicorp Secret Manager - Allow storing virtual keys in secret manager by @ishaan-jaff in #7549
  • Create and view organizations + assign org admins on the Proxy UI by @krrishdholakia in #7557
  • (perf) fix [PROXY] don't use f string in add_litellm_data_to_request() by @ishaan-jaff in #7558
  • fix(groq/chat/transformation.py): fix groq response_format transforma… by @krrishdholakia in #7565
  • Support deleting keys by key_alias by @krrishdholakia in #7552
  • (proxy perf improvement) - use asyncio.create_task for service_logger_obj.async_service_success_hook in pre_call by @ishaan-jaff in #7563
  • add fireworks_ai/accounts/fireworks/models/deepseek-v3 by @Fredy in #7567
  • FriendliAI: Documentation Updates by @minpeter in #7517
  • Prevent istio injection for db migrations cron job by @lowjiansheng in #7513
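The pre_call change above (running service_logger_obj.async_service_success_hook via asyncio.create_task) is an instance of the fire-and-forget pattern: scheduling a coroutine as a background task instead of awaiting it keeps slow logging I/O off the request path. A minimal sketch of the idea, where the hook name, payload, and sleep duration are illustrative stand-ins rather than LiteLLM's actual signatures:

```python
import asyncio
import time

async def service_success_hook(payload: dict) -> None:
    # Stand-in for a logging/metrics hook that does slow I/O.
    await asyncio.sleep(0.05)

async def pre_call_blocking(payload: dict) -> float:
    # Awaiting the hook adds its full latency to the request path.
    start = time.perf_counter()
    await service_success_hook(payload)
    return time.perf_counter() - start

async def pre_call_nonblocking(payload: dict) -> float:
    # Scheduling the hook as a task returns control almost immediately;
    # the event loop runs the hook in the background.
    start = time.perf_counter()
    task = asyncio.create_task(service_success_hook(payload))
    elapsed = time.perf_counter() - start
    await task  # only so the demo exits cleanly; the proxy would not wait
    return elapsed

async def measure() -> tuple[float, float]:
    return await pre_call_blocking({}), await pre_call_nonblocking({})

if __name__ == "__main__":
    blocking, nonblocking = asyncio.run(measure())
    print(f"blocking: {blocking*1000:.1f} ms, non-blocking: {nonblocking*1000:.1f} ms")
```

The trade-off is that failures in a fire-and-forget task surface only through the task's exception handling, not the request, which is acceptable for best-effort telemetry.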
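The f-string fix in add_litellm_data_to_request() reflects a general Python performance point: an f-string formats its arguments eagerly, even when the resulting message is discarded (e.g. a suppressed debug log), whereas %-style logging arguments are formatted only if the record is actually emitted. A small illustration, with a made-up logger and payload object:

```python
import logging

class CountingStr:
    """Counts how often it is stringified, to reveal eager formatting."""
    def __init__(self):
        self.calls = 0
    def __str__(self):
        self.calls += 1
        return "expensive payload"

logger = logging.getLogger("demo")
logger.setLevel(logging.INFO)  # DEBUG records are suppressed

def eager(obj):
    # The f-string builds the full message string before logger.debug
    # is even called, so __str__ runs despite the suppressed level.
    logger.debug(f"payload: {obj}")

def lazy(obj):
    # %-style arguments are formatted by the logging machinery only
    # when the record passes the level check, so __str__ never runs here.
    logger.debug("payload: %s", obj)

a, b = CountingStr(), CountingStr()
eager(a)
lazy(b)
print(a.calls, b.calls)  # eager stringified once, lazy not at all
```

In a per-request hot path like a proxy's request setup, avoiding that eager formatting removes needless string work on every call.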

Full Changelog: v1.56.10...v1.57.0

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.57.0

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 212.84 | 6.20 | 0.0 | 1854 | 0 | 174.45 | 1346.32 |
| Aggregated | Passed ✅ | 200.0 | 212.84 | 6.20 | 0.0 | 1854 | 0 | 174.45 | 1346.32 |
