BerriAI/litellm v1.29.7

⚡️ LiteLLM Proxy (100+ LLMs) - Track Number of Requests and Avg Latency Per Model Deployment

[Screenshot: model_latency - latency and request-count graphs per model deployment]
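
For context on the headline: the proxy exposes an OpenAI-compatible endpoint in front of 100+ LLMs, so tracked requests come through calls like the sketch below. The base_url and api_key are placeholder assumptions (a locally running proxy on its default port and a virtual key), not values from this release:

```python
from openai import OpenAI

# Point the standard OpenAI client at a LiteLLM Proxy instance.
# base_url and api_key here are placeholder assumptions for this sketch.
client = OpenAI(
    base_url="http://localhost:4000",
    api_key="sk-my-virtual-key",
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any model name configured on the proxy
    messages=[{"role": "user", "content": "Hello from the proxy!"}],
)
print(response.choices[0].message.content)
```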

🛠️ High Traffic Fixes - Fix for hitting the DB connection limit when model fallbacks occur (see the sketch below)
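
The note above doesn't describe the fix itself, so here is a hedged illustration of the failure mode: if every fallback attempt logs through its own fresh DB connection, a burst of fallbacks can exhaust the connection limit, and gating DB access behind a shared cap is one common mitigation. All names below (log_request, write_to_db, the cap value) are hypothetical, not LiteLLM internals:

```python
import asyncio

# Hypothetical sketch: cap concurrent DB operations so a burst of model
# fallbacks queues up instead of exhausting the DB connection limit.
DB_CONNECTION_CAP = 10
db_semaphore = asyncio.Semaphore(DB_CONNECTION_CAP)

async def write_to_db(payload: dict) -> None:
    await asyncio.sleep(0.01)  # stand-in for the real insert

async def log_request(payload: dict) -> None:
    # Each fallback retry waits for a free slot rather than opening
    # a new connection of its own.
    async with db_semaphore:
        await write_to_db(payload)

async def main() -> None:
    # Simulate 100 near-simultaneous fallback log writes.
    await asyncio.gather(*(log_request({"attempt": i}) for i in range(100)))

asyncio.run(main())
```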

🚀 High Traffic Fixes - /embedding - fix for the "Dictionary changed size during iteration" bug
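
For readers unfamiliar with the error: it is Python's RuntimeError for mutating a dict while iterating over it, which concurrent request handling can trigger. A generic illustration of the bug and the usual fix (not LiteLLM's actual code):

```python
# Mutating a dict during iteration raises
# "RuntimeError: dictionary changed size during iteration".
metadata = {"model": "text-embedding-ada-002", "trace_id": "abc", "user": "u1"}

# Buggy pattern (commented out because it raises at the next loop step):
# for key in metadata:
#     if key == "trace_id":
#         del metadata[key]

# Safe pattern: iterate over a snapshot of the keys instead.
for key in list(metadata):
    if key == "trace_id":
        del metadata[key]

print(metadata)  # {'model': 'text-embedding-ada-002', 'user': 'u1'}
```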

⚡️ High Traffic Fixes - Switched off --detailed_debug in the default Dockerfile; users now need to opt in to view --detailed_debug logs. (This led to a 5% decrease in avg latency across 1K concurrent calls.)

📖 Docs - Fixes for /user/new on the LiteLLM Proxy Swagger (shows how to set tpm/rpm limits per user): https://docs.litellm.ai/docs/proxy/virtual_keys#usernew
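
Per the linked docs, the per-user limits are set when creating the user. A minimal sketch of that call, assuming a proxy at the default http://localhost:4000 and a master key (both placeholders); the tpm_limit/rpm_limit field names follow the Swagger fix described above, but treat the exact payload as illustrative:

```python
import requests

# Create a proxy user with per-user rate limits via /user/new.
# The URL and Authorization key are placeholder assumptions.
resp = requests.post(
    "http://localhost:4000/user/new",
    headers={"Authorization": "Bearer sk-master-key"},
    json={
        "user_email": "new-user@example.com",
        "tpm_limit": 100_000,  # max tokens per minute for this user
        "rpm_limit": 100,      # max requests per minute for this user
    },
)
print(resp.json())
```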

⭐️ Admin UI - separate latency and request-count graphs per model deployment: https://docs.litellm.ai/docs/proxy/ui

What's Changed

Full Changelog: v1.29.4...v1.29.7
