github BerriAI/litellm v1.7.11

💥 LiteLLM Router + Proxy handles 500+ requests/second

💥 LiteLLM Proxy now handles 500+ requests/second, load balances Azure + OpenAI deployments, and tracks spend per user 💥
Try it here: https://docs.litellm.ai/docs/simple_proxy
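
As a rough sketch of what calling the proxy looks like once it is running, the snippet below points the OpenAI Python SDK at a local LiteLLM Proxy; the base URL/port, API key, and model alias are assumptions, see the proxy docs above for the actual setup.

```python
# Minimal sketch: call a locally running LiteLLM Proxy through the OpenAI SDK.
# The base_url/port, api_key, and "gpt-3.5-turbo" alias are placeholders.
import openai

client = openai.OpenAI(
    api_key="sk-anything",            # the proxy decides whether to validate this key
    base_url="http://0.0.0.0:8000",   # assumed local proxy address
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",            # alias load-balanced across Azure + OpenAI deployments
    messages=[{"role": "user", "content": "Hello from the LiteLLM Proxy"}],
)
print(response.choices[0].message.content)
```
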
🔑 Support for AZURE_OPENAI_API_KEY on Azure https://docs.litellm.ai/docs/providers/azure h/t @solyarisoftware
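
A hedged sketch of picking up the new environment variable with `litellm.completion`; the deployment name, endpoint, and API version below are placeholders, see the Azure provider docs above.

```python
# Sketch only: supply the Azure key via AZURE_OPENAI_API_KEY (newly supported in this release).
# Deployment name, endpoint, and API version are placeholders.
import os
import litellm

os.environ["AZURE_OPENAI_API_KEY"] = "my-azure-key"                    # env var added in this release
os.environ["AZURE_API_BASE"] = "https://my-endpoint.openai.azure.com"  # placeholder endpoint
os.environ["AZURE_API_VERSION"] = "2023-07-01-preview"                 # example API version

response = litellm.completion(
    model="azure/my-gpt-35-deployment",   # placeholder Azure deployment name
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```
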
⚡️ LiteLLM Router can now handle 20% more throughput https://docs.litellm.ai/docs/routing
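
For context, a minimal sketch of the Router load-balancing one alias across an Azure deployment and an OpenAI deployment; keys, endpoints, and deployment names are placeholders, see the routing docs above.

```python
# Sketch: LiteLLM Router balancing "gpt-3.5-turbo" across Azure + OpenAI deployments.
# All credentials, endpoints, and deployment names are placeholders.
import os
from litellm import Router

model_list = [
    {
        "model_name": "gpt-3.5-turbo",                      # alias clients request
        "litellm_params": {
            "model": "azure/my-gpt-35-deployment",          # placeholder Azure deployment
            "api_key": os.environ["AZURE_OPENAI_API_KEY"],
            "api_base": "https://my-endpoint.openai.azure.com",
            "api_version": "2023-07-01-preview",
        },
    },
    {
        "model_name": "gpt-3.5-turbo",
        "litellm_params": {
            "model": "gpt-3.5-turbo",                       # plain OpenAI deployment
            "api_key": os.environ["OPENAI_API_KEY"],
        },
    },
]

router = Router(model_list=model_list)
response = router.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Which deployment answered this?"}],
)
print(response.choices[0].message.content)
```
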
📖 Improvements to the litellm debugging docs h/t @solyarisoftware https://docs.litellm.ai/docs/debugging/local_debugging
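
A tiny sketch of the local-debugging flow those docs cover, assuming `litellm.set_verbose` as the verbose-logging switch; the model is a placeholder.

```python
# Sketch, assuming litellm.set_verbose is the verbose-logging switch described
# in the local debugging docs: prints raw provider requests/responses per call.
import litellm

litellm.set_verbose = True

litellm.completion(
    model="gpt-3.5-turbo",   # placeholder model
    messages=[{"role": "user", "content": "debug me"}],
)
```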

Full Changelog: v1.7.1...v1.7.11
