BerriAI/litellm 1.34.2


What's Changed

  • Updating the default Anthropic Official Claude 3 max_tokens to 4096 by @Caixiaopig in #2855 (see the sketch after this list)
  • add test for rate limits - Router isn't coroutine safe by @CLARKBENHAM in #2798
  • [integrations/langfuse] Use packaging over deprecated pkg_resources by @nicovank in #2844
  • [Feat] Text-Completion-OpenAI - Re-use OpenAI Client by @ishaan-jaff in #2877
  • re-use Azure OpenAI client for azure text completions by @ishaan-jaff in #2878
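
For the Claude 3 max_tokens change above, here is a minimal sketch of calling a Claude 3 model through LiteLLM, assuming an ANTHROPIC_API_KEY is set in the environment; the model name and override value below are illustrative and not part of this release:

```python
import litellm

# With this release, Anthropic Claude 3 requests default to max_tokens=4096.
# The value can still be overridden per request, as shown here.
response = litellm.completion(
    model="claude-3-opus-20240229",  # example Claude 3 model name
    messages=[{"role": "user", "content": "Summarize LiteLLM in one sentence."}],
    max_tokens=1024,  # explicit override of the 4096 default
)
print(response.choices[0].message.content)
```

Omitting max_tokens entirely now sends 4096 instead of the previous, smaller default.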

New Contributors

Full Changelog: v1.34.29...1.34.2

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 41 | 44.78511420041374 | 1.6166803681929298 | 0.0 | 484 | 0 | 35.75204999998505 | 621.7158040000186 |
| /health/liveliness | Passed ✅ | 25 | 26.808032545890264 | 15.358463497832833 | 0.0 | 4598 | 0 | 23.034784999993008 | 367.92435199998863 |
| /health/readiness | Passed ✅ | 25 | 26.462279976185147 | 15.70918961076725 | 0.0 | 4703 | 0 | 23.12188700000206 | 304.2405280000082 |
| Aggregated | Passed ✅ | 25 | 27.531060975677224 | 32.684333476793014 | 0.0 | 9785 | 0 | 23.034784999993008 | 621.7158040000186 |
