BerriAI/litellm v1.22.5


What's Changed

  • Re-raise exception in async ollama streaming by @vanpelt in #1750 (first sketch below)
  • Add a Helm chart for deploying LiteLLM Proxy by @ShaunMaher in #1602
  • Update Perplexity models in model_prices_and_context_window.json by @toniengelhardt in #1826 (second sketch below)
  • (feat) Add sessionId for Langfuse by @Manouchehri in #1828 (third sketch below)
  • [Feat] Sync model_prices_and_context_window.json and litellm/model_prices_and_context_window_backup.json by @ishaan-jaff in #1834
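
For context on #1750, here is a minimal sketch of consuming an async Ollama stream with `litellm.acompletion`; with this change, an error raised mid-stream should propagate to the caller's `except` block rather than being swallowed. The `ollama/llama2` model name and the error handling shown are illustrative assumptions, not code from the PR.

```python
import asyncio
import litellm

async def main():
    try:
        # Async streaming completion against a local Ollama model
        response = await litellm.acompletion(
            model="ollama/llama2",
            messages=[{"role": "user", "content": "Hello"}],
            stream=True,
        )
        async for chunk in response:
            print(chunk.choices[0].delta.content or "", end="")
    except Exception as err:
        # Errors raised inside the async stream now surface here
        print(f"stream failed: {err}")

asyncio.run(main())
```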
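
For #1826 and #1834, the pricing and context-window data lives in model_prices_and_context_window.json and is exposed at runtime through `litellm.model_cost`. The sketch below is a hedged example of reading an entry; the exact Perplexity model key and field names are assumptions, so check the JSON for the real keys.

```python
import litellm

# litellm.model_cost is a dict keyed by model name, loaded from
# model_prices_and_context_window.json
entry = litellm.model_cost.get("perplexity/pplx-70b-online", {})
print("max_tokens:", entry.get("max_tokens"))
print("input cost per token:", entry.get("input_cost_per_token"))
print("output cost per token:", entry.get("output_cost_per_token"))
```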
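
For #1828, a hedged sketch of grouping related calls into a Langfuse session via the `metadata` argument; the `session_id` key name, the model, and the callback setup are assumptions based on litellm's Langfuse integration, and the Langfuse environment variables must already be configured.

```python
import litellm

# Send successful calls to Langfuse (requires LANGFUSE_* env vars)
litellm.success_callback = ["langfuse"]

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hi"}],
    # Assumed metadata key for grouping traces into one Langfuse session
    metadata={"session_id": "my-conversation-123"},
)
print(response.choices[0].message.content)
```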

New Contributors

Full Changelog: v1.22.3...v1.22.5
