BerriAI/litellm v1.19.2

What's Changed

  • [Feat] Add cache_key in SpendLogs Table by @ishaan-jaff in #1604
  • [Feat] Proxy - Spend Tracking, Set Global Budget For the Proxy + Reset the budget by @krrishdholakia in #1603
  • [Fix] fix(ollama_chat.py): fix default token counting for ollama chat
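
For context on the global-budget item (#1603), here is a minimal sketch of a client calling a LiteLLM proxy that has a proxy-wide budget configured, via the OpenAI-compatible chat completions route. The config key names in the comments (max_budget, budget_duration), the port, and the rejection behavior are assumptions for illustration, not taken from this release; check the proxy docs for your version.

```python
# Hypothetical proxy config.yaml (key names are an assumption, not confirmed here):
#   litellm_settings:
#     max_budget: 100        # total USD the proxy may spend
#     budget_duration: 30d   # reset the budget every 30 days
from openai import OpenAI

client = OpenAI(
    api_key="sk-1234",               # the proxy's master/virtual key
    base_url="http://0.0.0.0:4000",  # adjust to wherever your proxy listens
)

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)

# Once the proxy's global budget is exhausted, requests like this would be
# expected to fail until the budget resets (assumption based on the feature
# description above).
```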

Full Changelog: v1.19.0...v1.19.2
