BerriAI/litellm v1.18.6


What's Changed

1. [Feat] litellm.acompletion() make Langfuse success handler non-blocking by @ishaan-jaff in #1519

  • The Langfuse success callback was blocking running litellm.acompletion() calls; this is fixed in this release.
  • Adds support for tagging cache hits on Langfuse (note: requires langfuse>=2.6.3). A minimal usage sketch follows this item.
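The snippet below is a minimal sketch of exercising the non-blocking Langfuse success callback together with cache-hit tagging; the model name, messages, and caching setup are illustrative and not taken from the PR.

```python
import asyncio

import litellm
from litellm.caching import Cache

# Langfuse credentials are read from LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY
litellm.success_callback = ["langfuse"]

# enable litellm's in-memory cache so a repeated call can be tagged
# as a cache hit on Langfuse (client side needs langfuse>=2.6.3)
litellm.cache = Cache()

async def main():
    messages = [{"role": "user", "content": "Hello from the release notes"}]
    # first call hits the provider; the Langfuse success handler no longer
    # blocks the running acompletion() call
    await litellm.acompletion(model="gpt-3.5-turbo", messages=messages, caching=True)
    # identical second call is served from cache and tagged as a cache hit
    await litellm.acompletion(model="gpt-3.5-turbo", messages=messages, caching=True)

asyncio.run(main())
```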

2. Langsmith: Add envs for project/run names; fix bug with None metadata by @timothyasp in #1524 (a configuration sketch follows)
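A rough sketch only: the environment-variable names for the project and run name (LANGSMITH_PROJECT, LANGSMITH_RUN_NAME below) are assumptions inferred from the PR title, so check the Langsmith integration docs before relying on them.

```python
import os

import litellm

os.environ["LANGSMITH_API_KEY"] = "ls-..."         # Langsmith credentials
os.environ["LANGSMITH_PROJECT"] = "litellm-dev"    # assumed project-name env var
os.environ["LANGSMITH_RUN_NAME"] = "v1.18.6-test"  # assumed run-name env var

litellm.success_callback = ["langsmith"]

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```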

3. [Feat] Router improvements by @ishaan-jaff in #1525

4. Allow overriding headers for anthropic by @keeganmccallum in #1513 (a hedged header-override sketch follows)
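A hedged sketch of what a header override on an Anthropic call might look like; the kwarg name (extra_headers) and the header value shown are assumptions, not confirmed from PR #1513.

```python
import litellm

response = litellm.completion(
    model="claude-2.1",
    messages=[{"role": "user", "content": "Hello"}],
    # assumed kwarg: overrides request headers sent to Anthropic
    extra_headers={"anthropic-version": "2023-06-01"},
)
print(response.choices[0].message.content)
```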

5. fix(utils.py): add metadata to logging obj on setup, if exists (fixes max parallel request bug) by @krrishdholakia in #1531

6. test(tests/): add unit testing for proxy server endpoints by @krrishdholakia in f5ced08

New Contributors

Full Changelog: v1.18.5...v1.18.6
