github BerriAI/litellm v1.21.5


What's Changed

⭐️ [Feat] Show correct provider in exceptions - for Mistral API, PerplexityAPI, Anyscale, XInference by @ishaan-jaff in #1765, #1776

(Thanks @dhruv-anand-aintech for the issue/help)
Exceptions for Mistral API, PerplexityAPI, Anyscale, and XInference now show the correct provider name. Previously, a missing Perplexity key would surface as `OPENAI_API_KEY is missing` when using PerplexityAI.

```
exception:  PerplexityException - Traceback (most recent call last):
  File "/Users/ishaanjaffer/Github/litellm/litellm/llms/perplexity.py", line 349, in completion
    raise e
  File "/Users/ishaanjaffer/Github/litellm/litellm/llms/perplexity.py", line 292, in completion
    perplexity_client = perplexity(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/perplexity/_client.py", line 98, in __init__
    raise perplexityError(
perplexity.perplexityError: The api_key client option must be set either by passing api_key to the client or by setting the PERPLEXITY_API_KEY environment variable
```
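
The idea behind the fix can be sketched in a few lines. This is an illustrative sketch, not litellm's actual implementation: the `provider_from_model` helper and `ProviderAuthError` class are hypothetical names, showing how a model string's provider prefix can drive the provider name surfaced in an auth exception instead of always reporting OpenAI.

```python
# Hypothetical sketch of provider-aware exception naming (not litellm's
# real code): derive the provider from the model string's prefix, then
# surface that provider in the error instead of defaulting to OpenAI.

def provider_from_model(model: str) -> str:
    """Return the provider prefix of a model string, else 'openai'."""
    known = {"mistral", "perplexity", "anyscale", "xinference"}
    prefix, _, _ = model.partition("/")
    return prefix if prefix in known else "openai"

class ProviderAuthError(Exception):
    """Auth error that names the actual provider, per this release."""
    def __init__(self, provider: str):
        self.llm_provider = provider
        super().__init__(
            f"{provider.capitalize()}Exception - missing API key for {provider}"
        )

# A missing key for a Perplexity model now names Perplexity, not OpenAI:
err = ProviderAuthError(provider_from_model("perplexity/pplx-7b-online"))
```
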

Full Changelog: v1.21.4...v1.21.5
