What's Changed
- [FIX] Bug where extra tokens were created in the litellm verification token table by @ishaan-jaff in #2150
- Support for Athina logging by @vivek-athina in #2163 (see the logging sketch after this list)
- [FEAT] Support extra headers - OpenAI / Azure by @ishaan-jaff in #2164 (see the extra-headers sketch after this list)
- [FEAT] Support Groq AI by @ishaan-jaff in #2168
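A minimal sketch of enabling the Athina logging integration through litellm's callback hooks. The callback name `"athina"` and the `ATHINA_API_KEY` environment variable are assumptions based on litellm's usual success-callback pattern, not details taken from these notes.

```python
import os

import litellm
from litellm import completion

os.environ["ATHINA_API_KEY"] = ""  # set your Athina API key (assumed variable name)

# Assumed: Athina is registered as a litellm success callback named "athina",
# so successful completions are logged to Athina automatically.
litellm.success_callback = ["athina"]

response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello from litellm"}],
)
print(response)
```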
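A minimal sketch of the new extra-headers support. It assumes `completion()` accepts an `extra_headers` dict that is forwarded as raw HTTP headers on the OpenAI / Azure request; the header name shown is a placeholder, not part of the release notes.

```python
import os

from litellm import completion

os.environ["OPENAI_API_KEY"] = ""  # set your OpenAI API key

# Assumed: extra_headers is passed through to the underlying HTTP request.
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello from litellm"}],
    extra_headers={"X-Custom-Header": "my-value"},  # placeholder header
)
print(response)
```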
Sample Usage
```python
import os

from litellm import completion

os.environ["GROQ_API_KEY"] = ""  # set your Groq API key before running

response = completion(
    model="groq/llama2-70b-4096",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```
New Contributors
- @vivek-athina made their first contribution in #2163
Full Changelog: v1.26.10...v1.26.13