[BETA] Thrilled to launch support for Cohere/Command-R on LiteLLM and the LiteLLM Proxy Server 👉 Start here: https://docs.litellm.ai/docs/providers/cohere
☎️ PR adding support for Cohere tool calling in the OpenAI format: #2479
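A minimal sketch of calling Command-R through the LiteLLM SDK, including tool calling in the OpenAI format. The model string, tool schema, and key values below are illustrative placeholders; check the linked Cohere provider docs for the exact names your version supports.

```python
import os
from litellm import completion

# assumes a valid Cohere API key is available (placeholder value)
os.environ["COHERE_API_KEY"] = "your-cohere-api-key"

# basic chat completion against Command-R
response = completion(
    model="command-r",
    messages=[{"role": "user", "content": "What is LiteLLM?"}],
)
print(response.choices[0].message.content)

# tool calling using an OpenAI-format tool definition (illustrative schema)
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"},
                },
                "required": ["location"],
            },
        },
    }
]
response = completion(
    model="command-r",
    messages=[{"role": "user", "content": "What's the weather in Boston?"}],
    tools=tools,
)
print(response.choices[0].message.tool_calls)
```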
⚡️ LiteLLM Proxy + @langfuse - High Traffic - supports 80+ requests per second with the Proxy + Langfuse logging https://docs.litellm.ai/docs/proxy/logging
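For reference, here is a short SDK-side sketch of Langfuse logging via LiteLLM's success callback; the Proxy enables the same logging through its config file as described in the linked docs. The Langfuse keys and the model used are placeholders.

```python
import os
import litellm
from litellm import completion

# placeholder Langfuse project credentials - replace with your own
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."

# log every successful completion call to Langfuse
litellm.success_callback = ["langfuse"]

# any supported model works here; it needs its own provider API key set
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
)
```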
⚡️ New Models - Azure GPT-Instruct models https://docs.litellm.ai/docs/providers/azure#azure-instruct-models
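A sketch of calling an Azure GPT-Instruct deployment through LiteLLM; the azure_text/ prefix, API version, and endpoint values here are assumptions, so confirm the exact format against the linked Azure docs.

```python
import os
from litellm import completion

# placeholder Azure OpenAI credentials and endpoint
os.environ["AZURE_API_KEY"] = "your-azure-api-key"
os.environ["AZURE_API_BASE"] = "https://your-resource.openai.azure.com"
os.environ["AZURE_API_VERSION"] = "2023-07-01-preview"

# assumed "azure_text/" prefix routes to the Azure completions (instruct) endpoint;
# replace <your-deployment-name> with your gpt-35-turbo-instruct deployment name
response = completion(
    model="azure_text/<your-deployment-name>",
    messages=[{"role": "user", "content": "Write a haiku about APIs."}],
)
print(response.choices[0].message.content)
```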
🛠️ Fix for using DynamoDB + LiteLLM Virtual Keys
What's Changed
- (feat) support azure/gpt-instruct models by @ishaan-jaff in #2471
- [New-Model] Cohere/command-r by @ishaan-jaff in #2474
- (fix) patch dynamoDB team_model_alias bug by @ishaan-jaff in #2478
- fix(azure.py): support cost tracking for azure/dall-e-3 by @krrishdholakia in #2475
- fix(openai.py): return model name with custom llm provider for openai-compatible endpoints (e.g. mistral, together ai, etc.) by @krrishdholakia in #2473
Full Changelog: v1.30.2...v1.31.4