What's Changed
- Allow end-users to opt out of LLM API calls by @krrishdholakia in #2174
- [Docs] OpenRouter - clarify we support all models by @ishaan-jaff in #2186
- (docs) Using OpenAI-compatible endpoints by @ishaan-jaff in #2189
- [Fix] Fix health check when API base is set for OpenAI-compatible models by @ishaan-jaff in #2188
- fix(proxy_server.py): allow user to set team tpm/rpm limits/budget/models by @krrishdholakia in #2183
Full Changelog: v1.27.1...v1.27.4