Core Release v1.5.3
- feat: claude-opus-4-7 compatibility with adaptive thinking, task-budgets beta header, display parameter handling, and xhigh effort mapping
- feat: add Anthropic structured output and response_format support across chat completions and Responses API (thanks @emirhanmutlu-natuvion!)
- feat: preserve MCP tool annotations in bidirectional conversion between MCP tools and Bifrost chat tools
- feat: expand Anthropic chat schema and Responses converters to surface server-side tools (web search, code execution, computer use containers)
- feat: add OCR request type support with stream terminal detection and full body accumulation for passthrough streams
- feat: add user agent detection for multiple user agents and fix duplicated tool calls
- feat: virtual key provider and MCP configs are now deny-by-default; empty configs block all access
- fix: make provider config weight optional; null weight excludes provider from weighted routing
- fix: use separate streaming clients per provider to prevent read timeouts
- fix: concurrent map access in plugin timer
- fix: extra params passthrough for vLLM providers
- fix: remove redundant static header assignment in tool execution
- fix: add OCR request pricing support
- fix: usage of per-user OAuth servers in codemode
- fix: add validation for direct API keys
- fix: OpenAI provider - flatten array-form tool_result output for Responses API (thanks @martingiguere!)
- fix: Gemini provider - handle content block tool outputs in Responses API path (thanks @tom-diacono!)
- fix: case-insensitive anthropic-beta merge in MergeBetaHeaders
- fix: Bedrock provider - emit message_stop event for Anthropic invoke stream (thanks @tefimov!)
- fix: Bedrock provider - preserve image content in tool results for Converse API (thanks @Edward-Upton!)
- fix: Gemini provider - preserve thinkingLevel parameters during round-trip and correct finish reason mapping
- fix: WebSearch tool argument handling for all clients by removing Claude Code user agent restriction
- fix: capture Responses API streaming errors
- fix: delete fallbacks from outgoing Anthropic requests
- fix: token usage for vLLM streaming (treat delta.content == "" the same as nil)
- fix: provider queue shutdown panic - avoid send on closed channel by leaving channels open and exiting via a done signal
Installation
go get github.com/maximhq/bifrost/core@v1.5.3

This release was automatically created from version file: core/version