Published 22 December 2025
Major Features
- ACP Integration: Introduce initial ACP (Agent Communication Protocol) integration to create ACP-compatible agents in Koog (#1253)
- Planner Agent Type: Introduce new "planner" agent type with iterative planning capabilities. Provide two out-of-the-box strategies: a simple LLM planner and GOAP (Goal-Oriented Action Planning); a framework-agnostic sketch of the GOAP idea follows this list (#1232)
- Response Processor: Introduce `ResponseProcessor` to fix tool call messages from weak models that fail to properly generate tool calls (KG-212, #871)
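For readers unfamiliar with GOAP, the snippet below is a minimal, framework-agnostic illustration of the idea: search for a sequence of actions whose preconditions and effects connect the current state to the goal. It is not Koog's planner API; the `Action` and `plan` names and the string-fact world model are purely illustrative.

```kotlin
// Framework-agnostic GOAP illustration (not Koog's planner API; all names are hypothetical).
// A plan is a sequence of actions whose preconditions/effects lead from the start
// state to a state that satisfies the goal.
data class Action(
    val name: String,
    val preconditions: Set<String>, // facts that must hold before the action can run
    val effects: Set<String>,       // facts that hold after the action has run
)

fun plan(start: Set<String>, goal: Set<String>, actions: List<Action>): List<Action>? {
    // Breadth-first search over world states; fine for small illustrative domains.
    val queue = ArrayDeque(listOf(start to emptyList<Action>()))
    val visited = mutableSetOf(start)
    while (queue.isNotEmpty()) {
        val (state, path) = queue.removeFirst()
        if (state.containsAll(goal)) return path
        for (action in actions) {
            if (!state.containsAll(action.preconditions)) continue
            val next = state + action.effects
            if (visited.add(next)) queue.addLast(next to path + action)
        }
    }
    return null // goal unreachable with the given actions
}

fun main() {
    val actions = listOf(
        Action("fetchDocs", preconditions = setOf("queryKnown"), effects = setOf("docsLoaded")),
        Action("summarize", preconditions = setOf("docsLoaded"), effects = setOf("summaryReady")),
    )
    val steps = plan(start = setOf("queryKnown"), goal = setOf("summaryReady"), actions = actions)
    println(steps?.map { it.name }) // [fetchDocs, summarize]
}
```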
Improvements
- Event ID Propagation: Integrate event ID and execution info propagation across all pipeline events, agent execution flow, and features including Debugger and Tracing (KG-178)
- Bedrock Enhancements:
  - Add fallback model support and a warning mechanism for unsupported Bedrock models with custom families; a framework-agnostic sketch of this behavior follows this list (KG-595, #1224)
  - Add global inference profile prefix support to Bedrock models for improved availability and latency (#1139)
  - Add Bedrock support in the Ktor integration for configuring and initializing Bedrock LLM clients (#1141)
  - Improve the Bedrock moderation implementation with conditional guardrails API calls (#1105)
- Ollama: Add support for file attachments in Ollama client (#1221)
- Tool Schema: Add extension point for custom tool schemas to allow clients to provide custom schemas or modify existing ones (#1158)
- Google Client:
- HTTP Client: Make `KoogHttpClient` auto-closable and add a `clientName` parameter (#1184)
- Update MCP SDK version to 0.7.7 (#1154)
- Use SEQUENTIAL mode as the default for `singleRunStrategy` (#1195)
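To make the Bedrock fallback behavior above concrete, here is a framework-agnostic sketch of the resolve-or-fall-back decision with a warning. The types, names, and family-prefix heuristic are illustrative assumptions, not Koog's Bedrock client API.

```kotlin
// Illustration of "warn and fall back for unsupported model families"
// (hypothetical types; not Koog's Bedrock client).
data class ModelChoice(val requested: String, val effective: String, val warning: String?)

fun resolveModel(requested: String, supportedFamilies: Set<String>, fallback: String): ModelChoice {
    val family = requested.substringBefore('.') // e.g. "anthropic" from "anthropic.claude-3-5-sonnet"
    return if (family in supportedFamilies) {
        ModelChoice(requested, requested, warning = null)
    } else {
        ModelChoice(
            requested,
            fallback,
            warning = "Model family '$family' is not supported; falling back to '$fallback'.",
        )
    }
}

fun main() {
    val choice = resolveModel(
        requested = "custom.my-fine-tune-v1",
        supportedFamilies = setOf("anthropic", "amazon", "meta"),
        fallback = "anthropic.claude-3-5-sonnet",
    )
    choice.warning?.let { println(it) }
    println("Using model: ${choice.effective}")
}
```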
Bug Fixes
- Streaming: Fix streaming with tool calls for the Google and OpenRouter clients: the Google client now passes the tools parameter, the OpenRouter client uses the CIO engine for SSE, and SSE error handling is improved (KG-616, #1262)
- Tool Calling: Fix `requestLLMOnlyCallingTools` ignoring tool calls after reasoning messages from models with Chain of Thought (KG-545, #1198)
- File Tools:
- Model-Specific Fixes:
  - Pass `jsonObject` as `responseFormat` for DeepSeek to fix JSON mode (KG-537, #1258)
  - Remove `LLMCapability.Temperature` from GPT-5 model capabilities (#1277)
  - Fix OpenAI streaming with tools in the Responses API (KG-584, #1255)
  - Fix Bedrock timeout setting propagation to `BedrockRuntimeClient.HttpClient` (#1190)
  - Add handler for `GooglePart.InlineData` to support binary content responses (KG-487, #1094)
  - Pass
- Other Fixes:
  - Fix reasoning message handling in provided simple strategies (#1166)
  - Fix empty list condition check in `onMultipleToolResults` and `onMultipleAssistantMessages` (#1192)
  - Fix timeout not respected in the executor because `join()` was called before the timeout check; a generic coroutine illustration follows this list (#1005)
  - Fix `ContentPartsBuilder` to flush whenever `textBuilder` is not empty (KG-504, #1123)
  - Fix and simplify `McpTool` to properly support updated Tool serialization (#1128)
  - Fix `OpenAIConfig.moderationsPath` to be mutable (`var` instead of `val`) (#1097)
  - Finalize pipeline feature processors after the agent run for `StatefulSingleUseAIAgent` (KG-576)
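The executor timeout fix (#1005) is easiest to see with a generic coroutines example: if `join()` completes (or blocks) before the timeout scope is entered, the timeout never bounds the wait. The snippet below uses plain kotlinx.coroutines and is not Koog's executor code.

```kotlin
import kotlinx.coroutines.*

fun main() = runBlocking {
    val job = launch { delay(5_000) } // simulated long-running work

    // Broken ordering (the bug class): joining first means the later timeout has nothing to bound.
    // job.join()
    // withTimeout(1_000) { /* too late, the wait already happened */ }

    // Correct ordering: the timeout bounds the join itself.
    try {
        withTimeout(1_000) { job.join() }
    } catch (e: TimeoutCancellationException) {
        println("Timed out waiting for the job: ${e.message}")
        job.cancel() // stop the still-running work
    }
}
```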
Breaking Changes
- Persistence: Remove requirement for unique graph node names in Persistence feature, migrate to node path usage (#1288)
- Tool API: Update the Tool API to fix the name and descriptor discrepancy: configurable tool properties moved to constructors, and `doExecute` was removed in favor of `execute`; a toy sketch of the new shape follows this list (KG-508, #1226)
- OpenAI Models: GPT-5-Codex and GPT-5.1 reasoning models moved from the Chat section to the Reasoning section (KG-562, #1146)
- Structured Output: Rename structured output classes: `StructuredOutput` → `StructuredRequest`, `StructuredData` → `Structure`, `JsonStructuredData` → `JsonStructure` (#1107)
- Module Organization: Move `LLMChoice` from the `prompt-llm` module to the `prompt-executor-model` module (#1109)
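As a rough guide to the Tool API change, the sketch below shows the shape of the migration using toy types: configurable properties move into the constructor and `execute` replaces `doExecute`. These are not Koog's actual classes, and the real signatures (argument types, serializers, descriptors) will differ.

```kotlin
// Toy types only; not Koog's Tool API. Shown to illustrate the direction of the change.

// Before: configuration lived in overridable properties and execution in doExecute.
// abstract class OldStyleTool {
//     abstract val name: String
//     abstract val description: String
//     abstract suspend fun doExecute(args: String): String
// }

// After: configurable properties move to the constructor, and execute replaces doExecute.
abstract class NewStyleTool(val name: String, val description: String) {
    abstract suspend fun execute(args: String): String
}

class EchoTool : NewStyleTool(name = "echo", description = "Returns its input unchanged") {
    override suspend fun execute(args: String): String = args
}

suspend fun main() {
    println(EchoTool().execute("hello")) // hello
}
```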