Published 27 Aug 2025
Major Features
- Integration with Observability Tools:
- Ktor Integration: First-class Ktor support via the "Koog" Ktor plugin to register and run agents in Ktor applications (#422).
- iOS Target Support: Multiplatform expanded with native iOS targets, enabling agents to run on Apple platforms (#512).
- Upgraded Structured Output: Refactored the structured output API to be more flexible and added built-in/native provider support for OpenAI and Google, reducing prompt boilerplate and improving validation (#443).
- GPT-5 and Custom LLM Parameters Support: GPT-5 is now available, together with custom additional LLM parameters for OpenAI-compatible clients (#631, #517); see the sketch after this list.
- Resilience and Retries:
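As a rough illustration of the new model support, a minimal agent wired to GPT-5 could look like the sketch below. This is a sketch, not a verified snippet: the `OpenAIModels.Chat.GPT5` constant, the constructor parameter names, and the import paths are assumptions based on the project's README-style examples, so check them against this release.

```kotlin
import ai.koog.agents.core.agent.AIAgent
import ai.koog.prompt.executor.clients.openai.OpenAIModels
import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor

suspend fun main() {
    // Assumption: the OpenAIModels catalog exposes a GPT-5 constant in this release;
    // substitute whatever identifier your Koog version actually ships.
    val agent = AIAgent(
        executor = simpleOpenAIExecutor(System.getenv("OPENAI_API_KEY")),
        systemPrompt = "You are a concise assistant.",
        llmModel = OpenAIModels.Chat.GPT5
    )
    println(agent.run("Summarize this release in one sentence."))
}
```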
Improvements
- OpenTelemetry and Observability:
- Finish reason and unified attributes for inference/tool/message spans and events; extract event body fields to attributes for better querying (KG-218).
- Mask sensitive data in events/attributes and introduce a “hidden-by-default” string type to keep secrets safe in logs (KG-259).
- Include all messages into the inference span and add an index for ChoiceEvent to simplify analysis (KG-172).
- Add tool arguments to `gen_ai.choice` and `gen_ai.assistant.message` events (#462).
- Allow setting a custom OpenTelemetry SDK instance in Koog (KG-169); an agent-level OpenTelemetry configuration sketch follows this list.
- LLM and Providers:
- Support Google’s “thinking” mode in generation config to improve reasoning quality (#414).
- Add Responses API support for OpenAI (#645).
- AWS Bedrock: support Inference Profiles for simpler, consistent configuration (#506) and accept `AWS_SESSION_TOKEN` (#456).
- Add `maxTokens` as a prompt parameter for finer control over generation length (#579); a prompt sketch follows this list.
- Add `contextLength` and `maxOutputTokens` to `LLModel` (#438, KG-134).
- Agent Engine:
- File Tools and RAG:
- Reworked FileSystemProvider with API cleanups and better ergonomics; moved blocking/suspendable operations to `Dispatchers.IO` for improved performance and responsiveness (#557).
- Introduce `filterByRoot` helpers and allow custom path filters in `FilteredFileSystemProvider` for safer agent sandboxes (#494, #508).
- Rename `PathFilter` to `TraversalFilter` and make its methods suspendable to support async checks.
- Rename `fromAbsoluteString` to `fromAbsolutePathString` for clarity (#567).
- Add `ReadFileTool` for reading local file contents where appropriate (#628).
- Update the kotlin-mcp dependency to v0.6.0 (#523).
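To give a sense of where these span and event changes surface, a minimal agent with Koog's OpenTelemetry feature installed might look like the sketch below. The import paths, the feature-install DSL, and `addSpanExporter` follow the project's documented pattern but are not verified against this release, and the new hook for a custom SDK instance (KG-169) is only mentioned in a comment because its exact name isn't stated in these notes.

```kotlin
import ai.koog.agents.core.agent.AIAgent
import ai.koog.agents.features.opentelemetry.feature.OpenTelemetry
import ai.koog.prompt.executor.clients.openai.OpenAIModels
import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor
import io.opentelemetry.exporter.logging.LoggingSpanExporter

suspend fun main() {
    val agent = AIAgent(
        executor = simpleOpenAIExecutor(System.getenv("OPENAI_API_KEY")),
        systemPrompt = "You are a helpful assistant.",
        llmModel = OpenAIModels.Chat.GPT4o
    ) {
        // Install the OpenTelemetry feature so inference/tool/message spans are emitted.
        install(OpenTelemetry) {
            // Export spans to the console for local inspection; production setups would
            // register an OTLP exporter or, per KG-169, supply a custom SDK instance.
            addSpanExporter(LoggingSpanExporter.create())
        }
    }
    println(agent.run("Hello!"))
}
```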
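Similarly, the new generation-length controls would plausibly be set through the prompt DSL, as sketched below. `LLMParams` and the `prompt { ... }` builder exist in Koog, but the import paths and the exact `maxTokens` parameter name in this release are assumptions.

```kotlin
import ai.koog.prompt.dsl.prompt
import ai.koog.prompt.params.LLMParams

// Sketch: cap completion length with the new maxTokens prompt parameter (#579).
// Assumption: LLMParams accepts maxTokens alongside temperature; verify against this release.
val releaseSummaryPrompt = prompt(
    "release-summary",
    LLMParams(temperature = 0.2, maxTokens = 256)
) {
    system("You summarize changelogs for busy developers.")
    user("Summarize the highlights of this release in two sentences.")
}
```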
Bug Fixes
- Make the `parts` field nullable in Google responses to handle missing content from Gemini models (#652).
- Fix enum parsing in MCP when the type is not specified (#601, KG-49).
- Fix function calling for `gemini-2.5-flash` models to correctly route tool invocations (#586).
- Restore OpenAI `responseFormat` option support in requests (#643).
- Correct the `o4-mini` vs `gpt-4o-mini` model mix-up in configuration (#573).
- Ensure the event body for function calls is valid JSON for telemetry ingestion (KG-268).
- Fix duplicated tool name resolution in `AIAgentSubgraphExt` to prevent conflicts (#493).
- Fix Azure OpenAI client settings to generate valid endpoint URLs (#478).
- Restore `llama3.2:latest` as the default for LLAMA_3_2 to match provider expectations (#522).
- Update missing `Document` capabilities for `LLModel` (#543).
- Fix Anthropic JSON schema validation error (#457).
Removals / Breaking Changes
- Remove Google Gemini 1.5 Flash/Pro variants from the catalog (KG-216, #574).
- Drop `execute` extensions for `PromptExecutor` in favor of the unified API (#591); a migration sketch follows these notes.
- File system API cleanup: removed deprecated FSProvider interfaces and methods; `PathFilter` renamed to `TraversalFilter` with suspendable operations; `fromAbsoluteString` renamed to `fromAbsolutePathString`.
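For code that used the removed convenience extensions, a direct call to the unified API would look roughly like the sketch below. The exact `execute` signature (prompt, target model, tool list) and the response type's `content` property are assumptions to verify against this release.

```kotlin
import ai.koog.prompt.dsl.prompt
import ai.koog.prompt.executor.clients.openai.OpenAIModels
import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor

suspend fun main() {
    val executor = simpleOpenAIExecutor(System.getenv("OPENAI_API_KEY"))
    val greeting = prompt("greeting") {
        system("You are terse.")
        user("Say hello.")
    }
    // Assumption: the unified entry point takes the prompt, the target model, and the tool list;
    // the removed extensions were thin wrappers that filled in some of these arguments.
    val responses = executor.execute(greeting, OpenAIModels.Chat.GPT4o, emptyList())
    responses.forEach { println(it.content) }
}
```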