1.15.0 (2025-09-24)
Features
- [Core]
  - [Context Caching]
    - Support context caching (c66245a)
      Supports explicit context caching with automatic creation and
      lifecycle management.
      Usage: `App(root_agent=..., plugins=..., context_cache_config=...)`
      (see the first sketch below the feature list)
  - Support non-text content in static instruction (61213ce)
  - Support static instructions (9be9cc2)
    A static instruction never changes, so it is placed at the beginning of
    the instruction and supports `inline_data` and `file_data` as contents.
    The dynamic instruction moves to the end of the LlmRequest, increasing
    the prefix-caching match size.
    Usage: `LlmAgent(model=..., static_instruction=types.Content(parts=...), ...)`
    (see the second sketch below the feature list)
- [Services]
  - Add endpoint to generate memory from session (2595824)
- [Samples]
  - Make the bigquery sample agent run with ADC out-of-the-box (10cf377)
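
First sketch, for the context caching feature: `App` and its `context_cache_config` parameter come from the entry above, while the `ContextCacheConfig` import path and its field names are assumptions and may differ by version.

```python
# Sketch only: ContextCacheConfig field names below are assumptions,
# not a confirmed API.
from google.adk.agents import LlmAgent
from google.adk.agents.context_cache_config import ContextCacheConfig
from google.adk.apps import App

root_agent = LlmAgent(
    name="assistant",
    model="gemini-2.0-flash",
    instruction="Answer the user's questions concisely.",
)

# Opt the whole app into managed context caching; ADK then creates and
# refreshes the cache automatically over its configured lifetime.
app = App(
    name="cached_app",
    root_agent=root_agent,
    context_cache_config=ContextCacheConfig(
        min_tokens=2048,     # assumed field: skip caching for small requests
        ttl_seconds=600,     # assumed field: cache time-to-live
        cache_intervals=10,  # assumed field: invocations before refresh
    ),
)
```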
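
Second sketch, for static instructions, assuming the google-genai `types` API; the agent name, file URI, and state key are placeholders.

```python
from google.adk.agents import LlmAgent
from google.genai import types

agent = LlmAgent(
    name="support_agent",  # placeholder
    model="gemini-2.0-flash",
    # Fixed content, placed at the very start of the model request so it
    # can be prefix-cached; parts may be text, inline_data, or file_data.
    static_instruction=types.Content(
        parts=[
            types.Part(text="You are a support agent for Example Corp."),
            types.Part.from_uri(
                file_uri="gs://example-bucket/policy.pdf",  # placeholder
                mime_type="application/pdf",
            ),
        ]
    ),
    # Dynamic portion; appended at the end of the LlmRequest so a changing
    # value does not invalidate the cached prefix.
    instruction="The current user is {user_name}.",
)
```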
Bug Fixes
- Close runners after running eval (86ee6e3)
- Filter out thought parts when saving agent output to state (632bf8b)
- Ignore empty function chunk in LiteLlm streaming response (8a92fd1)
- Introduce a `raw_mcp_tool` method in `McpTool` to provide direct access to the underlying MCP tool (6158075)
- Make a copy of `columns` instead of modifying it in place (aef1ee9)
- Prevent escaping of Latin characters in LLM response (c9ea80a)
- Retain the consumers and transport registry when recreating the ClientFactory in remote_a2a_agent.py (6bd33e1)
- Remove unsupported 'type': 'unknown' in test_common.py for fastapi 0.117.1 (3745221)
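
A short sketch of the new `raw_mcp_tool` accessor from 6158075: the `MCPToolset` import path and the async `get_tools()` call reflect the ADK toolset interface but are assumptions here, and configuring the toolset itself is elided.

```python
# Sketch only: assumes an already-configured MCPToolset instance.
from google.adk.tools.mcp_tool import MCPToolset


async def dump_raw_tools(toolset: MCPToolset) -> None:
    # Each entry from get_tools() is an McpTool wrapper; raw_mcp_tool()
    # returns the underlying MCP tool definition as the server advertised
    # it, including its name and input schema.
    for tool in await toolset.get_tools():
        raw = tool.raw_mcp_tool()
        print(raw.name, raw.inputSchema)
```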
Documentation
- Correct the documentation of `after_agent_callback` (b9735b2)