## Key Changes
### Any-LLM extension
Starting with this version, the extensions module also includes an any-llm adapter. See the updated documentation and examples at https://github.com/openai/openai-agents-python/tree/main/examples/model_providers
## What's Changed
- feat: add any-llm model support with responses-compatible routing by @seratch in #2706
- fix: preserve static MCP meta in converted function tools by @seratch in #2769
- fix: #2760 wait for realtime response.done before follow-up response.create by @seratch in #2763
- fix: handle cancelled single function tools as tool failures by @elainegan-openai in #2762
- fix: optionize initialized notification tolerance by @elainegan-openai in #2765
- fix: remove duplicate CompactionItem from RunItem union by @KanchiShimono in #2761
## Documentation & Other Changes
- docs: add 0.13 changelog by @seratch in #2744
- docs: update translated document pages by @github-actions[bot] in #2759
- fix: harden example auto-runs against PATH and port conflicts by @seratch in #2770
- Release 0.13.1 by @github-actions[bot] in #2768
**Full Changelog**: v0.13.0...v0.13.1