Changes since langchain-openai==0.3.8
Added support for the OpenAI Responses API.

Specify use of the Responses API as an init param:
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o-mini",
    use_responses_api=True,
)
```
`ChatOpenAI` will also automatically route through the Responses API if a feature specific to that API is used:
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")
llm.invoke(
    "What was a positive news story from today?",
    tools=[{"type": "web_search_preview"}],
)
```
Details:

- openai[patch]: release 0.3.9 (#30325)
- openai[patch]: support additional Responses API features (#30322)
- openai[patch]: support structured output via Responses API (#30265) (example below)
- openai[patch]: support Responses API (#30231)
- standard-tests, openai: bump core (#30202)
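
For the structured output support noted in #30265, a minimal sketch of combining the existing `with_structured_output` helper with the Responses API. The `Movie` schema and prompt are illustrative only and are not part of the release.

```python
from pydantic import BaseModel

from langchain_openai import ChatOpenAI


class Movie(BaseModel):
    """Illustrative schema; not from the release notes."""

    title: str
    year: int


# use_responses_api=True routes requests through the Responses API (added in this release);
# with_structured_output is the existing structured-output helper on ChatOpenAI.
llm = ChatOpenAI(model="gpt-4o-mini", use_responses_api=True)
structured_llm = llm.with_structured_output(Movie)
structured_llm.invoke("Name a well-known science fiction movie and its release year.")
```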