langchain-ai/langchain: langchain-openai==1.0.0a2


When interacting with the Responses API, langchain-openai now defaults to storing response items in message content. Previously, this behavior was opt-in, enabled by specifying output_version="responses/v1" when instantiating ChatOpenAI. The change resolves BadRequestErrors that can arise in some multi-turn contexts.
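
For illustration, a minimal sketch of the change in defaults (the model name is a placeholder, as in the snippet below; assume the Responses API is otherwise enabled as usual):

from langchain_openai import ChatOpenAI

# Previously, keeping Responses API items in message content required opting in:
llm = ChatOpenAI(model="...", output_version="responses/v1")

# As of this release, that is the default, so no output_version argument is needed:
llm = ChatOpenAI(model="...")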

To restore previous behavior, set the LC_OUTPUT_VERSION environment variable to v0, or specify output_version="v0" when instantiating ChatOpenAI:

import os

os.environ["LC_OUTPUT_VERSION"] = "v0"

# or

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="...", output_version="v0")
