langchain-ai/langchain: langchain-openai==0.3.0


langchain-openai==0.3 introduces two breaking changes:

Structured output

The default value of the method parameter for ChatOpenAI(...).with_structured_output(method=<method>) has changed from method="function_calling" to method="json_schema".

For schemas specified via TypedDict or JSON schema, strict schema validation is disabled by default but can be enabled by specifying strict=True.

Note: conceptually there is a difference between forcing a tool call and forcing a response format. Tool calls may produce more concise arguments than content generated to adhere to a response-format schema. Prompts may need to be adjusted to recover the desired behavior.
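For illustration, a minimal sketch of the new default (the model name and the Joke schema are placeholders; only the with_structured_output arguments matter):

  from typing import TypedDict

  from langchain_openai import ChatOpenAI

  class Joke(TypedDict):
      setup: str
      punchline: str

  llm = ChatOpenAI(model="gpt-4o-mini")

  # As of 0.3 this defaults to method="json_schema". For TypedDict and
  # JSON-schema inputs, strict validation stays off unless strict=True is passed.
  structured_llm = llm.with_structured_output(Joke, strict=True)
  structured_llm.invoke("Tell me a joke about parrots.")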

How to retain the 0.2 with_structured_output behavior after upgrading to 0.3

To restore the previous behavior, pass method="function_calling" to the with_structured_output calls you want to revert.
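For example (a minimal sketch; the Person schema and model name are illustrative):

  from pydantic import BaseModel

  from langchain_openai import ChatOpenAI

  class Person(BaseModel):
      name: str
      age: int

  llm = ChatOpenAI(model="gpt-4o-mini")

  # Opt back into the 0.2 behavior for this call.
  structured_llm = llm.with_structured_output(Person, method="function_calling")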

Expected errors

  1. Models that don’t support method="json_schema" (e.g., gpt-4 and gpt-3.5-turbo, currently the default model for ChatOpenAI) will raise an error unless method is explicitly specified. To recover the previous default, pass method="function_calling" into with_structured_output.

  2. Schemas specified via a Pydantic BaseModel with fields that have non-null defaults or metadata (such as min/max constraints) will raise an error, as sketched below. To recover the previous default, pass method="function_calling" into with_structured_output. See OpenAI's docs for supported schemas.
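A minimal sketch of the second case (the Rating schema is hypothetical; the defaulted, constrained field is what triggers the error):

  from pydantic import BaseModel, Field

  from langchain_openai import ChatOpenAI

  class Rating(BaseModel):
      # Non-null default plus min/max constraints: not accepted by OpenAI's
      # strict json_schema mode, so the 0.3 default raises for this schema.
      score: int = Field(default=3, ge=1, le=5)

  llm = ChatOpenAI(model="gpt-4o-mini")

  # llm.with_structured_output(Rating)  # raises under method="json_schema"

  # Recovers the 0.2 behavior:
  structured_llm = llm.with_structured_output(Rating, method="function_calling")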

Optional parameters

We no longer set non-null defaults for temperature, max_retries, and n, which are optional fields. In particular, we no longer specify a default temperature of 0.7.

The previous defaults can be restored by specifying them explicitly:

  • temperature=0.7
  • max_retries=2
  • n=1
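To pin the old defaults explicitly (a minimal sketch; the model name is illustrative):

  from langchain_openai import ChatOpenAI

  # Without these arguments, 0.3 leaves the parameters unset and the
  # OpenAI API's own defaults apply.
  llm = ChatOpenAI(
      model="gpt-4o-mini",
      temperature=0.7,
      max_retries=2,
      n=1,
  )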
