
Release langchain-openai==0.1.9

Changes since langchain-openai==0.1.8

  • get_num_tokens_from_messages now estimates token consumption for images, following the counting rules described in OpenAI's documentation (see the sketch after this list).
  • Token usage information can now be toggled in streaming mode using the stream_usage parameter.
  • Invocation and streaming responses now include model version metadata, and the system fingerprint is now included in streaming responses as well.
  • The option to disable parallel tool calls (parallel_tool_calls) is now documented in the API reference.
  • Misc improvements and bug fixes.
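
A minimal sketch of how these additions might be used. The model name "gpt-4o", the image URL, and the text prompts are illustrative assumptions, not taken from this release, and an OpenAI API key is assumed to be available in the environment:

```python
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

# stream_usage=True opts streamed responses into token usage reporting.
llm = ChatOpenAI(model="gpt-4o", stream_usage=True)  # model name is an assumption

# Image-aware token estimation: image_url content blocks are now counted
# per OpenAI's published rules (the image may need to be inspected to
# determine its dimensions).
messages = [
    HumanMessage(
        content=[
            {"type": "text", "text": "Describe this picture."},
            {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
        ]
    )
]
print(llm.get_num_tokens_from_messages(messages))

# Streaming: the final chunk carries usage_metadata when stream_usage is on;
# chunks also expose model version metadata and the system fingerprint via
# response_metadata.
for chunk in llm.stream("Hello!"):
    if chunk.usage_metadata:
        print(chunk.usage_metadata)
    if chunk.response_metadata:
        print(chunk.response_metadata.get("model_name"),
              chunk.response_metadata.get("system_fingerprint"))

# Disabling parallel tool calls: parallel_tool_calls is forwarded to the
# OpenAI API (only meaningful once tools are also bound).
llm_sequential = llm.bind(parallel_tool_calls=False)
```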

Commits:

openai: release 0.1.9 (#23263)
partners[minor]: Fix value error message for with_structured_output (#22877)
infra: add more formatter rules to openai (#23189)
openai[patch]: image token counting (#23147)
openai[patch], standard-tests[patch]: don't pass in falsey stop vals (#23153)
standard-tests[patch]: Update chat model standard tests (#22378)
openai[patch]: add stream_usage parameter (#22854)
[Partner]: Add metadata to stream response (#22716)
partners: fix numpy dep (#22858)
openai: add parallel_tool_calls to api ref (#22746)
openai[patch]: correct grammar in exception message in embeddings/base.py (#22629)
openai, azure: update model_name in ChatResult to use name from API response (#22569)
docs: update anthropic chat model (#22483)
openai: update ChatOpenAI api ref (#22324)
