## Patch Changes
- #10143 `62ba83e` Thanks @topliceanurazvan! - fix(openai): emit `handleLLMNewToken` callback for usage chunk in Completions API streaming

  The final usage chunk in `_streamResponseChunks` was only yielded via the async generator but did not call `runManager.handleLLMNewToken()`. This meant callback-based consumers (e.g. LangGraph's `StreamMessagesHandler`) never received the `usage_metadata` chunk. Added the missing `handleLLMNewToken` call to match the behavior of the main streaming loop.
- Updated dependencies [`10a876c`, `b46d96a`]:
  - @langchain/core@1.1.28
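For context, the yield-plus-callback pattern the fix restores can be sketched as follows. This is a minimal illustration, not the actual `@langchain/openai` source: `streamWithUsage`, the simplified `Chunk` shape, and the `RunManager` stand-in are all hypothetical.

```typescript
// Simplified stand-ins for the real LangChain types (hypothetical shapes).
type Chunk = { text: string; usage_metadata?: { total_tokens: number } };

interface RunManager {
  handleLLMNewToken(token: string): Promise<void>;
}

// Sketch of the streaming shape described in the changelog entry: each
// content chunk is both yielded to generator consumers AND reported to
// callback consumers via handleLLMNewToken. Before the fix, the final
// usage chunk was only yielded; the fix adds the callback call for it too.
async function* streamWithUsage(
  chunks: Chunk[],
  usageChunk: Chunk,
  runManager?: RunManager
): AsyncGenerator<Chunk> {
  for (const chunk of chunks) {
    yield chunk;
    // Main streaming loop: callbacks are notified per token.
    await runManager?.handleLLMNewToken(chunk.text);
  }
  // Final usage chunk: yield it AND notify callbacks (the added call),
  // so callback-based consumers also receive usage_metadata.
  yield usageChunk;
  await runManager?.handleLLMNewToken(usageChunk.text);
}
```

With only the `yield`, a consumer iterating the generator sees the usage chunk, but a consumer wired up purely through callbacks does not; mirroring every yield with a `handleLLMNewToken` call keeps the two consumption paths in sync.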