Release Notes
v2.11.2
Enhancement Notes
- Refactored the processing of streaming chunks from OpenAI to simplify logic.
- Added tests to verify the expected behavior when handling streaming chunks with include_usage=True (see the sketch below).
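
  As a rough illustration of the behavior these tests cover: when a stream is created with usage reporting enabled, the final chunk typically carries usage statistics and an empty choices list, so it needs to be handled separately from content chunks. The sketch below is an assumption for illustration (the helper name consume_stream and the simplified chunk shape are not the library's actual implementation).

  ```python
  # Minimal sketch (assumed helper, not the actual library code) of consuming
  # a chat-completion stream where usage reporting (include_usage=True) is enabled.

  def consume_stream(chunks):
      """Accumulate streamed text and capture the trailing usage chunk."""
      text_parts = []
      usage = None

      for chunk in chunks:
          # With include_usage=True the final chunk carries usage statistics
          # and an empty `choices` list, so it is handled separately.
          if not chunk.choices:
              usage = chunk.usage
              continue

          delta = chunk.choices[0].delta
          if delta.content:
              text_parts.append(delta.content)

      return "".join(text_parts), usage
  ```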
Bug Fixes
- Fixed an issue where MistralChatGenerator did not return a finish_reason when streaming. The fix adjusts how the finish_reason is extracted from streaming chunks: the last non-None finish_reason is now used, which accounts for differences between OpenAI and Mistral (see the sketch below).
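
  A minimal sketch of the "last non-None finish_reason" approach described above; the function name resolve_finish_reason and the chunk shape are assumptions for illustration, not the actual implementation.

  ```python
  # Minimal sketch of picking the finish_reason from streamed chunks.
  # OpenAI reports it on the final content chunk, while Mistral may report it
  # on an earlier chunk, so keeping the last non-None value covers both cases.

  def resolve_finish_reason(chunks):
      finish_reason = None
      for chunk in chunks:
          for choice in getattr(chunk, "choices", []) or []:
              if choice.finish_reason is not None:
                  finish_reason = choice.finish_reason
      return finish_reason
  ```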