machinewrapped/llm-subtrans v1.5.6
Improved error handling and reporting for OpenAI


Extended the OpenAI reasoning model client to handle the different error types returned by the API, so that explanatory, actionable error messages are reported when a translation request fails.
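
As a rough illustration of the kind of error mapping this describes (a minimal sketch assuming the openai>=1.x Python SDK; the helper name and exact messages are illustrative, not the project's actual code):

```python
# Minimal sketch, not llm-subtrans code: map OpenAI SDK exception types
# to explanatory, actionable messages for a failed translation request.
import openai

def describe_openai_error(error: Exception) -> str:
    """Return a user-facing explanation for a failed API request."""
    if isinstance(error, openai.AuthenticationError):
        return "Authentication failed - check that your OpenAI API key is valid."
    if isinstance(error, openai.RateLimitError):
        return "Rate limit or quota exceeded - wait and retry, or review your plan."
    if isinstance(error, openai.APIConnectionError):
        return "Could not reach the OpenAI API - check your network connection."
    if isinstance(error, openai.BadRequestError):
        return f"The API rejected the request: {error}"
    return f"Translation request failed: {error}"
```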

Additionally, streamed translations are now disabled by default for OpenAI, as streaming a reasoning model's response apparently requires a verified account. Thanks to Neoony for the effort put into diagnosing the issue!

Streaming remains the default for other providers (where supported), as this restriction seems to be unique to OpenAI.
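
As a hypothetical sketch of what this per-provider default looks like (the helper and provider strings are illustrative, not the project's actual configuration keys):

```python
# Hypothetical sketch: choose the default streaming behaviour per provider.
# The function name and provider strings are illustrative, not project code.
def default_streaming_enabled(provider: str) -> bool:
    # Streaming a reasoning model's response on OpenAI appears to require
    # a verified account, so streaming is off by default for that provider.
    if provider == "OpenAI":
        return False
    # Other providers that support streaming keep it enabled by default.
    return True
```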
