github yazinsai/OpenOats v1.60.1
OpenOats 1.60.1 — LLM timeout fix

6 hours ago

What's Changed

  • fix: raise LLM request timeout to 300s to unblock slow completions (#434) — URLRequest's default 60s idle timeout was killing notes generation and other LLM calls when using cold local models (Ollama/MLX) or reasoning models with long first-token latency. Both streaming and non-streaming paths in OpenRouterClient now use a 300s timeout.
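
The change described above amounts to overriding the 60s default when building the request and configuring the session. A minimal sketch of the idea, assuming a plain `URLSession`-based client; the names `llmRequestTimeout` and `makeChatRequest` are illustrative, not OpenOats's actual API:

```swift
import Foundation

// Per-request timeout, up from URLRequest's 60-second default.
// This is an idle timeout: URLSession resets it each time data arrives,
// so it mainly guards the long wait before the first token.
let llmRequestTimeout: TimeInterval = 300

// Hypothetical request builder for an OpenRouter-style chat endpoint.
func makeChatRequest(url: URL, body: Data) -> URLRequest {
    var request = URLRequest(url: url, timeoutInterval: llmRequestTimeout)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = body
    return request
}

// The session-level timeout also caps idle time between received bytes,
// so the streaming path needs it raised as well.
let config = URLSessionConfiguration.default
config.timeoutIntervalForRequest = llmRequestTimeout
let session = URLSession(configuration: config)
```

Raising the idle timeout rather than a total-duration limit (`timeoutIntervalForResource`) matches the symptom in the bullet: cold local models and reasoning models can sit silent for minutes before the first token, but once streaming starts, bytes arrive steadily and keep resetting the idle timer.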

Contributors

Thanks to @BJonny for identifying and fixing the timeout issue.
