[3.30.0] - 2025-11-03
- Feat: Add OpenRouter embedding provider support (#8972 by @dmarkey, PR by @dmarkey)
- Feat: Add GLM-4.6 model to Fireworks provider (#8752 by @mmealman, PR by @app/roomote)
- Feat: Add MiniMax M2 model to Fireworks provider (#8961 by @dmarkey, PR by @app/roomote)
- Feat: Add preserveReasoning flag to include reasoning in API history (thanks @daniel-lxs!)
- Fix: Prevent message loss during a queue drain race condition (#8536 by @hannesrudolph, PR by @daniel-lxs); see the sketch after this list
- Fix: Capture reasoning content in base-openai-compatible for GLM-4.6 (thanks @mrubens!); see the streaming sketch after this list
- Fix: Create new Requesty profile during OAuth (thanks @Thibault00!)
- Fix: Prevent UI flicker and enable resumption after task cancellation (thanks @daniel-lxs!)
- Fix: Clean up the terminal settings tab and change the default terminal to inline (thanks @hannesrudolph!)
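
The queue drain fix above addresses a general class of bug: if a drain loop and an enqueue race, a message added mid-drain can be silently dropped. The sketch below is illustrative only and is not the actual Roo Code change; the `MessageQueue`, `enqueue`, and `drain` names are assumptions made for the example.

```typescript
// Illustrative only: a queue whose drain loop re-checks for messages that
// arrive while a previous batch is being processed, so nothing is dropped.
type Handler = (message: string) => Promise<void>;

class MessageQueue {
  private items: string[] = [];
  private draining = false;

  constructor(private handler: Handler) {}

  enqueue(message: string): void {
    this.items.push(message);
    // Kick off a drain, but never run two drains concurrently.
    void this.drain();
  }

  private async drain(): Promise<void> {
    if (this.draining) return; // another drain is already in progress
    this.draining = true;
    try {
      // Re-check after each batch: messages enqueued mid-drain are picked up
      // on the next pass instead of being lost.
      while (this.items.length > 0) {
        const batch = this.items.splice(0, this.items.length);
        for (const message of batch) {
          await this.handler(message);
        }
      }
    } finally {
      this.draining = false;
      // If the handler threw, or a message slipped in as the flag was being
      // cleared, start another drain so the queue still empties.
      if (this.items.length > 0) void this.drain();
    }
  }
}
```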
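
The GLM-4.6 reasoning fix above concerns providers that stream thinking text in a non-standard delta field. The sketch below assumes an OpenAI-compatible endpoint that emits `reasoning_content` on streamed deltas (a convention used by several providers, e.g. GLM and DeepSeek); the function name and parameters are hypothetical and do not reflect Roo Code's implementation.

```typescript
// Illustrative only: reading both reasoning and answer text from an
// OpenAI-compatible streaming chat completion. `reasoning_content` is not
// part of the official OpenAI delta types, hence the cast.
import OpenAI from "openai";

async function streamWithReasoning(baseURL: string, apiKey: string, model: string) {
  const client = new OpenAI({ baseURL, apiKey });

  const stream = await client.chat.completions.create({
    model,
    messages: [{ role: "user", content: "Explain the plan briefly." }],
    stream: true,
  });

  let reasoning = "";
  let answer = "";

  for await (const chunk of stream) {
    const delta = chunk.choices[0]?.delta as
      | { content?: string | null; reasoning_content?: string | null }
      | undefined;
    if (delta?.reasoning_content) reasoning += delta.reasoning_content; // thinking tokens
    if (delta?.content) answer += delta.content; // visible answer tokens
  }

  return { reasoning, answer };
}
```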