Highlights
- Thinking blocks are now visible in chat: The assistant's reasoning process is rendered inline in chat bubbles as collapsible thinking blocks, giving users transparency into how responses are formed. Thinking is now enabled by default.
- Significantly improved memory and retrieval: The memory system has been overhauled with batched extraction, HyDE query expansion, MMR diversity ranking, a serendipity layer for surfacing unexpected relevant memories, and a new top-N retrieval format — resulting in richer and more relevant context across conversations.
- New /compact slash command and context window indicator: Users can now manually trigger context compaction at any time with /compact, and a color-coded bar in the toolbar shows how full the context window is at a glance.
- Expanded model support and OpenRouter catalog: The OpenRouter model catalog has been expanded with DeepSeek, Qwen, Mistral, Meta, Moonshot, and Amazon models. Anthropic's 1M context window beta and fast mode are now supported, and OpenAI reasoning effort is now wired through to the API.
- Collapsible sidebar sections, channel conversations, and macOS polish: Scheduled and Background sidebar sections are now collapsible with state persisted across restarts. Channel-bound conversations are displayed in the sidebar with appropriate read-only treatment, and numerous macOS scroll, rendering, and performance issues have been resolved.
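For readers unfamiliar with the MMR diversity ranking mentioned in the memory highlights above: Maximal Marginal Relevance selects results that are relevant to the query while penalizing redundancy with results already chosen. The sketch below is a minimal generic illustration of the technique, not the product's actual implementation; the function name, the `lambda_` trade-off parameter, and the use of cosine similarity over embedding vectors are all assumptions for illustration.

```python
import numpy as np

def mmr(query_vec, doc_vecs, lambda_=0.7, top_n=5):
    """Illustrative Maximal Marginal Relevance ranking (not the app's code).

    Greedily picks top_n documents, scoring each candidate by
    lambda_ * (relevance to query) - (1 - lambda_) * (similarity to
    the most similar already-selected document).
    """
    def cos(a, b):
        # Cosine similarity between two embedding vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    candidates = list(range(len(doc_vecs)))
    selected = []
    while candidates and len(selected) < top_n:
        best, best_score = None, -np.inf
        for i in candidates:
            relevance = cos(query_vec, doc_vecs[i])
            # Redundancy is the max similarity to anything already picked;
            # 0.0 for the first pick, when nothing is selected yet.
            redundancy = max(
                (cos(doc_vecs[i], doc_vecs[j]) for j in selected),
                default=0.0,
            )
            score = lambda_ * relevance - (1 - lambda_) * redundancy
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
        candidates.remove(best)
    return selected
```

A high `lambda_` favors pure relevance; a low `lambda_` favors diversity, which is how a retrieval layer can surface memories that are on-topic but not near-duplicates of each other.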
Build: 0.5.14
Commit: 0eb4189d3
Built at: 2026-03-30 18:52:35 UTC