### Minor Changes
- #9268 `48c0553` - Open xterm.js-powered terminal tabs in the Agent Manager. Click the chevron next to the + tab button and pick "New Terminal" (or press Cmd+Shift+T / Ctrl+Shift+T) to spawn a real shell in the selected worktree or Local directory. Terminals render as proper tabs alongside agent sessions, support mixed drag-reorder with session tabs, and persist their position across webview reloads. The existing VS Code integrated terminal shortcut (Cmd+/) is unchanged.
- #9336 `85c578e` - Add the initial JetBrains session chat UI and improve sandbox debug logging for tracing chat events across the frontend and backend.
### Patch Changes
- #9335 `6015ac6` - Restore explicit Submit behavior for single-choice question prompts in the VS Code extension, so option clicks stay visible for review instead of immediately sending the answer.
- #9332 `0bda9d1` - Fix mid-turn message handling so that a new prompt sent while the assistant is working no longer aborts the in-flight response. The current LLM reply streams to completion, any pending suggestion or question is automatically dismissed, and the new prompt runs immediately after the current step instead of waiting for the entire multi-step turn to finish.
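The deferral described above can be sketched as a small queue: a prompt submitted while a step is in flight is held (and any pending suggestion dismissed) until that step resolves. This is a minimal illustration, not the extension's actual implementation; `TurnQueue`, `runStep`, and `submitPrompt` are hypothetical names.

```typescript
type Step = () => Promise<string>;

// Hypothetical sketch: defer a mid-turn prompt instead of aborting the stream.
class TurnQueue {
  private inFlight: Promise<string> | null = null;
  private pendingSuggestion: string | null = null;
  private queuedPrompt: string | null = null;
  public log: string[] = [];

  // Run one step of the assistant's turn (e.g. one LLM call).
  runStep(name: string, step: Step): Promise<string> {
    this.inFlight = step().then((result) => {
      this.inFlight = null;
      this.log.push(`completed:${name}`);
      // A queued prompt runs right after the current step, not after the turn.
      if (this.queuedPrompt !== null) {
        const prompt = this.queuedPrompt;
        this.queuedPrompt = null;
        this.log.push(`started:${prompt}`);
      }
      return result;
    });
    return this.inFlight;
  }

  // Called when the user sends a prompt mid-turn.
  submitPrompt(prompt: string): void {
    this.pendingSuggestion = null; // auto-dismiss any pending suggestion
    if (this.inFlight !== null) {
      this.queuedPrompt = prompt; // defer instead of aborting the in-flight reply
    } else {
      this.log.push(`started:${prompt}`);
    }
  }
}
```

The key design point is that the in-flight promise is allowed to settle normally; the new prompt only changes what happens immediately afterwards.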
- #9119 `8e75084` - Fix a TUI freeze on huge-file diffs. Session-summary and file-view patches now use git directly instead of a JavaScript Myers implementation, so files of any size render a full diff without blocking the session.
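Delegating diffs to git can be sketched with `git diff --no-index`, which compares two arbitrary files outside any repository. This is an illustrative sketch, not the project's actual code; note that git exits with status 1 when the files differ, so that exit code must be treated as success-with-patch rather than an error.

```typescript
import { execFileSync } from "node:child_process";
import { mkdtempSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Sketch: produce a unified diff of two files by shelling out to git
// instead of diffing in-process (assumes `git` is on PATH).
function gitDiff(fileA: string, fileB: string): string {
  try {
    return execFileSync("git", ["diff", "--no-index", "--", fileA, fileB], {
      encoding: "utf8",
    });
  } catch (err: any) {
    // Exit code 1 means "files differ"; the patch is on stdout.
    if (err.status === 1 && typeof err.stdout === "string") {
      return err.stdout;
    }
    throw err; // exit code >= 2 is a real failure
  }
}

// Example: diff two scratch files.
const dir = mkdtempSync(join(tmpdir(), "diff-"));
const a = join(dir, "a.txt");
const b = join(dir, "b.txt");
writeFileSync(a, "old line\n");
writeFileSync(b, "new line\n");
const patch = gitDiff(a, b);
```

Because git streams the diff from its own optimized C implementation, the Node event loop never blocks on the O(ND) Myers computation for large files.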
- #9341 `00ec003` - Significantly speed up LLM token streaming in long sessions. The chat view now stays responsive while the model streams a reply, even in sessions with hundreds of messages. Previously, each SSE batch produced roughly 1.3 seconds of visible freeze (about 80 dropped frames); streaming updates are now coalesced into a single animation frame.
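The coalescing pattern behind a fix like this can be sketched as a buffer that accepts every SSE chunk cheaply and schedules at most one render per frame. This is a hypothetical illustration (`StreamBuffer` is not the project's API); the scheduler is injected so the browser build can pass `requestAnimationFrame` while tests use a synchronous stub.

```typescript
type Schedule = (cb: () => void) => void;

// Sketch: buffer streaming chunks and flush them in one scheduled callback,
// so many SSE batches cost at most one DOM render per animation frame.
class StreamBuffer {
  private chunks: string[] = [];
  private flushScheduled = false;

  constructor(
    private render: (text: string) => void,
    private schedule: Schedule,
  ) {}

  // Called for every SSE token batch; does no DOM work itself.
  push(chunk: string): void {
    this.chunks.push(chunk);
    if (!this.flushScheduled) {
      this.flushScheduled = true;
      // One render per frame, no matter how many chunks arrived.
      this.schedule(() => this.flush());
    }
  }

  private flush(): void {
    this.flushScheduled = false;
    const text = this.chunks.join("");
    this.chunks = [];
    this.render(text);
  }
}

// In a webview you would construct it as:
//   new StreamBuffer(appendToChat, (cb) => requestAnimationFrame(cb));
const pending: Array<() => void> = [];
const rendered: string[] = [];
const buf = new StreamBuffer((t) => rendered.push(t), (cb) => pending.push(cb));
buf.push("Hel");
buf.push("lo");
pending.shift()!(); // simulate the next animation frame firing
```

Multiple `push` calls before a frame boundary schedule only a single flush, which is what turns per-chunk jank into one bounded render per frame.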
- Updated dependencies [`00ec003`]:
  - @opencode-ai/ui@7.2.21
  - @kilocode/kilo-ui@7.2.21