# v0.4.8 — LLM Tool & Plugin Fix
## Features
- `call_llm` tool — Invoke a secondary LLM for focused subtasks such as summarization, classification, and structured extraction. Supported on all three backends: Claude, Codex, and Copilot. Works with both API key (full features) and OAuth (basic features). Supports parallel calls, file attachments, and structured JSON output (eb0a3476, fa91af4c, b6e5c85f)
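As an illustration only — these notes do not document the tool's actual request shape, so every name below (`LlmSubtask`, `callLlm`, `runSubtasks`) is hypothetical — parallel subtask dispatch with optional structured output might be sketched as:

```typescript
// Hypothetical request shape for a secondary-LLM call; the real tool's
// parameters are not specified in the release notes.
interface LlmSubtask {
  task: "summarize" | "classify" | "extract";
  prompt: string;
  schema?: object; // present when structured JSON output is requested
}

// Placeholder dispatcher; a real implementation would route to the
// configured backend (Claude, Codex, or Copilot).
async function callLlm(subtask: LlmSubtask): Promise<string> {
  return JSON.stringify({ task: subtask.task, ok: true });
}

// Independent subtasks can run in parallel rather than sequentially.
async function runSubtasks(): Promise<string[]> {
  return Promise.all([
    callLlm({ task: "summarize", prompt: "Summarize the diff." }),
    callLlm({ task: "classify", prompt: "Label the change type." }),
  ]);
}
```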
## Bug Fixes
- Plugin name resolution — Skills failed to resolve when the workspace directory name didn't match the SDK plugin name. Now reads the actual plugin name from `.claude-plugin/plugin.json` (b7904cb7)
- Skill live reload — Adding a workspace skill caused global and project skills to disappear until restart. All reload paths now use `loadAllSkills` to return the full three-tier list (4172fd82)
- Codex event queue race condition — Tool results and assistant text could be lost when async `item/completed` handlers were still running at `turn/completed`. Fixed by deferring queue completion until all handlers finish (fa91af4c)
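The race-condition fix described above follows a common pattern: track each in-flight handler's promise and settle them all before marking the turn complete. A minimal sketch with hypothetical names (`EventQueue`, `onItemCompleted`, `onTurnCompleted` are illustrative, not the actual SDK code):

```typescript
class EventQueue {
  private pending: Promise<void>[] = [];
  private results: string[] = [];

  // item/completed: handlers run asynchronously and may still be in
  // flight when the turn/completed event arrives.
  onItemCompleted(payload: string): void {
    const task = (async () => {
      await new Promise((resolve) => setTimeout(resolve, 10)); // simulated work
      this.results.push(payload);
    })();
    this.pending.push(task);
  }

  // turn/completed: defer queue completion until every pending handler
  // settles, so no tool result or assistant text is dropped.
  async onTurnCompleted(): Promise<string[]> {
    await Promise.allSettled(this.pending);
    return this.results;
  }
}
```

Without the `Promise.allSettled` barrier, results pushed by late handlers would arrive after the queue was already closed, which matches the data loss the fix addresses.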
## Internal
- Copilot `runMiniCompletion` now functional — enables title generation on the Copilot backend (fa91af4c)
- Copilot event adapter suppresses reasoning/intent events (fa91af4c)
- `call_llm` model badge shown in TurnCard activity rows (8bb4bcd2)