✨ Features
- Local inference provider with llama.cpp backend and HuggingFace model management #6933
- Cerebras provider support #7339
- Moonshot and Kimi Code declarative providers #7304
- LMStudio declarative provider #7455
- Gateway for chatting with goose via Telegram and similar messaging platforms #7199
- MCP apps sampling support #7039
- Neighborhood extension in the Extensions Library #7328
- Computer controller overhaul with peekaboo #7342
- TUI client for goose-acp #7362
- GOOSE_SUBAGENT_MODEL and GOOSE_SUBAGENT_PROVIDER config options #7277
- Configurable OTel logging level #7271
- GoosePlatform in AgentConfig and MCP initialization #6931
- Gemini CLI streaming support via stream-json events #7244
- Expose context window utilization to agent via MOIM #7418
- Auto-submit for recipes that have been accepted #6325
- Claude Code permission prompt routing for approve mode #7420
- Token counts displayed directly for "free" providers #7383
- TypeScript SDK for ACP extension methods #7319
- Bedrock prompt cache #6710
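The subagent model options above are configured through environment variables. A hypothetical shell setup (the provider and model values here are illustrative placeholders, not recommendations):

```shell
# Route subagent work to a separate, typically cheaper, model.
# Provider/model values below are illustrative placeholders.
export GOOSE_SUBAGENT_PROVIDER=openai
export GOOSE_SUBAGENT_MODEL=gpt-4o-mini
```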
🐛 Bug Fixes
- Windows MSVC linking issues #7511
- Searchbar z-index modal overlay #7502
- Out-of-order messages #7472
- Latest session always displays in sidebar #7489
- Color-scheme declaration for transparent MCP App iframes #7479
- Forward _meta in tool results to MCP Apps #7476
- Summon skill supporting files and directory path in load output #7457
- flake.nix build failure and deprecation warning #7408
- Detection of truncated LLM responses in the apps extension #7354
- Truncated tool calls that break conversation alternation #7424
- SQLite deadlocks with BEGIN IMMEDIATE #7429
- Correct colors for download progress bar #7390
- MOIM telling models to sleep while waiting for tasks #7377
- stderr noise #7346
- MCP app sampling support restored after revert #7366
- Skip whitespace-only text blocks in Anthropic messages #7343
- goose-acp heap allocations #7322
- Trailing space from links #7156
- Low-balance detection with prompt to top up #7166
- Display 'Code Mode' instead of 'code_execution' in CLI #7321
- Gemini CLI streaming restored #7291
- CLI handling of Reasoning content and streaming thinking display #7296
- OpenAI support for "reasoning" field alias in streaming deltas #7294
- UI: revert app-driven iframe width and send containerDimensions per the ext-apps spec #7300
- Ollama input limit override #7281
- Subrecipe relative path with summon #7295
- Extension selector not displaying correct enabled extensions #7290
- Subagent tool call notifications after summon refactor #7243
- UI: preserve server config values on partial provider config save #7248
- Allow goose to run inside a Claude Code session #7232
- OpenAI: route gpt-5 codex via the Responses API and map base paths #7254
- Filter models without tool support from recommended list #7198
- Google thoughtSignature vagaries during streaming #7204
- Settings tabs getting cut off in narrow windows #7379
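The SQLite deadlock fix above relies on a well-known pattern: starting write transactions with `BEGIN IMMEDIATE` acquires the write lock up front instead of deferring it, so two connections can no longer deadlock while each tries to upgrade a read lock mid-transaction. A minimal Python sketch of the idea (the table, column, and function names are illustrative, not goose's actual schema):

```python
import sqlite3

# isolation_level=None disables the stdlib's implicit transactions,
# so we issue BEGIN/COMMIT ourselves.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("PRAGMA busy_timeout = 5000")  # wait for locks instead of failing fast
conn.execute("CREATE TABLE sessions (id TEXT PRIMARY KEY, data TEXT)")

def save_session(session_id: str, data: str) -> None:
    # BEGIN IMMEDIATE takes the write lock now; a plain BEGIN would defer it
    # and risk a deadlock when upgrading from a read lock later in the txn.
    conn.execute("BEGIN IMMEDIATE")
    try:
        conn.execute(
            "INSERT OR REPLACE INTO sessions (id, data) VALUES (?, ?)",
            (session_id, data),
        )
        conn.execute("COMMIT")
    except Exception:
        conn.execute("ROLLBACK")
        raise

save_session("s1", "{}")
```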
🔧 Improvements
- Simplified custom model flow with canonical models #6934
- Simplified text editor to be more like pi #7426
- CLI update: replaced shell-based updater with a native Rust implementation #7148
- Hardened event-streaming handling for OpenAI Responses models #6831
- Canonical Models for context window sizes #6723
- Command to validate bundled extensions JSON #7217
- Unified streaming across all providers #7247
- New navigation settings layout options and styling #6645
- MCP-compliant theme tokens and CSS class rename #7275
- Redirect llama.cpp logs through tracing #7434
- Display working directory in UI #7419
- Remove allows_unlisted_models flag, always allow custom model entry #7255
- Local model settings access from bottom bar model menu #7378
- Use model ID everywhere for local model API #7382
- Improved link confirmation modal #7333
- Upgrade to rmcp 0.16.0 #7274
- Use working dir from session #7285
- Apps token limit #7474
- Third-party license copy for minified JavaScript/CSS files #7352
- Client settings improvements #7381
- Open recipe in new window passes recipe id #7392
📚 Documentation
- Groq models #7404
- Voice dictation updates #7396
- Excalidraw MCP App Tutorial #7401
- Gastown blog post: How to Use Goosetown for Parallel Agentic Engineering #7372
- Type-to-search goose configure lists #7371
- Search conversation history #7370
- Reasoning environment variable #7367
- Update skills detail page for Goose Summon extension #7350
- Sandbox topic update #7336
- Monitoring subagent activity section #7323
- Desktop UI recipe editing for model/provider and extensions #7327
- CLAUDE_THINKING_BUDGET and CLAUDE_THINKING_ENABLED environment variables #7330
- Permission Policy for MCP Apps #7325
- CLI syntax highlighting theme customization #7324
- Code Execution extension renamed to Code Mode extension #7316
- Escape variable syntax in recipes #7314
- OTel environment variable and config guides #7221
- System proxy settings #7311
- Summon extension tutorial and Skills references #7310
- Agent session id #7289
- Top Of Mind extension #7283
- Gemini 3 thinking levels #7282
- Stream subagent tool calls #7280
- Delete custom provider in desktop #7279
- Disable AI session naming #7194
- Playwright CLI skill tutorial #7261
- Community all-stars and page update #7483
- Neighborhood extension page with video embed and layout improvements #7473
- YouTube short embed to Neighborhood extension tutorial #7456
- Generate manpages #7443
- Goose v1.25.0 release blog #7433
- Order Lunch Without Leaving Your AI Agent blog #7505
- Goose in a pond blog #7465