github logancyang/obsidian-copilot 3.2.0


Copilot for Obsidian - Release v3.2.0 💪

The first version of Self-Host Mode is finally here! Toggle it on at the bottom of the Plus settings, and your reliance on the Copilot Plus backend is gone (Believer tier required)!

The built-in model list has been updated. Click "Refresh Built-in Models" above the model setting table to see the new entries!

  • 🚀 Autonomous Agent Evolution — The agent experience gets a major upgrade this release!
    • New reasoning block: Replaces the old tool call banners for a cleaner, smoother UI in agent mode!
    • 🔧 Native tool calling: We moved from the XML-based approach to native tool calling for a more reliable tool call experience. More and more models now support native tool calling, even local ones!
  • Brand new Quick Command and Editor "Quick Ask" Floating Panel! Select text in the editor and get an inline AI floating panel for quick questions — with persistent selection highlights so you never lose your place! (@Emt-lin)
  • Twitter/X thread processing: Mention a tweet thread URL in chat and Copilot will fetch the entire thread! (@logancyang)
  • Modular context compaction architecture — a cleaner, more extensible design for how Copilot manages long contexts. (@logancyang)
  • LM Studio and Ollama reasoning/thinking token support — thinking models in LM Studio and Ollama now display reasoning output properly. (@logancyang)
  • Major search improvements: better recall with note-diverse top-K scoring, and a new "Build Index" button replacing the warning triangle in Relevant Notes for a clearer UX. (@logancyang)
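To illustrate why native tool calling is more reliable than the XML-based approach, here is a minimal sketch: the model returns structured `tool_calls` (OpenAI-style), so arguments arrive as JSON rather than being regex-scraped out of the response text. The names (`getNoteTitle`, `runToolCalls`) are illustrative assumptions, not the plugin's actual code.

```typescript
// One structured tool call, as returned in an OpenAI-style response.
type ToolCall = {
  id: string;
  function: { name: string; arguments: string }; // arguments is a JSON string
};

// Local registry of tool implementations (hypothetical example tool).
const tools: Record<string, (args: { path: string }) => string> = {
  getNoteTitle: (args) => `Title of ${args.path}`,
};

// Dispatch every tool call in an assistant message and collect results.
// No XML parsing: arguments are already structured JSON.
function runToolCalls(calls: ToolCall[]): { id: string; result: string }[] {
  return calls.map((call) => {
    const impl = tools[call.function.name];
    const args = JSON.parse(call.function.arguments);
    return { id: call.id, result: impl(args) };
  });
}

// Example: a mock model response containing one tool call.
const results = runToolCalls([
  { id: "call_1", function: { name: "getNoteTitle", arguments: '{"path":"Daily.md"}' } },
]);
console.log(results[0].result); // "Title of Daily.md"
```

Because the call boundaries and arguments are delivered as structured data, malformed-tag and escaping bugs that plague XML-in-text approaches simply cannot occur.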

In upcoming iterations, Self-Host Mode will let you configure your own web search and YouTube services, and it will integrate with our new standalone desktop app for more powerful features. Stay tuned!

👨‍💻 Known Limitations: Agent mode performance varies by model. Recommended models: Gemini Pro/Flash (copilot-plus-flash), Claude 4.5+ models, GPT-5 and GPT-5 mini, and Grok 4 / Grok 4 Fast. Many open-source models on OpenRouter work too, but their performance can vary a lot.

More details in the changelog:

Improvements

Bug Fixes

  • #2117 Fix: increase grep limit for larger vaults and unify chunking @logancyang
  • #2137 Fix: prevent arrow keys from getting stuck in typeahead with no matches @zeroliu
  • #2140 Fix: GitHub Copilot mobile CORS bypass and auth UX improvements @Emt-lin
  • #2153 Fix LM Studio chat with only ending think tag @logancyang
  • #2157 Fix: improve mobile keyboard/navbar CSS scoping and platform detection @Emt-lin
  • #2160 Fix: remove tiktoken remote fetch from critical LLM path @logancyang
  • #2165 Fix search recall with note-diverse top-K and chunk-aware scoring @logancyang

Troubleshoot

  • If models are missing, navigate to Copilot settings -> Models tab and click "Refresh Built-in Models".
  • Please report any issues you see in the member channel!
