Patch release 3.1.1 packs a punch 💪 with significant upgrades and critical bug fixes.
- OpenRouter thinking models are now supported! As long as "Reasoning" is checked for a reasoning model from OpenRouter, the thinking block will render in chat. If you'd rather not see it, simply uncheck "Reasoning" to hide it.
- Copilot can see Dataview results in the active note! 🔥🔥🔥 Simply add the active note containing Dataview queries to context, and the LLM will see the executed results of those queries and use them as context!
- New model provider Amazon Bedrock added! (Only API key and region settings are supported for now; other ways of accessing Bedrock are not yet supported.)
More details in the changelog:
Improvements
- #1955 Add bedrock provider @logancyang
- #1954 Enable Openrouter thinking tokens @logancyang
- #1942 Improve custom command @zeroliu
- #1931 Improve error handling architecture across chain runners @Emt-lin
- #1929 Add CRUD to Saved Memory @wenzhengjiang
- #1928 Enhance canvas creation spec with JSON Canvas Spec @wenzhengjiang
- #1923 Turn autosaveChat ON by default @wenzhengjiang
- #1922 Sort notes in typeahead menu by creation time @zeroliu
- #1919 Implement tag list builtin tool @logancyang
- #1918 Support dataview result in active note @logancyang
- #1914 Turn on memory feature by default @wenzhengjiang
Bug Fixes
- #1957 Fix ENAMETOOLONG error on chat save @logancyang
- #1956 Enhance error handling @logancyang
- #1950 Fix new note (renamed) not discoverable in Copilot chat @logancyang
- #1947 Stop rendering dataview result in AI response @logancyang
- #1927 Properly render pills in custom command @zeroliu