0.40.0 (2025-09-30)
BlockNote AI
We've now significantly refactored BlockNote AI to support:
- The Vercel AI SDK 5 (closes #1952)
- Now uses `streamText` + tool calling by default instead of Object Generation
- Designed to call your own backend (instead of the old proxy-based setup)
- Leans more heavily on the updated AI SDK architecture (reusing the `Chat` and `transport` concepts)
- `PromptBuilder`s have been redesigned to split the creation of the required data (`PromptBuilderInputData`) and the creation / modification of the LLM messages
- fix: better handling of parallel tool calls
Backend pattern
We've revisited the old solution where BlockNote made direct calls to LLMs from the client using the Vercel AI SDK (or via a proxy).
Instead, it's now recommended to send requests to your backend. There, you can then invoke your LLM (potentially adding more context, tools, RAG, etc.). While a bit more work to set up, this architecture is more in line with the Vercel AI SDK and unlocks more powerful workflows.
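As a rough illustration, a minimal backend route built on the Vercel AI SDK could look something like the sketch below. It assumes a Next.js-style route handler at `/api/chat` and the OpenAI provider; the exact tools and options BlockNote needs are covered in the backend integration guide.

```ts
// app/api/chat/route.ts — a minimal sketch, assuming a Next.js route handler
// and the @ai-sdk/openai provider. See the backend integration guide for the
// options BlockNote's AI extension actually requires.
import { openai } from "@ai-sdk/openai";
import { convertToModelMessages, streamText, type UIMessage } from "ai";

export async function POST(req: Request) {
  // The editor's transport posts the chat messages to this endpoint.
  const { messages }: { messages: UIMessage[] } = await req.json();

  // Invoke the LLM on the server; this is also where you could add extra
  // context, tools, or RAG before the call.
  const result = streamText({
    model: openai("gpt-4o"),
    messages: convertToModelMessages(messages),
  });

  // Stream the result back to the client in the AI SDK's UI message format.
  return result.toUIMessageStreamResponse();
}
```

Keeping the model call on the server also means API keys never ship to the browser, and the request can be enriched server-side.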
Breaking changes
See the updated docs and backend integration guide for an overview of the new APIs. The main breaking change is that `createAIExtension` now accepts a `transport` that provides the integration with your backend.
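As a sketch of what the client-side wiring might look like (assuming the AI SDK's `DefaultChatTransport` and a backend route at `/api/chat`; check the updated docs for the exact option names):

```ts
// A minimal client-side sketch, assuming the AI SDK's DefaultChatTransport
// and a backend route at /api/chat; see the updated docs for the exact
// createAIExtension options.
import { createAIExtension } from "@blocknote/xl-ai";
import { DefaultChatTransport } from "ai";

const aiExtension = createAIExtension({
  // Point the extension at your own backend instead of calling an LLM
  // (or proxy) directly from the client.
  transport: new DefaultChatTransport({ api: "/api/chat" }),
});
```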
We now recommend using the Vercel AI SDK on your backend. For alternative options, including the previous proxy-based approach (still available), see the backend integration guide.
For assistance with upgrading and integrating with your pipeline, please reach out to the team.
Mantine
We've now upgraded our Mantine support in the `@blocknote/mantine` package to v8.
🚀 Features
- Mantine v8 upgrade (#2028, #2029)
- Update Mantine setup (#2033)
- ai: SDK 5, tool calling, custom backends (#2007)
- core: add the ability to autofocus on the editor element (#2018)
🩹 Fixes
- Block colors menu not always showing (#2027)
- Update remaining examples to Mantine v8 (#2031)
- ShadCN example Tailwind setup (#2042)
❤️ Thank You
- Matthew Lipski @matthewlipski
- Nick Perez
- Yousef