Mastra Release - 2025-10-03
This release includes improvements to documentation, playground functionality, API naming conventions, and various bug fixes across the platform.
Agents
- Reorganizes the agent memory documentation by explaining async memory configuration, introducing runtime context, and moving detailed content to the appropriate Memory section. #8410
CLI / Playground
- Fixes a bug where the shell option was breaking server startup on Windows environments. #8377
- Adds a dedicated authentication token for the Playground environment. #8420
- Fixes an issue in the playground UI by properly initializing message history for v1 models, ensuring history renders correctly when refreshing a thread. #8427
Client SDK - JS
- Fixes a race condition in the client-js library by ensuring that WritableStream operations await the completion of ongoing pipeTo() calls, preventing locked stream errors and production crashes (see the sketch after this list). #8346
- Adds GenerateVNext support to the React SDK and introduces a function to convert UIMessages to assistant-ui messages. #8345
- Fixes issues with the custom AI SDK output. #8414
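The client-js fix in #8346 follows the standard Web Streams rule that a stream stays locked while a pipeTo() is in flight. A minimal illustration of the pattern, not the library's actual internals (the class and method names here are made up):

```ts
// Illustrative pattern: track the in-flight pipeTo() and await it before
// touching the destination again, so the stream is never used while locked.
class SafeSink {
  private pending: Promise<void> = Promise.resolve();

  constructor(private destination: WritableStream<Uint8Array>) {}

  // Queue a pipe; the destination stays locked only until the previous pipe settles.
  pipeFrom(source: ReadableStream<Uint8Array>): Promise<void> {
    const next = this.pending
      .catch(() => {}) // a failed earlier pipe should not block later ones
      .then(() => source.pipeTo(this.destination, { preventClose: true }));
    this.pending = next;
    return next;
  }

  // Await any ongoing pipe before closing, avoiding "locked stream" errors.
  async close(): Promise<void> {
    await this.pending;
    await this.destination.close();
  }
}
```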
Core Platform Components
- [IMPORTANT] Updates API and SDK naming by renaming 'generateVNext' to 'generate' and 'streamVNext' to 'stream', moving the previous implementations to 'generateLegacy' and 'streamLegacy', and updating all related code, documentation, and examples for consistency and backwards compatibility (see the migration example after this list). #8097
- [TIER 2] Improves structured output handling by converting it from an output processor to an EventEmitter-based stream processor, enabling multiple consumers and direct streaming of structured agent output, while also removing legacy structuredOutput usage. #8229
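For the rename in #8097, migration is mostly mechanical. A hedged before/after sketch, assuming an agent constructed roughly as below (consult the migration notes for exact option shapes and return types):

```ts
import { Agent } from '@mastra/core/agent';
import { openai } from '@ai-sdk/openai';

const agent = new Agent({
  name: 'summarizer',
  instructions: 'Summarize the input in one sentence.',
  model: openai('gpt-4o-mini'),
});

// Before: the VNext methods were opt-in.
// const result = await agent.generateVNext('Summarize the Q3 report');
// const stream = await agent.streamVNext('Summarize the Q3 report');

// After: the VNext behavior is the default API...
const result = await agent.generate('Summarize the Q3 report');
const stream = await agent.stream('Summarize the Q3 report');

// ...and the previous implementations remain available under Legacy names.
const legacyResult = await agent.generateLegacy('Summarize the Q3 report');
const legacyStream = await agent.streamLegacy('Summarize the Q3 report');
```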
Deployer
- [TIER 2] Adds per-function configuration overrides (maxDuration, memory, regions) to the Vercel deployer via a centralized vcConfigOverrides option that is merged into the generated .vc-config.json, extracts config types for clarity, and tidies code style, all while maintaining backward compatibility (see the sketch after this list). #8339
- Adds support for resolving transitive dependencies in monorepos during development in the deployer. #8353
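For #8339, the sketch below only illustrates the intent of the new option: the field names come from the changelog entry, while the constructor placement and value types are assumptions, so check the Vercel deployer docs for the exact shape:

```ts
import { Mastra } from '@mastra/core';
import { VercelDeployer } from '@mastra/deployer-vercel';

export const mastra = new Mastra({
  deployer: new VercelDeployer({
    // Assumed placement: overrides are merged into the generated .vc-config.json.
    vcConfigOverrides: {
      maxDuration: 300,   // seconds
      memory: 1024,       // MB
      regions: ['iad1'],  // Vercel region IDs
    },
  }),
});
```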
Developer Tools & UI
- Fixes an issue where working memory and semantic recall were not being displayed in the UI. #8358
- Improves the color contrast of code blocks in legacy traces for better readability. #8385
- Updates the thread display to show messages in descending order and include the thread title. #8381
- Fixes issues with model router documentation generation and the playground UI's model picker, covering logic errors, copy, UI bugs, and environment variable display, and adds responsive design for better mobile support. #8372
MCP
- Updates MCPServer prompt and resource callbacks to receive the 'extra' property, including AuthInfo, enabling authenticated or personalized server interactions. #8233
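A hedged sketch of what a prompt callback can now do with that context; the callback name and argument shapes below are hypothetical, and only the idea of reading authInfo from 'extra' comes from #8233:

```ts
// Hypothetical callback shape: the key point is the second `extra` argument,
// which now carries request context such as authInfo.
type AuthInfo = { token: string; clientId: string; scopes: string[] };

const getPromptMessages = async (
  { name }: { name: string },
  extra?: { authInfo?: AuthInfo },
) => {
  const caller = extra?.authInfo?.clientId ?? 'anonymous';
  return [
    {
      role: 'user' as const,
      content: { type: 'text' as const, text: `Prompt "${name}" requested by ${caller}` },
    },
  ];
};
```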
Memory
- Improves the memory indicator UX by replacing the previous small indicator with a shared Alert component, now displayed on the agent sidebar. #8382
- Fixes the persistence of output processor state across LLM execution steps, ensuring processors retain their state and structured output is generated correctly, while also updating controller references and preventing premature 'finish' chunk processing. #8373
Networks
- [TIER 2] Migrates agent network functionality to the new streamlined agent API, removing the separate vNext network implementation from the playground. #8329
Observability
- Enables observability by default for all templates. #8380
Prod Analytics
- Adds a 3-second fetch interval to AI traces so the UI and trace details update in near real time. #8386
Tools
- [TIER 2] Adds human-in-the-loop capabilities with tool call approval, allowing users to review and approve/decline tool executions before they run. #8360
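Conceptually, the approval flow in #8360 gates a tool call on an explicit human decision before execution. A framework-agnostic sketch of that gate, not Mastra's actual API (all names are illustrative):

```ts
// Illustrative only: describe the tool call, wait for a decision,
// and run the tool only on approval.
type ToolCall = { toolName: string; args: Record<string, unknown> };
type Decision = 'approved' | 'declined';

async function runWithApproval(
  call: ToolCall,
  askHuman: (call: ToolCall) => Promise<Decision>,
  execute: (call: ToolCall) => Promise<unknown>,
): Promise<unknown> {
  const decision = await askHuman(call);
  if (decision === 'declined') {
    // Surface the decline to the agent instead of running the tool.
    return { declined: true, toolName: call.toolName };
  }
  return execute(call);
}
```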
Workflows
- Fixes a bug where the resourceId was lost when resuming workflows after a server restart by ensuring it is correctly passed through all relevant server handlers and the core workflow's resume logic. #8359
- [TIER 2] Adds the ability to resume and observe interrupted workflow streams in the playground, allowing users to continue streaming results after a workflow is suspended or the frontend stream is closed. #8318