🛠️ Custom Tools via local MCP servers
- You can now add your own custom tools by creating an MCP server; for now, servers are invoked locally.
- Added UI pages for viewing and managing MCP servers.
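To illustrate the shape of such a tool, here is a toy, self-contained sketch of the JSON-RPC exchanges a local MCP server handles (the `row_count` tool and its schema are hypothetical examples; a real server would use an MCP SDK rather than hand-rolling this):

```typescript
// Toy sketch of the MCP-style JSON-RPC request/response shapes.
type JsonRpcRequest = { jsonrpc: "2.0"; id: number; method: string; params?: any };
type JsonRpcResponse = { jsonrpc: "2.0"; id: number; result: any };

const tools = [
  {
    name: "row_count", // hypothetical custom tool
    description: "Count rows in a table",
    inputSchema: {
      type: "object",
      properties: { table: { type: "string" } },
      required: ["table"], // all parameters marked required (see the reasoning-model change below)
    },
  },
];

function handle(req: JsonRpcRequest): JsonRpcResponse {
  switch (req.method) {
    case "tools/list":
      // The agent asks the server which tools it offers.
      return { jsonrpc: "2.0", id: req.id, result: { tools } };
    case "tools/call": {
      // The agent invokes one tool by name with JSON arguments.
      const { name, arguments: args } = req.params;
      const text = name === "row_count" ? `counted rows in ${args.table}` : "unknown tool";
      return { jsonrpc: "2.0", id: req.id, result: { content: [{ type: "text", text }] } };
    }
    default:
      return { jsonrpc: "2.0", id: req.id, result: null };
  }
}

const listed = handle({ jsonrpc: "2.0", id: 1, method: "tools/list" });
console.log(listed.result.tools[0].name); // row_count
```

The real protocol adds initialization, capabilities, and transports (stdio for local servers); the sketch only shows the tool-listing and tool-call round trips.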
🤖 Ollama Integration
- Added support for Ollama as a local LLM provider.
🧠 Reasoning Models
- Added support for the reasoning models o1 and o4-mini.
- Tool parameters are now all marked as required, for compatibility with these models.
♻️ Refactored Components
- Replaced internal duplicated UI code with shared Xata components.
- Simplifies maintenance and promotes consistency.
📈 Monitoring Overhaul
- Switched to OpenTelemetry NodeSDK with Langfuse integration.
- Removed the dependency on `@vercel/otel`, improving compatibility.
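The new setup can be initialized along these lines; treat this as a sketch rather than the exact wiring, since the export paths and exporter options depend on the package versions in use:

```typescript
// Sketch: start the OpenTelemetry NodeSDK with a Langfuse trace exporter.
// Assumes the @opentelemetry/sdk-node and langfuse-vercel packages.
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseExporter } from "langfuse-vercel";

const sdk = new NodeSDK({
  traceExporter: new LangfuseExporter(), // picks up LANGFUSE_* credentials from the environment
});

sdk.start(); // takes the place of the former @vercel/otel registration
```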
🐛 Bug Fixes
- The OpenAI model provider is no longer mandatory. The agent falls back to another configured provider if, for example, only an Anthropic key is set. PR: #186
- Fixed issue with tool prompts referencing a non-existent tool name. PR: #173
- Resolved a `dbaccess` bug affecting scheduled jobs. PR: #172
- Removed misleading thresholds from the monitoring playbook (e.g., 20GB). PR: #166
- The selected connection is now persisted and reloaded for chats. PR: #182
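The provider-fallback behavior from PR #186 amounts to logic like the following (a sketch; the function and key names are hypothetical, not the agent's actual API):

```typescript
// Sketch of provider fallback: prefer the requested provider, but if it has
// no API key configured, fall back to the first provider that does.
type ProviderKeys = Record<string, string | undefined>;

function pickProvider(preferred: string, keys: ProviderKeys): string {
  if (keys[preferred]) return preferred;
  const fallback = Object.keys(keys).find((name) => keys[name]);
  if (!fallback) throw new Error("No model provider API key configured");
  return fallback;
}

// Only an Anthropic key is set, so an "openai" request falls back:
console.log(pickProvider("openai", { openai: undefined, anthropic: "sk-ant-..." })); // anthropic
```

The point of the fix is simply that a missing OpenAI key no longer hard-fails the agent when another provider's key is available.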
🙌 Thanks to Our Contributors
- Alexis Rico (@SferaDev)
- Tudor Golubenco (@tudor)
- Elizabet Oliveira (@miukimiu)
- Steffen Siering (@steffen)
- PineappleChild (@PineappleChild)