MLflow 3.6.0rc0 includes several major features and improvements!
Major Features
- 🔗 Full OpenTelemetry Support in OSS Server: MLflow now offers comprehensive OpenTelemetry integration, so you can use the OpenTelemetry and MLflow SDKs together to construct unified traces, with full OTLP span ingestion in the OSS tracking server (see the sketch after this list). (#18540, #18532, #18357, @B-Step62, @serena-ruan)
- 💬 Session-level View in Trace UI: A new chat sessions tab provides a dedicated view for organizing and analyzing related traces at the session level, making it easier to track conversational workflows (a session-tagging sketch also follows this list). (#18594, @daniellok-db)
- 🧭 New Experiment Tab Bar: The experiment tab navigation bar has been moved from the top of the page to the left side. As MLflow continues to grow, this layout provides more room to add new tabs while keeping everything easy to find. (#18594, @daniellok-db)
- 🚀 Vercel AI Support in TypeScript Tracing SDK: Auto-tracing support for Vercel AI SDK in TypeScript, expanding MLflow's observability capabilities across popular JavaScript/TypeScript frameworks. (#18402, @B-Step62)
- 💰 Tracking Judge Cost and Traces: LLM judge evaluations now record their costs and traces, with automatic cost calculation and rendering, giving visibility into evaluation expenses and performance. (#18481, #18484, @B-Step62)
- ⚙️ Agent Server: New agent server infrastructure for managing and deploying scoring agents with enhanced orchestration capabilities. (#18596, @bbqiu)
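
To illustrate the unified-tracing workflow, here is a minimal Python sketch that opens a raw OpenTelemetry span inside an MLflow-traced function. The tracking URI, experiment name, and the commented OTLP endpoint are illustrative assumptions, not documented configuration; check the tracing docs for the exact setup.

```python
# Minimal sketch (assumed setup): MLflow tracing plus raw OpenTelemetry spans.
import mlflow
from opentelemetry import trace

mlflow.set_tracking_uri("http://localhost:5000")  # assumes a local OSS server
mlflow.set_experiment("otel-demo")                # hypothetical experiment name

otel_tracer = trace.get_tracer("my-app")

@mlflow.trace  # MLflow-managed span
def answer(question: str) -> str:
    # An OpenTelemetry span opened while an MLflow span is active; with the
    # unified-trace support, both should appear in the same MLflow trace.
    with otel_tracer.start_as_current_span("retrieve-context"):
        context = "retrieved documents"  # stand-in for a real lookup
    return f"answer to {question!r} using {context}"

answer("What's new in MLflow 3.6?")

# A separate OTel-instrumented service could also export spans directly to the
# server via OTLP; the exact endpoint below is an assumption -- see the docs:
#   OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://localhost:5000/v1/traces
```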
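And a sketch of how traces can be grouped for the sessions view by stamping each one with a session identifier. The `mlflow.trace.session` metadata key (and whether it belongs in metadata rather than tags) is an assumption here; consult the tracing docs for the exact convention in 3.6.0.

```python
# Hypothetical sketch: tag each trace with a session ID so the chat sessions
# tab can group related turns. The metadata key is an assumption.
import mlflow

@mlflow.trace
def chat_turn(session_id: str, message: str) -> str:
    # Attach the session identifier to the currently active trace.
    mlflow.update_current_trace(metadata={"mlflow.trace.session": session_id})
    return f"echo: {message}"

for msg in ["hello", "tell me more"]:
    chat_turn("session-123", msg)
```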
Breaking Changes and Deprecations
- [Tracking] Filesystem Backend Deprecation: The filesystem backend is being deprecated in favor of SQLite (a migration sketch follows this list). See #18534 for details.
- [Flavors] Deprecate promptflow flavor (#18597, @copilot-swe-agent)
- [Flavors] Deprecate pmdarima and diviner flavors (#18577, @copilot-swe-agent)
- [Tracing] Drop span name deduplication (#18531, @serena-ruan)
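
For the filesystem-backend deprecation, here is a minimal sketch of pointing local tracking at SQLite instead of the default `./mlruns` directory; the database filename and experiment name are arbitrary choices for illustration.

```python
# Minimal sketch: use a SQLite-backed store instead of the filesystem backend.
import mlflow

# The SQLite file is created on first use; the filename is arbitrary.
mlflow.set_tracking_uri("sqlite:///mlflow.db")
mlflow.set_experiment("sqlite-demo")  # hypothetical experiment name

with mlflow.start_run():
    mlflow.log_param("backend", "sqlite")
    mlflow.log_metric("accuracy", 0.91)

# For a standalone tracking server, the equivalent is:
#   mlflow server --backend-store-uri sqlite:///mlflow.db
```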
Stay tuned for the full release, which will be packed with more features and bugfixes.
To try out this release candidate, please run: `pip install mlflow==3.6.0rc0`