🌟 Summary
Ultralytics v8.3.235 makes IMX500 exports more powerful (now with YOLO11 instance segmentation), modernizes export backends (CoreML, ExecuTorch, ONNX/Torch), and improves logging and Jetson stability—while lifting an older PyTorch pin so you can safely use newer versions. 🚀
📊 Key Changes
- 🧠 Sony IMX500: Full YOLO11 instance segmentation support
  - IMX export now supports `segment` in addition to `detect`, `pose`, and `classify`.
  - New `segment_forward` path and segmentation handling in `ultralytics.utils.export.imx`.
  - IMX constraints extended in `Exporter.__call__` to accept segmentation models and enforce `int8=True` and `nms=True` for `segment` too.
  - IMX NMS wrapper and autobackend updated to handle segmentation outputs (masks + protos).
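The constraint handling described above can be sketched as follows. This is a hypothetical standalone helper for illustration only; the real checks live inside `Exporter.__call__` in the ultralytics codebase:

```python
# Hypothetical sketch of the IMX constraint logic described above;
# the actual implementation is part of ultralytics' Exporter.__call__.
IMX_TASKS = {"detect", "pose", "classify", "segment"}  # segment is newly supported


def apply_imx_constraints(task: str, args: dict) -> dict:
    """Force the export settings that IMX500 requires for a supported task."""
    if task not in IMX_TASKS:
        raise ValueError(f"IMX export does not support task '{task}'")
    # IMX requires INT8 quantization and embedded NMS, now also for segment
    args.update(int8=True, nms=True)
    return args
```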
- 📝 IMX500 docs & deployment examples upgraded
  - IMX docs now clearly list Object Detection, Pose Estimation, Classification, and Instance Segmentation as supported tasks.
  - Added Python + CLI examples for YOLO11 instance segmentation IMX export and inference.
  - Added sample IMX export folder structure for `yolo11n-seg_imx_model`.
  - New end-to-end Raspberry Pi AI Camera instance segmentation deployment example using Aitrios application modules.
- 🔄 IMX export robustness & dependency updates
  - IMX export:
    - Now requires PyTorch ≥ 2.9.0 and Python ≥ 3.9, and is blocked on ARM64 (not supported).
    - Switched from `sony-custom-layers` to `edge-mdt-cl` for custom layers/NMS.
    - Updated converter/tooling requirements to `imx500-converter[pt]>=3.17.3`, `edge-mdt-cl<1.1.0`, `edge-mdt-tpc>=1.2.0`, `model-compression-toolkit>=2.4.1`, `pydantic<=2.11.7`.
    - IMX export now runs ONNX export inside a new `onnx_export_patch()` context to work around PyTorch 2.9+ ONNX export issues.
  - IMX inference (autobackend):
    - Uses `edge-mdt-cl` NMS ops and `onnxruntime-extensions`.
    - Updated handling of segmentation outputs in the ONNX/IMX path.
- 🔁 CoreML & ExecuTorch export/inference modernized
  - CoreML
    - Minimum `coremltools` version raised from `>=8.0` to `>=9.0`.
    - Enforced `numpy>=1.14.5,<=2.3.5` for CoreML to avoid breakage with newer numpy prereleases.
    - Changes applied consistently in:
      - `pyproject.toml` export extras
      - `export_coreml` in the exporter
      - CoreML autobackend loading.
  - ExecuTorch
    - ExecuTorch version bumped from `1.0.0` to `1.0.1` for both export and inference paths.
    - Still checks `setuptools<71.0.0` to avoid known compatibility issues.
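For reference, the constraints above could be expressed in a `pyproject.toml` extras section roughly like this. This is an illustrative fragment only; the actual extras layout and pin styles in ultralytics' `pyproject.toml` may differ:

```toml
# Illustrative sketch, not the actual ultralytics pyproject.toml
[project.optional-dependencies]
export = [
    "coremltools>=9.0",       # raised from >=8.0
    "numpy>=1.14.5,<=2.3.5",  # guard CoreML against newer numpy prereleases
]
```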
- 🧪 IMX export test re-enabled and tightened
  - `test_export_imx` now:
    - Requires Torch ≥ 2.9.0.
    - Requires Python ≥ 3.9.
    - Skips on Windows, macOS, and ARM64.
    - Uses a configurable `MODEL` constant instead of hard-coded `yolov8n.pt`.
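The gating conditions above can be sketched as one predicate. This is a hypothetical helper for illustration; the real test uses pytest skip markers rather than a function like this:

```python
import platform
import sys


def imx_export_supported(torch_version: tuple) -> bool:
    """Hypothetical sketch of the gating applied to test_export_imx."""
    return (
        torch_version >= (2, 9, 0)  # Torch >= 2.9.0
        and sys.version_info >= (3, 9)  # Python >= 3.9
        and sys.platform == "linux"  # skip Windows and macOS
        and platform.machine().lower() not in {"arm64", "aarch64"}  # skip ARM64
    )
```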
- 🐍 Python export Docker image: PyTorch pin removed (current PR focus)
  - Removed Docker-level hack that forced `torch<=2.8.0` in `pyproject.toml`.
  - Export Docker image now respects standard PyTorch constraints, allowing Torch 2.9.0+, which IMX now depends on.
  - Still explicitly installs `numpy==1.26.4` for Sony IMX export stability.
  - New `onnx_export_patch()` helper ensures ONNX export behaves correctly with Torch 2.9+ by temporarily disabling Dynamo during export.
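The idea behind such a patch can be sketched with a torch-free context manager that toggles the `TORCHDYNAMO_DISABLE` environment variable PyTorch honors. This is a simplified, hypothetical sketch; the real helper lives in `ultralytics.utils.patches` and may patch Torch differently:

```python
import contextlib
import os


@contextlib.contextmanager
def onnx_export_patch():
    """Simplified sketch: temporarily disable TorchDynamo around torch.onnx.export."""
    prev = os.environ.get("TORCHDYNAMO_DISABLE")
    os.environ["TORCHDYNAMO_DISABLE"] = "1"  # tells Torch to skip Dynamo
    try:
        yield
    finally:
        # restore the caller's environment exactly as it was
        if prev is None:
            os.environ.pop("TORCHDYNAMO_DISABLE", None)
        else:
            os.environ["TORCHDYNAMO_DISABLE"] = prev
```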
- 🚀 Jetson JetPack 6 Docker: safer ONNX version
  - `Dockerfile-jetson-jetpack6` now patches `pyproject.toml` to enforce `onnx>=1.12.0,<1.20.0`.
  - Prevents known TFLite export issues with `onnx 1.20.0` on Jetson JetPack 6.
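A Dockerfile-level patch of this kind amounts to a string substitution over `pyproject.toml`. The function below is an illustrative approximation only; the exact constraint string and patching mechanism in the real Dockerfile may differ:

```python
from pathlib import Path


def pin_onnx(pyproject: Path) -> None:
    """Illustrative version of the Dockerfile patch: tighten onnx to <1.20.0."""
    text = pyproject.read_text()
    pyproject.write_text(text.replace("onnx>=1.12.0", "onnx>=1.12.0,<1.20.0"))
```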
- 📈 ClearML & Neptune logging made future-proof
  - ClearML and Neptune callbacks now:
    - Dynamically collect plots from `trainer.plots` and `trainer.validator.plots`.
    - Skip any plot whose filename contains `"batch"` (debug/per-batch images).
  - No longer rely on a hardcoded list like `results.png`, `confusion_matrix.png`, etc., so new/custom plots are automatically logged.
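The dynamic collection described above can be sketched as follows. This is a simplified illustration, not the actual callback code; the real callbacks also handle file paths and upload the images to the respective tracker:

```python
from types import SimpleNamespace


def collect_loggable_plots(trainer):
    """Sketch: merge trainer and validator plots, dropping per-batch debug images."""
    plots = dict(getattr(trainer, "plots", {}) or {})
    validator = getattr(trainer, "validator", None)
    if validator is not None:
        plots.update(getattr(validator, "plots", {}) or {})
    # skip per-batch debug images such as train_batch0.jpg / val_batch1_pred.jpg
    return {name: meta for name, meta in plots.items() if "batch" not in str(name)}
```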
- 🔧 Dependency installation behavior hardened
  - `attempt_install` no longer passes `--prerelease=allow` to `uv pip install`.
  - Reduces the chance of accidentally pulling unstable prerelease packages into export/inference environments.
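In effect, the installer command loses its prerelease opt-in. A hypothetical sketch of the command construction (the real `attempt_install` builds and runs its command differently):

```python
def uv_install_cmd(packages, allow_prerelease: bool = False) -> list:
    """Hypothetical sketch: build a uv install command without the prerelease flag."""
    cmd = ["uv", "pip", "install"]
    if allow_prerelease:  # previously always enabled; now off by default
        cmd.append("--prerelease=allow")
    return cmd + list(packages)
```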
- 📚 New reference docs entries
  - Added API docs for:
    - `ultralytics.utils.export.imx.segment_forward`
    - `ultralytics.utils.patches.onnx_export_patch`
- 🔖 Version bump
  - `__version__` updated from `8.3.234` to `8.3.235`.
🎯 Purpose & Impact
- ✅ Better Sony IMX500 support out of the box
  - You can now run YOLO11 instance segmentation end-to-end on IMX500 (export and inference), including Raspberry Pi AI Camera deployments.
  - Updated docs and examples make IMX workflows clearer and easier to reproduce.
- ⚙️ Modern, stable export backends
  - CoreML, ExecuTorch, and IMX paths are aligned with newer, supported tool versions and safe numpy ranges.
  - Reduces unexpected breakage from upstream changes while keeping you on supported, actively maintained stacks.
- 🧪 More reliable ONNX & Torch 2.9+ workflows
  - Removing the Torch `<=2.8.0` pin in the export Docker while adding `onnx_export_patch()` means:
    - You can benefit from newer PyTorch (including 2.9+).
    - IMX and ONNX exports remain stable despite PyTorch's Dynamo changes.
- 🛡️ Safer Jetson deployments
  - Pinning `onnx<1.20.0` for JetPack 6 avoids known TFLite export bugs.
  - Improves reliability for YOLO exports on NVIDIA Jetson devices.
- 📊 Richer experiment tracking with minimal maintenance
  - ClearML and Neptune now automatically log all relevant plots produced by training and validation.
  - New plot types will appear in your dashboards without any callback updates.
- 🧱 More predictable environments
  - Avoiding prerelease installs with `uv` decreases surprise breakages.
  - Tight, explicit version constraints across CoreML, ExecuTorch, IMX, and ONNX yield more reproducible pipelines across machines and CI.
What's Changed
- Update imx convert to `imx500-converter 3.17.3` release and `Segmentation` support by @ambitious-octopus in #22146
- Update `Dockerfile-jetson-jetpack6` to pin `onnx<1.20.0` by @Laughing-q in #22864
- Fix ClearML and Neptune plots logging by @Y-T-G in #22794
- fix: 🐞 update coreml exports and executorch update by @onuralpszr in #22872
- `ultralytics 8.3.235` Remove `torch 2.8.0` pin for Sony IMX export by @lakshanthad in #22874
Full Changelog: v8.3.234...v8.3.235