v8.4.34 - `ultralytics 8.4.34` Multi-dataset support for hyperparameter tuning (#24067)

🌟 Summary

Ultralytics v8.4.34 is a tuning and stability-focused release 🚀, led by a major new feature: multi-dataset hyperparameter tuning in one run, plus several important reliability fixes and broad YOLO26 documentation updates.

📊 Key Changes

  • 🧠 Major feature (PR #24067 by @Laughing-q): Multi-dataset hyperparameter tuning

    • model.tune() now accepts data as either a single dataset or a list.
    • During each tuning iteration, training runs on each dataset in turn, and the results are combined.
    • Fitness is averaged across datasets, so tuning decisions reflect overall performance, not just one dataset.
    • Tests were updated to validate this workflow (coco8.yaml + coco8-grayscale.yaml).
    • Version bumped to 8.4.34.
  • 🛡️ Training resume stability fix (PR #24085 by @Y-T-G)

    • Prevents loss spikes after resume on small datasets by keeping AdamW’s exp_avg_sq state in FP32.
    • Reduces risk of unstable training when loading checkpoints.
  • 🔒 Thread-safe ONNX export (PR #24092 by @glenn-jocher)

    • Added export locking so concurrent threads cannot collide in PyTorch’s global ONNX exporter state.
    • Includes a regression test for parallel export safety.
  • ⚙️ Robustness fixes in core runtime

    • DDP cleanup now safely handles command-generation failures (PR #24056 by @nameearly).
    • AAttn fixed for non-divisible dim/num_heads cases to avoid shape/group crashes (PR #24114 by @ZoomZoneZero).
    • crop_mask() now clamps negative coordinates before cropping for safer segmentation postprocessing (PR #24115 by @Y-T-G).
    • draw_specific_kpts() now respects user-provided keypoint index order and handles missing confidence values safely (PR #24099 by @onuralpszr).
  • 📚 Documentation and ecosystem refresh (many PRs)

    • Large migration to YOLO26 references and fresh benchmarks across Jetson + SAM docs.
    • Jetson setup improved with missing cuDSS dependency instructions for Torch 2.10.0 (PR #24081 by @lakshanthad).
    • DeepStream version mapping/docs links corrected and expanded for newer JetPack versions (PRs #24141, #24142).
    • Ultralytics Platform docs improved (Smart Annotation with SAM + YOLO, split redistribution UX, account settings clarity, banner/link updates).
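
The per-iteration flow of the multi-dataset tuning feature above (train on each dataset, then average fitness) can be sketched as follows. The `combined_fitness` helper is illustrative, not the library's internals, and the commented `model.tune()` call shape is an assumption based on the release notes:

```python
from statistics import mean

def combined_fitness(per_dataset_fitness):
    """Average fitness across datasets so tuning decisions reflect
    overall performance rather than a single dataset (illustrative sketch)."""
    return mean(per_dataset_fitness)

# Hypothetical per-dataset fitness scores from one tuning iteration,
# e.g. one score for coco8.yaml and one for coco8-grayscale.yaml.
scores = [0.62, 0.54]
avg = combined_fitness(scores)  # ~0.58

# Assumed usage of the new feature (per the release notes, data now
# accepts a list; signature details are an assumption):
# from ultralytics import YOLO
# model = YOLO("yolo11n.pt")
# model.tune(data=["coco8.yaml", "coco8-grayscale.yaml"], iterations=10)
```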

🎯 Purpose & Impact

  • Better real-world tuning quality 🎯
    Multi-dataset tuning helps teams optimize one model for mixed or varied data domains (for example, color + grayscale, or multiple data sources), improving generalization and reducing overfitting to a single dataset.

  • More reliable training and export workflows
    Resume training is more stable, distributed cleanup is safer, and ONNX export is more dependable in threaded environments, which is especially useful in production pipelines.

  • Improved deployment and edge guidance 📱
    Updated YOLO26 Jetson benchmarks and setup docs make edge deployment decisions more current and practical.

  • Cleaner user experience in docs and platform onboarding
    Better Smart Annotation and dataset split guidance can reduce setup friction and speed up annotation/training workflows on the Ultralytics Platform.
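
The thread-safe ONNX export fix mentioned above follows a standard pattern: serialize access to process-global exporter state behind a lock. A minimal sketch of that pattern, where `_EXPORT_LOCK` and `export_onnx` are hypothetical names and not Ultralytics internals:

```python
import threading

# One process-wide lock guarding the global, non-reentrant exporter
# state, so concurrent threads cannot interleave exports.
_EXPORT_LOCK = threading.Lock()

def export_onnx(model_name, results):
    """Hypothetical export entry point: only one thread at a time
    may run the export body."""
    with _EXPORT_LOCK:
        # Real code would invoke the ONNX exporter here.
        results.append(model_name)

# Many threads exporting concurrently still serialize safely.
results = []
threads = [
    threading.Thread(target=export_onnx, args=(f"model{i}", results))
    for i in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))  # ['model0', 'model1', 'model2', 'model3']
```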

Full Changelog: v8.4.33...v8.4.34
