🌟 Summary
More robust training and downloads. This release skips optimizer updates on non‑finite gradients (NaN/Inf) to prevent crashes and centralizes asset URLs for more reliable tests and dataset downloads. 🛡️🌐
📊 Key Changes
- Training stability
  - Trainer now checks for non‑finite loss before backward; on NaN/Inf it logs a warning and skips the backward pass. ⚠️
  - New safety in `optimizer_step()`: gradient clipping with non‑finite detection and a safe skip of the optimizer update instead of crashing (see the sketch after this list).
  - Keeps scaler updates and gradient zeroing consistent to maintain training flow.
- Asset and download reliability
  - Introduced a single `ASSETS_URL` constant and replaced hardcoded asset links across tests, data utils, benchmarks, and converters.
  - `downloads.is_url()` is now safer and faster (see the second sketch below):
    - Returns False (rather than asserting) on invalid URLs.
    - Uses a quick HEAD request with timeout for existence checks.
  - `safe_download()` now aliases `ASSETS_URL` to the public assets host automatically, simplifying host switching.
- Logging improvements
  - Trainer logs now include the actual loss value when skipping non‑finite batches for easier debugging.
- Version
  - Bumped to `8.3.211`.
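
A minimal sketch of the training safeguard described above, assuming a standard PyTorch AMP loop. The function name, `GradScaler` usage, and `max_norm` value are illustrative assumptions, not the actual Ultralytics `Trainer` code:

```python
import torch


def optimizer_step_sketch(model, optimizer, scaler, loss, max_norm=10.0):
    """Illustrative AMP step that skips non-finite losses and gradients instead of crashing."""
    # Skip the backward pass entirely if the loss is NaN/Inf, logging its value for debugging.
    if not torch.isfinite(loss):
        print(f"WARNING: non-finite loss {loss.item()}, skipping backward pass")
        optimizer.zero_grad()  # assumption: keep gradient buffers in a consistent state
        return

    scaler.scale(loss).backward()

    # Unscale before clipping so the gradient norm is measured in real (unscaled) units.
    scaler.unscale_(optimizer)
    grad_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=max_norm)

    if torch.isfinite(grad_norm):
        scaler.step(optimizer)  # apply the update only when gradients are finite
    else:
        print("WARNING: non-finite gradients detected, skipping optimizer update")

    # Always advance the scaler and zero gradients so the training loop keeps flowing.
    scaler.update()
    optimizer.zero_grad()
```

The key idea is that a non‑finite loss skips `backward()` entirely, while non‑finite gradients skip only the optimizer update, keeping the scaler and gradient state consistent.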
Key PRs:
- Primary: Skip non‑finite gradients in `optimizer_step()` by @glenn-jocher (see PR ultralytics 8.3.211)
- Supporting: Centralize `ASSETS_URL` and improve download robustness (see PR Use ASSETS_URL in tests)
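
And a rough sketch of the safer URL checking described under download reliability, using plain `urllib`. The function signature, timeout default, and `ASSETS_URL` value are placeholders rather than the library's actual implementation, and the `safe_download()` aliasing is not shown:

```python
from urllib import parse, request

# Hypothetical placeholder; the real public assets host may differ.
ASSETS_URL = "https://github.com/ultralytics/assets/releases/download"


def is_url_sketch(url, check=False, timeout=2.0):
    """Return True if `url` is well formed; optionally confirm it exists with a quick HEAD request."""
    try:
        result = parse.urlparse(str(url))
        if not (result.scheme and result.netloc):
            return False  # malformed URL: return False instead of asserting
        if check:
            # Lightweight existence check: HEAD request with a short timeout.
            req = request.Request(url, method="HEAD")
            with request.urlopen(req, timeout=timeout) as response:
                return response.status < 400
        return True
    except Exception:
        return False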
🎯 Purpose & Impact
- Fewer training interruptions: Prevents crashes from NaN/Inf gradients and lets long runs finish more reliably with YOLO11/YOLO26 and other Ultralytics models. 🚦
- Clearer debugging: Warnings explicitly report non‑finite loss, helping pinpoint unstable hyperparameters, data issues, or mixed‑precision edge cases.
- More reliable CI and user setups: Consistent asset hosting via `ASSETS_URL` reduces flaky tests and download issues across environments.
- Backward‑compatible: No user code changes required; behavior is safer by default.
Quick start or upgrade:

```bash
pip install -U ultralytics
```
Tip: If you see warnings about non‑finite loss or gradients, try:
- Lowering the learning rate, reducing augmentation intensity, or disabling mixed precision (`amp=False`).
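
For example, a minimal run assuming the standard `YOLO` Python API; the weights file, dataset, and hyperparameter values below are illustrative only:

```python
from ultralytics import YOLO

# Illustrative settings: lower the initial learning rate and disable AMP
model = YOLO("yolo11n.pt")
model.train(data="coco8.yaml", epochs=50, lr0=0.005, amp=False)
```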
What's Changed
- Use `ASSETS_URL` in tests by @glenn-jocher in #22355
- ultralytics 8.3.211
- Skip non-finite gradients in `optimizer_step()` by @glenn-jocher in #22350
Full Changelog: v8.3.210...v8.3.211