Lightning-AI/pytorch-lightning: Lightning v2.6.1


Changes in 2.6.1

PyTorch Lightning

Added
  • Added method chaining support to LightningModule.freeze() and LightningModule.unfreeze() by returning self (#21469)
  • Added litlogger integration (#21430)
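The new chaining behavior can be sketched without Lightning installed. `TinyModule` below is a hypothetical stand-in for `LightningModule`, not the real class; it only illustrates the return-`self` pattern:

```python
class TinyModule:
    """Toy stand-in for a LightningModule (hypothetical, for illustration)."""

    def __init__(self):
        self.requires_grad = True

    def freeze(self):
        # Disable gradients, then return self so calls can be chained,
        # mirroring the behavior added in #21469.
        self.requires_grad = False
        return self

    def unfreeze(self):
        # Re-enable gradients and return self for chaining.
        self.requires_grad = True
        return self


# Each call returns the module, so the calls compose in one expression.
model = TinyModule().freeze().unfreeze().freeze()
```

Returning `self` keeps the methods usable both as statements (as before) and inline in fluent expressions.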
Deprecated
  • Deprecated to_torchscript method due to deprecation of TorchScript in PyTorch (#21397)
Removed
  • Removed support for Python 3.9 due to end-of-life status (#21398)
Fixed
  • Fixed save_hyperparameters(ignore=...) behavior so subclass ignore rules override base class rules (#21490)
  • Fixed LightningDataModule.load_from_checkpoint to restore the datamodule subclass and hyperparameters (#21478)
  • Fixed ModelParallelStrategy single-file checkpointing when torch.compile wraps the model so optimizer states no longer raise KeyError during save (#21357)
  • Fixed profiler filename sanitization when saving, avoiding crashes due to invalid characters (#21395)
  • Fixed StochasticWeightAveraging with infinite epochs (#21396)
  • Fixed _generate_seed_sequence_sampling function not producing unique seeds (#21399)
  • Fixed ThroughputMonitor callback emitting warnings too frequently (#21453)
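The `save_hyperparameters(ignore=...)` precedence above can be pictured with a toy helper (hypothetical names, not Lightning's internals): when both a base class and a subclass pass an ignore list, the subclass's list is the one applied.

```python
def effective_ignore(base_ignore, subclass_ignore):
    # Toy illustration of the precedence fixed in #21490: a subclass's
    # ignore list, when given, overrides the base class's rather than
    # being discarded in its favor.
    return list(subclass_ignore) if subclass_ignore else list(base_ignore)


def save_hparams(init_kwargs, ignore):
    # Keep every constructor argument not listed in `ignore`.
    return {k: v for k, v in init_kwargs.items() if k not in ignore}


kwargs = {"lr": 0.1, "hidden": 32, "backbone": "resnet"}
ignore = effective_ignore(base_ignore=["lr"], subclass_ignore=["backbone"])
hparams = save_hparams(kwargs, ignore)
# "backbone" is dropped by the subclass rule; "lr" survives because the
# subclass's ignore list, not the base class's, is the one applied.
```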

Lightning Fabric

Added
  • Exposed weights_only argument for loading checkpoints in Fabric.load() and Fabric.load_raw() (#21470)
Fixed
  • Fixed DistributedSamplerWrapper not forwarding set_epoch to the underlying sampler (#21454)
  • Fixed DDP notebook CUDA fork check to allow passive initialization when CUDA is not actively used (#21402)
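The `DistributedSamplerWrapper` fix can be sketched in plain Python (toy classes, not Fabric's implementation): the wrapper must forward `set_epoch` to the sampler it wraps, otherwise the underlying sampler never learns the epoch and reshuffling breaks.

```python
class SamplerWrapper:
    """Toy wrapper illustrating the forwarding fixed in #21454."""

    def __init__(self, sampler):
        self.sampler = sampler

    def set_epoch(self, epoch):
        # Forward to the underlying sampler when it supports set_epoch,
        # so epoch-dependent shuffling actually changes between epochs.
        if hasattr(self.sampler, "set_epoch"):
            self.sampler.set_epoch(epoch)


class InnerSampler:
    """Toy sampler that records the epoch it was told about."""

    def __init__(self):
        self.epoch = 0

    def set_epoch(self, epoch):
        self.epoch = epoch


inner = InnerSampler()
SamplerWrapper(inner).set_epoch(3)  # inner.epoch is now 3
```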

Full commit list: 2.6.0 -> 2.6.1

Contributors


We thank all folks who submitted issues, features, fixes, and doc changes. It's the only way we can collectively make Lightning ⚡ better for everyone. Nice job!

In particular, we would like to thank the authors of the pull requests above.
