Changes in 2.6.1
PyTorch Lightning
Added
- Added method chaining support to `LightningModule.freeze()` and `LightningModule.unfreeze()` by returning `self` (#21469)
- Added `litlogger` integration (#21430)
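The chaining change means `freeze()` and `unfreeze()` now return the module itself instead of `None`. A minimal sketch of the pattern, using a plain-Python stand-in class rather than Lightning's actual implementation:

```python
class Module:
    """Plain-Python stand-in illustrating the return-self chaining pattern;
    not Lightning's implementation."""

    def __init__(self):
        self.requires_grad = True

    def freeze(self):
        """Disable gradients and return self so calls can be chained."""
        self.requires_grad = False
        return self

    def unfreeze(self):
        """Re-enable gradients and return self."""
        self.requires_grad = True
        return self


m = Module()
# Because each call returns self, calls can be chained or used inline:
assert m.freeze() is m
assert m.unfreeze().requires_grad is True
```

With 2.6.1 the same pattern allows one-liners such as `model = MyLitModule.load_from_checkpoint(path).freeze()` (hypothetical usage based on the changelog entry).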
Deprecated
- Deprecated the `to_torchscript` method due to the deprecation of TorchScript in PyTorch (#21397)
Removed
- Removed support for Python 3.9 due to end-of-life status (#21398)
Fixed
- Fixed `save_hyperparameters(ignore=...)` behavior so subclass ignore rules override base class rules (#21490)
- Fixed `LightningDataModule.load_from_checkpoint` to restore the datamodule subclass and hyperparameters (#21478)
- Fixed `ModelParallelStrategy` single-file checkpointing when `torch.compile` wraps the model so optimizer states no longer raise `KeyError` during save (#21357)
- Sanitized profiler filenames when saving to avoid crashes due to invalid characters (#21395)
- Fixed `StochasticWeightAveraging` with infinite epochs (#21396)
- Fixed the `_generate_seed_sequence_sampling` function not producing unique seeds (#21399)
- Fixed the `ThroughputMonitor` callback emitting warnings too frequently (#21453)
Lightning Fabric
Added
- Exposed the `weights_only` argument for loading checkpoints in `Fabric.load()` and `Fabric.load_raw()` (#21470)
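The new keyword mirrors the `weights_only` safety flag of `torch.load`. A hedged sketch of the forwarding pattern, using stand-in functions rather than Fabric's actual code (only the keyword name comes from the changelog entry):

```python
def torch_load_stub(path, weights_only=False):
    """Stand-in for a torch.load-style call: records which mode was requested."""
    return {"path": path, "safe_mode": weights_only}


def fabric_load(path, state=None, weights_only=False):
    """Illustrative stand-in for Fabric.load(): the new keyword is simply
    forwarded to the underlying loader. Not Fabric's implementation."""
    checkpoint = torch_load_stub(path, weights_only=weights_only)
    if state is not None:
        state.update(checkpoint)  # hypothetical restore step
    return checkpoint


ckpt = fabric_load("model.ckpt", weights_only=True)
assert ckpt["safe_mode"] is True
```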
Fixed
Full commit list: 2.6.0 -> 2.6.1
Contributors
New Contributors
- @arrdel made their first contribution in #21402
- @CodeVishal-17 made their first contribution in #21470
- @aditya0by0 made their first contribution in #21478
We thank all folks who submitted issues, features, fixes and doc changes. It's the only way we can collectively make Lightning ⚡ better for everyone, nice job!
In particular, we would like to thank the authors of the pull requests above.