[1.5.3] - 2021-11-24
Fixed
- Fixed `ShardedTensor` state dict hook registration to check if torch distributed is available (#10621)
- Fixed an issue with `self.log` not respecting a tensor's `dtype` when applying computations (#10076)
- Fixed `LightningLite` `_wrap_init` popping non-existing keys from DataLoader signature parameters (#10613)
- Fixed signals being registered within threads (#10610)
- Fixed an issue that caused Lightning to extract the batch size even though it was set by the user in `LightningModule.log` (#10408)
- Fixed `Trainer(move_metrics_to_cpu=True)` not moving the evaluation logged results to CPU (#10631)
- Fixed the `{validation,test}_step` outputs getting moved to CPU with `Trainer(move_metrics_to_cpu=True)` (#10631)
- Fixed an issue with collecting logged test results with multiple dataloaders (#10522)
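The `_wrap_init` fix (#10613) comes down to filtering keyword arguments against the wrapped `__init__`'s actual signature rather than popping keys that may not exist. A minimal stdlib-only sketch of that idea — the `DataLoader` stand-in and the `filter_init_kwargs` helper are hypothetical illustrations, not Lightning's actual code:

```python
import inspect


class DataLoader:
    """Stand-in for torch.utils.data.DataLoader (hypothetical, for illustration)."""

    def __init__(self, dataset, batch_size=1, shuffle=False):
        self.dataset = dataset
        self.batch_size = batch_size
        self.shuffle = shuffle


def filter_init_kwargs(cls, kwargs):
    # Keep only the kwargs that actually appear in cls.__init__'s signature,
    # instead of popping keys that might be absent (which raises KeyError).
    params = inspect.signature(cls.__init__).parameters
    return {k: v for k, v in kwargs.items() if k in params}


captured = {"batch_size": 4, "nonexistent_arg": True}
safe = filter_init_kwargs(DataLoader, captured)
loader = DataLoader([1, 2, 3], **safe)  # only recognized kwargs are passed through
```

Filtering instead of popping keeps the wrapper robust to DataLoader subclasses whose `__init__` signatures differ from the base class.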
Contributors
@ananthsub @awaelchli @carmocca @jiwidi @kaushikb11 @qqueing @rohitgr7 @shabie @tchaton
If we forgot someone due to not matching commit email with GitHub account, let us know :]