## [1.3.2] - 2021-05-18
### Changed

- `DataModule`s now avoid duplicate `{setup,teardown,prepare_data}` calls for the same stage (#7238); a simplified sketch of the per-stage guard follows
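In practice, the change above means each data hook body runs at most once for a given stage, so repeated calls become no-ops. A minimal, self-contained sketch of that idea (the decorator name `run_once_per_stage` and the `_stages_run` attribute are illustrative only, not PyTorch Lightning internals):

```python
import functools


def run_once_per_stage(hook):
    """Skip the wrapped hook when it has already run for the given stage."""

    @functools.wraps(hook)
    def wrapper(self, stage=None):
        seen = getattr(self, "_stages_run", set())  # hypothetical bookkeeping attribute
        key = (hook.__name__, stage)
        if key in seen:
            return None  # duplicate call for this stage: skip
        seen.add(key)
        self._stages_run = seen
        return hook(self, stage)

    return wrapper


class ToyDataModule:
    """Stand-in for a LightningDataModule, only to exercise the guard."""

    @run_once_per_stage
    def setup(self, stage=None):
        print(f"setup body ran for stage={stage!r}")


dm = ToyDataModule()
dm.setup(stage="fit")
dm.setup(stage="fit")   # second call for the same stage is a no-op
dm.setup(stage="test")  # a different stage still runs
```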
### Fixed

- Fixed parsing of multiple training dataloaders (#7433)
- Fixed recursive passing of the `wrong_dtype` keyword argument in `pytorch_lightning.utilities.apply_to_collection` (#7433); see the usage sketch after this list
- Fixed setting correct `DistribType` for `ddp_cpu` (spawn) backend (#7492)
- Fixed incorrect number of calls to the LR scheduler when `check_val_every_n_epoch > 1` (#7032)
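For context on the `wrong_dtype` item above, here is a small usage sketch. The import path (`pytorch_lightning.utilities.apply_func`) and the expectation that `wrong_dtype` is honoured at every nesting level reflect my reading of the fix, not wording from the release:

```python
import numpy as np
import torch
from pytorch_lightning.utilities.apply_func import apply_to_collection

data = {
    "t": torch.ones(2),
    "n": np.ones(2),
    "nested": {"t2": torch.zeros(2), "n2": np.zeros(2)},
}

# Double tensor leaves, but skip anything matching `wrong_dtype`,
# even though ndarrays also match `dtype`. With the fix, this should
# hold for the nested dict as well, not just the top level.
out = apply_to_collection(
    data,
    dtype=(torch.Tensor, np.ndarray),
    function=lambda x: x * 2,
    wrong_dtype=np.ndarray,
)
print(out)
```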
### Contributors
@alanhdu @carmocca @justusschock @tkng
If we forgot someone due to not matching commit email with GitHub account, let us know :]