pytorch-lightning 1.1.3
Standard weekly patch release

[1.1.3] - 2021-01-05

Added

  • Added a check that the optimizer attached to an lr_scheduler is one of the configured optimizers (#5338)
  • Added support for passing non-existing filepaths to resume_from_checkpoint (#4402); see the sketch after this list
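A minimal sketch of the resume_from_checkpoint change in use, assuming the behavior described in #4402 (a checkpoint path that does not exist yet no longer errors, so the same script serves both fresh and resumed runs). TinyModel, the data, and the checkpoint path are hypothetical stand-ins, not from this release:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

# Hypothetical minimal module, for illustration only.
class TinyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

train_loader = DataLoader(
    TensorDataset(torch.randn(64, 32), torch.randn(64, 1)), batch_size=16
)

# Per #4402, this path may not exist yet (e.g. on the very first run);
# training then starts from scratch instead of raising, per the changelog.
trainer = pl.Trainer(max_epochs=1, resume_from_checkpoint="checkpoints/last.ckpt")
trainer.fit(TinyModel(), train_loader)
```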

Changed

  • Skipped restoring from resume_from_checkpoint while testing (#5161)
  • Allowed log_momentum for adaptive optimizers in LearningRateMonitor (#5333); this and the fast_dev_run change below are shown in the sketch after this list
  • Disabled checkpointing, early stopping, and logging with fast_dev_run (#5277)
  • Distributed group defaults to WORLD if None (#5125)
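A short sketch of the log_momentum and fast_dev_run changes, reusing the hypothetical TinyModel and train_loader from the sketch above; the Adam subclass is an illustrative assumption chosen to exercise the adaptive-optimizer path:

```python
import torch
import pytorch_lightning as pl
from pytorch_lightning.callbacks import LearningRateMonitor

# Swap in Adam so that log_momentum hits the adaptive-optimizer path.
class TinyAdamModel(TinyModel):
    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# #5333: log_momentum=True is now accepted for adaptive optimizers such
# as Adam instead of being rejected.
lr_monitor = LearningRateMonitor(logging_interval="step", log_momentum=True)
trainer = pl.Trainer(max_epochs=1, callbacks=[lr_monitor])
trainer.fit(TinyAdamModel(), train_loader)

# #5277: fast_dev_run now disables checkpointing, early stopping, and
# logging, so this one-batch smoke test leaves no artifacts behind.
debug_trainer = pl.Trainer(fast_dev_run=True)
debug_trainer.fit(TinyAdamModel(), train_loader)
```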

Fixed

  • Fixed trainer.test returning non-test metrics (#5214)
  • Fixed metric state reset (#5273)
  • Fixed --num-nodes on DDPSequentialPlugin (#5327)
  • Fixed handling of an invalid value for weights_summary (#5296)
  • Fixed Trainer.test not using the latest best_model_path (#5161); see the sketch after this list
  • Fixed existence check for hparams not using underlying filesystem (#5250)
  • Fixed LightningOptimizer AMP bug (#5191)
  • Fixed _flatten_dict to cast keys to strings (#5354)
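A sketch of the Trainer.test fix in use, building on the hypothetical TinyModel, train_loader, and setup from the first sketch; the test_step and data here are illustrative assumptions:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

# Extends the hypothetical TinyModel above with a test_step.
class TinyTestModel(TinyModel):
    def test_step(self, batch, batch_idx):
        x, y = batch
        self.log("test_loss", torch.nn.functional.mse_loss(self.layer(x), y))

test_loader = DataLoader(
    TensorDataset(torch.randn(32, 32), torch.randn(32, 1)), batch_size=16
)

trainer = pl.Trainer(max_epochs=2)
trainer.fit(TinyTestModel(), train_loader)

# #5161: test() defaults to ckpt_path="best" and now resolves it against
# the checkpoint callback's current best_model_path rather than a value
# cached earlier in training.
trainer.test(test_dataloaders=test_loader)
```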

Contributors

@8greg8, @haven-jeon, @kandluis, @marload, @rohitgr7, @tadejsv, @tarepan, @tchaton

If we forgot someone due to not matching commit email with GitHub account, let us know :]
