github google-deepmind/optax v0.2.2
Optax 0.2.2

What's Changed

  • Added mathematical description to Noisy SGD by @hmludwig in #857
  • Use sphinx-contributors for an automated contributors list. by @fabianp in #841
  • Implementation of the Polyak SGD solver by @copybara-service in #718
  • Document the extra args of the update function in docstring by @copybara-service in #864
  • Utility to set value in a pytree (and so in state) by @copybara-service in #865
  • Added mathematical description to AdaBelief docstring by @hmludwig in #869
  • FIX RST formatting in inject hyperparams by @fabianp in #867
  • Warn that, in optax losses, arguments after the initial (prediction, ground_truth) positional arguments will become keyword-only in a future release by @copybara-service in #863
  • Upstream missing jaxopt losses to optax - Part 2/N by @copybara-service in #872
  • Fix reduce_on_plateau.ipynb build warning "No source code lexer found for notebook cell" by @copybara-service in #875
  • docstring cosmetic improvements by @fabianp in #879
  • Extend capabilities of tree_get, tree_set. by @copybara-service in #878
  • [DOC] Add to the gallery an example on a small language model by @copybara-service in #866
  • Update reduce_on_plateau to handle training average loss. by @copybara-service in #883
  • Fix notebook reduce_on_plateau by @copybara-service in #887
  • ENH: extend power_iteration to accept a matrix in implicit form by @copybara-service in #858
  • Document changes in power_iteration by @copybara-service in #889
  • Release of version 0.2.2 by @copybara-service in #892

Full Changelog: v0.2.1...v0.2.2
