Optax 0.1.1

What's Changed

  • Tweak the meta-learning example from the docs by @copybara-service in #233
  • Fix small bugs in metalearning example. by @copybara-service in #236
  • Do not reuse mini-batches between epochs in DPSGD example. by @copybara-service in #230
  • Make the version of typing_extensions less constrained. by @copybara-service in #238
  • [JAX] move example libraries from jax.experimental to jax.example_libraries by @copybara-service in #200
  • Export ScaleByBeliefState by @NeilGirdhar in #239
  • MultiStep optimizer: align parameter naming and type annotations of the update function with the signature of GradientTransformation.update. by @copybara-service in #243
  • Fix imports of datasets in examples folder. by @copybara-service in #242
  • Enable example tests on github. Fix the bugs that were uncovered. by @copybara-service in #244
  • Formatting. by @copybara-service in #249
  • Add a test for the multi-steps wrapper, verifying that the aggregated gradient is the mean of the input gradients. by @copybara-service in #255
  • MultiStep optimizer wrapper: replace the naive streaming-average gradient implementation with a numerically stabler one. by @copybara-service in #254 (sketched after this list)
  • Added ord, axis, and keepdims args to safe_norm by @copybara-service in #252 (usage sketch below)
  • Add badges and include RTD build into CI tests. by @copybara-service in #256
  • Write a clearer doc-string for GradientTransformation by @copybara-service in #257 (protocol sketched below)
  • Refactor clipping.py by @copybara-service in #260
  • Implement split real norm by @wdphy16 in #241
  • Monkey-patch sphinx to output correct type annotations for the most common cases (e.g. params, opt state) in the documentation. by @copybara-service in #266
  • Improve docs by @copybara-service in #268
  • Implement stateless wrapper. by @n2cholas in #246 (sketch below)
  • Replace _ with params to ensure you can always call init with named args. by @copybara-service in #270 (example below)
  • Improve docs. by @copybara-service in #269
  • Add missing ` in two places. by @copybara-service in #273
  • Add option to cache examples datasets after pre-processing. by @copybara-service in #272
  • Fix an error in README.md rendering. by @copybara-service in #275
  • Remove the old venv directory before testing the package. by @copybara-service in #289
  • Fix Yogi optimizer by @wdphy16 in #288
  • Bump ipython from 7.16.1 to 7.16.3 in /requirements by @dependabot in #286
  • Clarify the mask parameter of optax.adamw. by @copybara-service in #284
  • Fix the link to the complex-valued optim proposal in RTD. by @copybara-service in #295
  • Implement complex norm in optimizers by @wdphy16 in #279 (illustrated below)
  • Change add_noise to match the target variance by scaling by its sqrt. by @Rupt in #294 (example after this list)
  • Minor tweaks to the optax documentation. by @copybara-service in #297
  • Bump version to 0.1.1 from 0.1.0 by @copybara-service in #298
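
The clearer GradientTransformation doc-string (#257) describes the two-function protocol that every optax optimizer follows: init builds the optimizer state from the parameters, and update transforms a pytree of gradients. A minimal sketch of a custom stateless transformation (optax already ships this one as optax.scale; the version below is illustrative only):

```python
import jax
import optax

# A GradientTransformation is a pair of pure functions:
#   init(params) -> state
#   update(updates, state, params=None) -> (new_updates, new_state)
def scale_updates(step_size: float) -> optax.GradientTransformation:

  def init_fn(params):
    del params  # This transformation keeps no state.
    return optax.EmptyState()

  def update_fn(updates, state, params=None):
    del params
    updates = jax.tree_util.tree_map(lambda g: step_size * g, updates)
    return updates, state

  return optax.GradientTransformation(init_fn, update_fn)
```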
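
The MultiStep changes (#254, #255) concern gradient accumulation: the wrapper now maintains a running mean, folding each new gradient in as acc + (g - acc) / n rather than dividing a large accumulated sum at the end, which is numerically stabler. A toy sketch of the mean-aggregation behaviour, assuming the usual optax.MultiSteps API:

```python
import jax.numpy as jnp
import optax

# Apply SGD only every k = 4 calls, using the mean of the 4 gradients.
k = 4
opt = optax.MultiSteps(optax.sgd(learning_rate=0.1), every_k_schedule=k)
params = {'w': jnp.zeros(3)}
state = opt.init(params)

for step in range(k):
  grads = {'w': (step + 1.0) * jnp.ones(3)}  # gradients 1, 2, 3, 4
  updates, state = opt.update(grads, state, params)  # zeros until step k
  params = optax.apply_updates(params, updates)

# mean(1, 2, 3, 4) = 2.5, so each weight moved by -0.1 * 2.5 = -0.25.
print(params['w'])
```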
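
safe_norm (#252) exists because the gradient of a plain norm is NaN at zero; clamping the result below by min_norm keeps it differentiable there. The new arguments mirror jnp.linalg.norm:

```python
import jax.numpy as jnp
import optax

x = jnp.zeros((2, 3))
print(optax.safe_norm(x, min_norm=1e-9))  # full norm, clamped at 1e-9
print(optax.safe_norm(x, 1e-9, ord=2, axis=-1, keepdims=True))  # row norms
```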
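
The stateless wrapper (#246) removes the init/update boilerplate for transformations that keep no optimizer state. A sketch, assuming the wrapper is exported as optax.stateless and takes a function of (updates, params):

```python
import jax
import optax

# Hypothetical example: add a weight-decay term to the updates.
decay = optax.stateless(
    lambda updates, params: jax.tree_util.tree_map(
        lambda g, p: g + 1e-4 * p, updates, params))
```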
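
The init change (#270) names the argument params in every transformation, so keyword calls work uniformly:

```python
import jax.numpy as jnp
import optax

params = {'w': jnp.ones(2)}
state = optax.adam(learning_rate=1e-3).init(params=params)
```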
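
The complex-valued changes (#241, #279) make optimizers precondition on the squared magnitude of complex gradients, which is real and non-negative, rather than on g ** 2, which need not be. An illustrative helper (not the library's public API) showing the quantity involved:

```python
import jax.numpy as jnp

def abs_sq(g):
  # |g|^2 = g * conj(g); taking .real drops the zero imaginary part.
  return (g * g.conj()).real

print(abs_sq(jnp.array([3.0 + 4.0j])))  # [25.]
```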
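
The add_noise fix (#294) scales the Gaussian samples by the square root of the annealed variance eta / (1 + t) ** gamma, so the realised noise variance matches the schedule. A usage sketch:

```python
import optax

tx = optax.chain(
    optax.add_noise(eta=0.01, gamma=0.55, seed=42),
    optax.sgd(learning_rate=0.1),
)
```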

New Contributors

Full Changelog: v0.1.0...v0.1.1
