github google-deepmind/optax v0.2.6
Optax 0.2.6

What's Changed

  • Fix for #1328 by @copybara-service[bot] in #1329
  • Make pip quiet in a notebook by @copybara-service[bot] in #1330
  • Clean up freezing doctests by @rdyro in #1333
  • Fix rendering issue of Freezing in transformations api page in the documentation by @rajasekharporeddy in #1331
  • Remove the reference to optax.transforms in freezing documentation by @rajasekharporeddy in #1334
  • Fix for #1335 by @rdyro in #1336
  • Add Salimans et al. 2017 citation to make_perturbed_fun docstring. by @carlosgmartin in #1325
  • Add tree utility functions. by @carlosgmartin in #1321
  • Add tests to verify cross_entropy_losses accept per-logit masks. by @copybara-service[bot] in #1343
  • Remove reliance on chex.dataclass since it's not supported in newest JAX by @copybara-service[bot] in #1350
  • Add line too long (E501) to optax source code by @rdyro in #1347
  • Simplify code by using new tree.size function. by @carlosgmartin in #1354
  • Enable adaptive gradient clipping for high-dimensional tensors by @aymuos15 in #1340
  • Extend the fromage optimizer to allow a learning rate schedule by @rdyro in #1359
  • Fix ruff to check line-length=80 by @rdyro in #1360
  • Add function tree_allclose. by @carlosgmartin in #1352
  • fix CI failure from line-too-long by @copybara-service[bot] in #1361
  • Fix gradient NaN issues in sigmoid_focal_loss for extreme logits by @leochlon in #1346
  • Internal changes by @copybara-service[bot] in #1367
  • Clean up and fix errors in DoG implementation and documentation. by @carlosgmartin in #1292
  • Trimming the library. by @copybara-service[bot] in #1370
  • Address optimistic_adam interface re-work in the documentation. by @copybara-service[bot] in #1381
  • Small docs fixes by @copybara-service[bot] in #1382
  • Add missing entry for tree_cast_like in utilities.rst. by @carlosgmartin in #1377
  • Remove type hint in test to align with new jax.nn annotations by @copybara-service[bot] in #1385
  • Bump jax version for optax by @copybara-service[bot] in #1392
  • Simplify l2 projection by @copybara-service[bot] in #1394
  • Make init_empty_state public by @copybara-service[bot] in #1395
  • Use OrderedDict in named_chain to preserve transformation order in the state object through jax.jit. by @copybara-service[bot] in #1397
  • Fix hlo equivalence test for abs_sqr, fix broken html links by @copybara-service[bot] in #1404
  • Add pyink config for external PRs (optional) by @copybara-service[bot] in #1409
  • Expose scale by muon mask in the muon alias by @copybara-service[bot] in #1407
  • add segmentation based (dice) loss by @aymuos15 in #1366
  • fix CI by fixing pylint errors by @copybara-service[bot] in #1411
  • Add explanation to Newton Schulz step by @copybara-service[bot] in #1410
  • Fix doctests: add necessary dependency for sphinx-collections by @copybara-service[bot] in #1417
  • Add missing equations to optax.optimistic_gradient_descent. by @carlosgmartin in #1400
  • Fix dtype casting inside tree_add_scale. by @carlosgmartin in #1376
  • Update version number for release. by @copybara-service[bot] in #1419

Full Changelog: v0.2.5...v0.2.6
