Optax 0.1.5 (google-deepmind/optax)


What's Changed

  • Fix arXiv link to Optax Optimistic Gradient Descent optimizer by @8bitmp3 in #458
  • Fix the Yogi optimizer paper year, change link to NeurIPS site by @8bitmp3 in #461
  • Add exponent to cosine decay schedule and warmup + cosine decay by @copybara-service in #476
  • Fix typos in docstring by @pomonam in #480
  • Fix global_norm() signature by @brentyi in #481
  • Fix inject_hyperparams() for python < 3.10. by @copybara-service in #486
  • Fixed NaN issues in kl_divergence loss function by @LukasMut in #473
  • feat(ci/tests): bump setup-python version and enable cache by @SauravMaheshkar in #485
  • Better tests for utils by @acforvs in #465
  • Run Github CI every day at 03:00. by @copybara-service in #490
  • Fix JIT for piecewise_interpolate_schedule, cosine_onecycle_schedule, linear_onecycle_schedule by @brentyi in #504
  • Explicitly export "softmax_cross_entropy_with_integer_labels" by @nasyxx in #499
  • Add the Lion optimizer, discovered by symbolic program search. by @copybara-service in #500
  • Replaces references to jax.numpy.DeviceArray with jax.Array. by @copybara-service in #511
  • Update pytypes. by @copybara-service in #514
  • Fix pytype failures related to teaching pytype about NumPy scalar types. by @copybara-service in #517
  • Release v0.1.5. by @copybara-service in #523


Full Changelog: v0.1.4...v0.1.5
