### Added
- Added pytorch_lightning spark estimator which enables training pytorch_lightning models. (#2713)
- Added NVTX tracing hooks for profiling with Nsight Systems. (#2723)
- Added a generic `num_workers` API for `RayExecutor`. (#2870)
- Supports Ray Client without code changes. (#2882)
- Supports an in-memory cache option for the Keras Estimator. (#2896)
- Added FP16 support for GPU tensors in MXNet. (#2915)
- Added response caching for allgather operations. (#2872)
- Estimator: added petastorm `reader_pool_type` to the constructor. (#2903)
### Changed
- Changed `alltoall` to return the received splits as a second return value if non-uniform splits are sent. (#2631)
- Changed `RayExecutor` to use Ray Placement Groups for worker colocation. (#2824)
- Changed in-memory dataloader usage for the Torch Estimator with the petastorm v0.11.0 release. (#2896)
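The `alltoall` change above is easiest to see with variable-sized sends: the flat received buffer is useless unless the caller also knows how many elements came from each rank. The following is a conceptual, single-process sketch in plain Python — `alltoall_sim` is a hypothetical helper written for illustration, not Horovod's API — showing why the received splits are returned alongside the data:

```python
# Conceptual sketch (plain Python, not Horovod): simulates alltoall with
# non-uniform splits, and shows why the receiver needs the split sizes
# as a second return value.

def alltoall_sim(send_buffers, send_splits):
    """send_buffers[r] is rank r's flat send buffer; send_splits[r][d] is
    how many elements rank r sends to rank d. Returns, for each rank,
    (recv_buffer, recv_splits): the flat received data plus the
    per-source sizes needed to segment it."""
    world = len(send_buffers)
    # Slice each rank's flat buffer into per-destination chunks.
    chunks = []
    for r in range(world):
        offset, rank_chunks = 0, []
        for d in range(world):
            rank_chunks.append(send_buffers[r][offset:offset + send_splits[r][d]])
            offset += send_splits[r][d]
        chunks.append(rank_chunks)
    # Each rank collects the chunk addressed to it from every source rank.
    results = []
    for d in range(world):
        recv, recv_splits = [], []
        for src in range(world):
            recv.extend(chunks[src][d])
            recv_splits.append(send_splits[src][d])
        results.append((recv, recv_splits))
    return results

# Two ranks with non-uniform splits: rank 0 sends 1 element to itself and
# 2 to rank 1; rank 1 sends 3 elements to rank 0 and 1 to itself.
out = alltoall_sim(
    send_buffers=[[10, 20, 21], [30, 31, 32, 40]],
    send_splits=[[1, 2], [3, 1]],
)
recv0, splits0 = out[0]  # rank 0: data [10, 30, 31, 32], splits [1, 3]
recv1, splits1 = out[1]  # rank 1: data [20, 21, 40], splits [2, 1]
```

In Horovod itself, the second return value plays the same role: after a non-uniform `alltoall`, it tells each worker how to partition the flat output tensor back into per-source segments.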