Areas of improvement
- RNN improvements:
  - Refactor RNN layers to rely on atomic RNN cells. This makes the creation of custom RNNs very simple and user-friendly, via the `RNN` base class.
  - Add ability to create new RNN cells by stacking a list of cells, allowing for efficient stacked RNNs.
  - Add `CuDNNLSTM` and `CuDNNGRU` layers, backed by NVIDIA's cuDNN library for fast GPU training & inference.
  - Add RNN Sequence-to-sequence example script.
  - Add `constants` argument in `RNN`'s `call` method, making RNN attention easier to implement.
- Easier multi-GPU data parallelism via `keras.utils.multi_gpu_model`.
- Bug fixes & performance improvements (in particular, native support for NCHW data layout in TensorFlow).
- Documentation and example improvements.
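With the cell-based refactor, a custom recurrent layer only needs a cell object that exposes a `state_size` attribute and a `call(inputs, states)` method; the `RNN` base class handles looping over timesteps. A minimal sketch (written against `tf.keras` for runnability; the cell name and sizes are illustrative):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

class MinimalRNNCell(layers.Layer):
    """A bare-bones cell: output = x @ W + h_prev @ U."""

    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.state_size = units  # size of the recurrent state

    def build(self, input_shape):
        self.kernel = self.add_weight(
            shape=(input_shape[-1], self.units),
            initializer="uniform", name="kernel")
        self.recurrent_kernel = self.add_weight(
            shape=(self.units, self.units),
            initializer="uniform", name="recurrent_kernel")

    def call(self, inputs, states):
        prev_output = states[0]
        output = (tf.matmul(inputs, self.kernel)
                  + tf.matmul(prev_output, self.recurrent_kernel))
        return output, [output]  # (step output, new states)

# The RNN base class unrolls the cell over the time dimension.
layer = layers.RNN(MinimalRNNCell(32))
y = layer(np.zeros((2, 10, 5), dtype="float32"))  # (batch, time, features)
print(y.shape)  # (2, 32): the last step's output
```

The cell only describes one timestep; masking, unrolling, and state handling come for free from the wrapper.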
API changes
- Add "fashion mnist" dataset as `keras.datasets.fashion_mnist.load_data()`.
- Add `Minimum` merge layer as `keras.layers.Minimum` (class) and `keras.layers.minimum(inputs)` (function).
- Add `InceptionResNetV2` to `keras.applications`.
- Support `bool` variables in the TensorFlow backend.
- Add `dilation` to `SeparableConv2D`.
- Add support for dynamic `noise_shape` in `Dropout`.
- Add `keras.layers.RNN()` base class for batch-level RNNs (used to implement custom RNN layers from a cell class).
- Add `keras.layers.StackedRNNCells()` layer wrapper, used to stack a list of RNN cells into a single cell.
- Add `CuDNNLSTM` and `CuDNNGRU` layers.
- Deprecate `implementation=0` for RNN layers.
- The Keras progbar now reports time taken for each past epoch, and average time per step.
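For instance, `StackedRNNCells` composes a list of cells into a single deep cell, which can then be fed to the `RNN` wrapper. A short sketch using built-in `LSTMCell`s (cell sizes are arbitrary, chosen for illustration):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Stack three LSTM cells into one composite cell; the wrapping RNN layer
# then runs the whole stack at each timestep (an efficient stacked RNN).
cells = [layers.LSTMCell(64), layers.LSTMCell(64), layers.LSTMCell(32)]
stacked_cell = layers.StackedRNNCells(cells)
layer = layers.RNN(stacked_cell)

y = layer(np.zeros((4, 12, 8), dtype="float32"))  # (batch, time, features)
print(y.shape)  # (4, 32): width of the last cell in the stack
```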
- Add option to specify the resampling method in `keras.preprocessing.image.load_img()`.
- Add `keras.utils.multi_gpu_model` for easy multi-GPU data parallelism.
- Add `constants` argument in `RNN`'s `call` method, used to pass a list of constant tensors to the underlying RNN cell.
Breaking changes
- Implementation change in `keras.losses.cosine_proximity` results in a different (correct) scaling behavior.
- Implementation change for samplewise normalization in `ImageDataGenerator` results in a different normalization behavior.
Credits
Thanks to our 59 contributors whose commits are featured in this release!
@alok, @Danielhiversen, @Dref360, @HelgeS, @JakeBecker, @MPiecuch, @MartinXPN, @RitwikGupta, @TimZaman, @adammenges, @aeftimia, @ahojnnes, @akshaychawla, @alanyee, @aldenks, @andhus, @apbard, @aronj, @bangbangbear, @bchu, @bdwyer2, @bzamecnik, @cclauss, @colllin, @datumbox, @deltheil, @dhaval067, @durana, @ericwu09, @facaiy, @farizrahman4u, @fchollet, @flomlo, @fran6co, @grzesir, @hgaiser, @icyblade, @jsaporta, @julienr, @jussihuotari, @kashif, @lucashu1, @mangerlahn, @myutwo150, @nicolewhite, @noahstier, @nzw0301, @olalonde, @ozabluda, @patrikerdes, @podhrmic, @qin, @raelg, @roatienza, @shadiakiki1986, @smgt, @souptc, @taehoonlee, @y0z