Highlights
D2L is now runnable on Amazon SageMaker and Google Colab.
New Content
The following chapters are reorganized:
- Natural Language Processing: Pretraining
- Natural Language Processing: Applications
The following sections are added:
- Subword Embedding (Byte-pair encoding)
- Bidirectional Encoder Representations from Transformers (BERT)
- The Dataset for Pretraining BERT
- Pretraining BERT
- Natural Language Inference and the Dataset
- Natural Language Inference: Using Attention
- Fine-Tuning BERT for Sequence-Level and Token-Level Applications
- Natural Language Inference: Fine-Tuning BERT
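Among the new sections, the subword embedding material covers byte-pair encoding (BPE), which builds a subword vocabulary by repeatedly merging the most frequent pair of adjacent symbols. The following is a minimal sketch of that merge loop, not the book's implementation; the toy vocabulary and helper names are illustrative only.

```python
from collections import Counter

def get_pair_counts(vocab):
    # vocab maps a space-separated symbol sequence to its corpus frequency.
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    # Replace each occurrence of the adjacent pair with its concatenation.
    # (A simple substring replace suffices for this single-character toy;
    # a production tokenizer would match symbol boundaries explicitly.)
    a, b = pair
    return {word.replace(f"{a} {b}", a + b): freq
            for word, freq in vocab.items()}

# Toy vocabulary: words pre-split into characters, with corpus frequencies.
vocab = {"l o w": 5, "l o w e r": 2, "n e w e s t": 6, "w i d e s t": 3}
for _ in range(3):  # perform three merge steps
    pairs = get_pair_counts(vocab)
    best = max(pairs, key=pairs.get)  # most frequent adjacent pair
    vocab = merge_pair(best, vocab)
```

After three merges the frequent suffix "est" has been fused into a single subword symbol, while rare words remain split into smaller units.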
Improvements
Many minor revisions and improvements have been made throughout the book.