Chapter 2: Natural Language Processing
The papers covered in this section are:
Recurrent Neural Network based Language Model
Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
Neural Machine Translation by Jointly Learning to Align and Translate
Hierarchical Attention Networks for Document Classification
Connectionist Temporal Classification: Labelling Unsegmented Sequence Data with Recurrent Neural Networks
About Long Short-Term Memory
Attention Is All You Need
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding