Chapter 2: Natural Language Processing

• Recurrent Neural Network based Language Model
• Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
• Neural Machine Translation by Jointly Learning to Align and Translate
• Hierarchical Attention Networks for Document Classification
• Connectionist Temporal Classification: Labelling Unsegmented Sequence Data with Recurrent Neural Networks
• About Long Short Term Memory
• Attention Is All You Need
• BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding