
Deep learning at the text level

Paper lists

Neural Machine Translation

  • Sutskever et al. Sequence-to-sequence learning with neural networks [paper] [notes]

    • End-to-end approach to sequence learning: a multilayer LSTM maps the input sequence to a fixed-size vector, and another LSTM decodes that vector into the target sequence
  • Bahdanau et al. Neural machine translation by jointly learning to align and translate [paper] [notes]

    • Attentional Neural Machine Translation
      • first introduces the term "attention"
      • uses a bidirectional RNN encoder
  • Lee et al. Fully character-level neural machine translation without explicit segmentation [paper] [notes]

    • Presents a character-to-character NMT model that requires no explicit (word-level) segmentation
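The additive attention mechanism from Bahdanau et al. is the common thread of this list: instead of decoding from one fixed vector (as in Sutskever et al.), the decoder scores every bidirectional-encoder state against its previous state and takes a weighted average. A minimal NumPy sketch, with random arrays standing in for the trained encoder annotations and decoder state (all sizes here are toy values, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
H, T = 4, 5  # toy hidden size and source-sentence length

# Stand-ins for learned quantities: bidirectional encoder annotations h_j
# (forward and backward states concatenated) and the previous decoder state.
annotations = rng.standard_normal((T, 2 * H))
s_prev = rng.standard_normal(H)

# Additive attention parameters: e_ij = v^T tanh(W s_{i-1} + U h_j)
W = rng.standard_normal((H, H)) * 0.1
U = rng.standard_normal((H, 2 * H)) * 0.1
v = rng.standard_normal(H)

scores = np.array([v @ np.tanh(W @ s_prev + U @ h) for h in annotations])
alpha = np.exp(scores) / np.exp(scores).sum()  # softmax -> alignment weights
context = alpha @ annotations                  # expected annotation c_i

print(alpha.round(3), context.shape)
```

The context vector `context` (one per decoder step) is what lets the model "align and translate" jointly: `alpha` is a soft alignment over source positions, recomputed at every target step.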

Text classification

  • Kim, Convolutional Neural Networks for Sentence Classification [paper] [notes]

    • Basic implementation of a CNN that takes sentences as input
    • Surprisingly simple, yet achieves strong results
  • Lai et al, Recurrent Convolutional Neural Networks for Text Classification [paper] [notes]

    • A bidirectional RNN captures each word's left and right context, followed by a max-pooling layer
    • debatable whether it should be called RNN + CNN when only a single max-pooling layer is used
  • Zhou et al, Text Classification Improved by Integrating Bidirectional LSTM with Two-dimensional Max Pooling [paper] [notes]
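The three classification papers above differ mainly in how they pool per-position features into one fixed-size vector: Kim and Lai et al. take a max over the time axis only, while Zhou et al. pool 2D windows over both the time and feature axes. A NumPy sketch contrasting the two, with a random matrix standing in for CNN feature maps or BiLSTM hidden states (toy sizes, not the papers'):

```python
import numpy as np

rng = np.random.default_rng(0)
T, F = 6, 8                        # time steps, feature dim (toy values)
M = rng.standard_normal((T, F))    # stand-in for per-position features
                                   # (CNN feature maps or BiLSTM states)

# Max-over-time pooling (Kim, Lai et al.): one max per feature column,
# taken over all sentence positions.
over_time = M.max(axis=0)          # shape (F,)

# 2D max pooling (Zhou et al.): treat M like an image and take the max
# of non-overlapping 2x2 windows across BOTH time and feature axes.
pooled_2d = M.reshape(T // 2, 2, F // 2, 2).max(axis=(1, 3))  # shape (3, 4)

print(over_time.shape, pooled_2d.shape)
```

The 2x2 window size here is illustrative; the point is that 2D pooling preserves some local structure along both axes, whereas max-over-time collapses the entire sentence to one value per feature.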