Neural NLP

CONVOLUTIONAL NEURAL NETS (CNN)

  1. CNN for text - Tal Perry
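
The core idea behind CNNs for text is to slide narrow filters over windows of word embeddings and max-pool over time, producing a fixed-size sentence vector regardless of sentence length. A minimal numpy sketch of that idea (all sizes and values here are toy assumptions, not taken from the linked post):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a sentence of 7 tokens, each mapped to a 4-d embedding.
seq_len, emb_dim = 7, 4
embeddings = rng.normal(size=(seq_len, emb_dim))

# Two convolutional filters of width 3: each looks at 3-token windows.
filter_width, n_filters = 3, 2
filters = rng.normal(size=(n_filters, filter_width, emb_dim))

# 1D convolution over time: slide each filter across token windows.
n_windows = seq_len - filter_width + 1
feature_maps = np.zeros((n_filters, n_windows))
for f in range(n_filters):
    for t in range(n_windows):
        window = embeddings[t:t + filter_width]  # (3, 4) slice of the sentence
        feature_maps[f, t] = np.maximum(0.0, np.sum(window * filters[f]))  # ReLU

# Max-pool over time: each filter keeps its strongest activation, so the
# sentence vector has one entry per filter, independent of sentence length.
sentence_vector = feature_maps.max(axis=1)
print(sentence_vector.shape)  # (2,)
```

In a real model the filters are learned and there are many of them at several widths; the pooled vector then feeds a classifier.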

SEQ2SEQ (SEQUENCE TO SEQUENCE)

  1. Keras blog - char-level and token-level (via an embedding layer) seq2seq models, trained with teacher forcing
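
Teacher forcing, as used in the Keras seq2seq tutorial, means the decoder is fed the ground-truth previous token at each training step rather than its own prediction: the decoder input is simply the target sequence shifted right by one, prefixed with a start token. A tiny illustration with a made-up target sequence:

```python
# Teacher forcing: during training the decoder sees the ground-truth
# previous token, not its own prediction.
START, END = "<s>", "</s>"

# Hypothetical target sequence for one training example.
target = ["bonjour", "le", "monde", END]

# Decoder input = target shifted right by one, prefixed with the start token.
decoder_input = [START] + target[:-1]

for inp, out in zip(decoder_input, target):
    print(f"feed {inp!r:12} -> predict {out!r}")
```

At inference time there is no ground truth to feed, so the decoder consumes its own previous prediction instead, one token at a time, until it emits the end token.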

  1. Incorporating Copying Mechanism in Sequence-to-Sequence Learning - incorporates copying into neural encoder-decoder (Seq2Seq) learning and proposes CopyNet, which integrates the regular word-generation path in the decoder with a copying mechanism that can select sub-sequences of the input sequence and place them at the proper positions in the output sequence.
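
The essence of the copy mechanism is that the decoder's output distribution is defined over an extended vocabulary (fixed vocab plus the source tokens), mixing a generation distribution with an attention-like copy distribution over source positions; this lets the model emit out-of-vocabulary words that appear in the input. A simplified numpy sketch of one decoding step (the scores, the explicit mixing gate, and the tokens are illustrative assumptions; CopyNet itself normalizes the two score sets jointly rather than using a separate gate):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy vocab and a source sentence containing an out-of-vocabulary name.
vocab = ["hello", ",", "my", "name", "is", "<unk>"]
source_tokens = ["hi", ",", "i", "am", "Tony"]  # "Tony" is not in vocab

# Hypothetical decoder scores at one step (in a real model these come from
# the decoder state and attention over the encoder states).
generate_scores = np.array([0.1, 0.2, 0.1, 0.3, 0.2, 0.5])  # over the vocab
copy_scores = np.array([0.1, 0.1, 0.1, 0.2, 3.0])           # over source positions

p_gen = softmax(generate_scores)
p_copy = softmax(copy_scores)
gate = 0.4  # illustrative mixing weight between generating and copying

# Extended vocabulary: mass for a word sums over both paths, so a token
# appearing in the source AND the vocab (like ",") accumulates from each.
extended = {w: 0.0 for w in vocab + source_tokens}
for w, p in zip(vocab, gate * p_gen):
    extended[w] += p
for w, p in zip(source_tokens, (1 - gate) * p_copy):
    extended[w] += p

best = max(extended, key=extended.get)
print(best)  # the OOV source token "Tony" wins via the copy path
```

Because the copy distribution puts almost all its mass on the source position holding "Tony", the model can output that name even though it has no entry in the fixed vocabulary.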
