MANNs4NMT-master
Memory Augmented Neural Networks for Machine Translation

Neural Turing Machine Style Attention and Memory Augmented Decoders are both extensions to the attentional encoder-decoder. We find that neither extension improves translation quality over the attentional encoder-decoder for the language pairs tested. The Pure MANN model is a departure from the attentional encoder-decoder. We find that the Pure MANN model performs on par with the attentional encoder-decoder on the Vietnamese-to-English task and ~2 BLEU worse on the Romanian-to-English task.
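For readers unfamiliar with the memory read at the heart of these models, the sketch below shows standard NTM-style content-based addressing: the decoder emits a key, compares it to every memory slot by cosine similarity, sharpens the scores with a strength scalar, and reads a weighted sum of slots. This is a minimal NumPy illustration under those standard assumptions, not this repository's implementation; the function and variable names are illustrative.

```python
import numpy as np

def content_read(memory, key, beta=1.0):
    """Content-based memory read (NTM-style).

    memory: (num_slots, slot_dim) external memory matrix
    key:    (slot_dim,) query emitted by the decoder
    beta:   sharpening strength for the address distribution
    """
    # Cosine similarity between the key and each memory slot.
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    # Softmax over slots, sharpened by beta, gives the read weights.
    weights = np.exp(beta * sims)
    weights /= weights.sum()
    # The read vector (weighted sum of slots) is fed back into the decoder.
    return weights @ memory, weights
```

In the attentional encoder-decoder the "memory" is simply the encoder states; the memory-augmented variants instead read from (and, in the decoder case, write to) a separate memory that persists across decoding steps.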