Journal
IEEE/ACM Transactions on Audio, Speech, and Language Processing
Volume 27, Issue 12, Pages 1970-1984
Publisher
IEEE-Inst Electrical Electronics Engineers Inc
DOI: 10.1109/TASLP.2019.2937190
Keywords
Sentence-level Context; Latent Topic Representation; Convolutional Neural Network; Neural Machine Translation
Funding
- National Key Technologies R&D Program of China [2017YFB1002102]
- JSPS KAKENHI [19H05660]
- JSPS [19K20354]
- NICT tenure-track researcher startup fund "Toward Intelligent Machine Translation"
- Grants-in-Aid for Scientific Research [19K20354, 19H05660] Funding Source: KAKEN
Abstract
Traditional neural machine translation (NMT) methods use the word-level context to predict target-language translations while neglecting the sentence-level context, which has been shown to benefit translation prediction in statistical machine translation. This paper represents the sentence-level context as latent topic representations by using a convolutional neural network, and designs a topic attention mechanism to integrate source sentence-level topic context information into both attention-based and Transformer-based NMT. In particular, our method can improve the performance of NMT by modeling source topics and translations jointly. Experiments on the large-scale LDC Chinese-to-English translation tasks and the WMT'14 English-to-German translation task show that the proposed approach achieves significant improvements over baseline systems.
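To make the two ingredients of the abstract concrete, the following is a minimal NumPy sketch of (a) a 1-D convolution over source word embeddings that produces latent topic vectors and (b) a topic attention that weights those vectors against a decoder state. This is an illustrative assumption, not the authors' implementation: the function names, the tanh activation, the bilinear scoring matrix `W`, and all dimensions are invented for the example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def conv_topic_representations(embeddings, filters, width=3):
    """Slide a width-`width` convolution window over the source
    embeddings; each window position yields one latent topic vector.
    embeddings: (T, d) source word embeddings
    filters:    (k, width*d) convolution filters (k topic dimensions)
    returns:    (T - width + 1, k) topic representations
    """
    T, d = embeddings.shape
    topics = []
    for t in range(T - width + 1):
        window = embeddings[t:t + width].reshape(-1)  # (width*d,)
        topics.append(np.tanh(filters @ window))      # (k,)
    return np.stack(topics)

def topic_attention(decoder_state, topics, W):
    """Bilinear topic attention: score each topic vector against the
    current decoder state and return the weighted topic context vector.
    decoder_state: (h,), topics: (n, k), W: (k, h) -> context (k,)
    """
    scores = topics @ (W @ decoder_state)  # (n,)
    weights = softmax(weights_input := scores)
    return weights @ topics

# Toy dimensions, chosen only for the demonstration.
rng = np.random.default_rng(0)
T, d, k, h, width = 7, 16, 8, 12, 3
embeddings = rng.normal(size=(T, d))
filters = 0.1 * rng.normal(size=(k, width * d))
topics = conv_topic_representations(embeddings, filters, width)
decoder_state = rng.normal(size=h)
W = 0.1 * rng.normal(size=(k, h))
topic_context = topic_attention(decoder_state, topics, W)
```

In the paper's setting, `topic_context` would be fed into the NMT decoder alongside the usual word-level attention context, so that source topics and translations are modeled jointly.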