Article

Neural Machine Translation With Sentence-Level Topic Context

Journal

IEEE/ACM Transactions on Audio, Speech, and Language Processing

Publisher

IEEE - Institute of Electrical and Electronics Engineers, Inc.
DOI: 10.1109/TASLP.2019.2937190

Keywords

Sentence-level Context; Latent Topic Representation; Convolutional Neural Network; Neural Machine Translation

Funding

  1. National Key Technologies R&D Program of China [2017YFB1002102]
  2. JSPS KAKENHI [19H05660]
  3. JSPS [19K20354]
  4. NICT tenure-track researcher startup fund "Toward Intelligent Machine Translation"
  5. Grants-in-Aid for Scientific Research [19K20354, 19H05660] (funding source: KAKEN)

Abstract

Traditional neural machine translation (NMT) methods use the word-level context to predict target-language translations while neglecting the sentence-level context, which has been shown to be beneficial for translation prediction in statistical machine translation. This paper represents the sentence-level context as latent topic representations by using a convolutional neural network, and designs a topic attention mechanism to integrate source sentence-level topic context information into both attention-based and Transformer-based NMT. In particular, our method can improve the performance of NMT by modeling source topics and translations jointly. Experiments on the large-scale LDC Chinese-to-English translation tasks and the WMT'14 English-to-German translation task show that the proposed approach achieves significant improvements over baseline systems.
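The abstract describes two components: a convolutional network that condenses a source sentence into latent topic representations, and a topic attention that feeds this sentence-level context into the decoder. The sketch below is a minimal PyTorch illustration of that idea, not the authors' implementation; the module names (TopicCNN, TopicAttention), the number of topics, the pooling scheme, and all dimensions are our own assumptions.

```python
# Hedged sketch of the idea in the abstract, assuming a CNN-based topic encoder
# and a dot-product topic attention; not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopicCNN(nn.Module):
    """Derive K latent topic vectors from source word embeddings with a 1-D CNN."""
    def __init__(self, emb_dim: int, topic_dim: int, num_topics: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, topic_dim, kernel_size, padding=kernel_size // 2)
        # One learned query per topic slot, used to pool the convolved sequence.
        self.topic_queries = nn.Parameter(torch.randn(num_topics, topic_dim))

    def forward(self, src_emb: torch.Tensor) -> torch.Tensor:
        # src_emb: (batch, src_len, emb_dim)
        h = torch.relu(self.conv(src_emb.transpose(1, 2))).transpose(1, 2)  # (B, L, topic_dim)
        scores = torch.einsum("kd,bld->bkl", self.topic_queries, h)         # (B, K, L)
        weights = F.softmax(scores, dim=-1)
        return torch.bmm(weights, h)                                        # (B, K, topic_dim)


class TopicAttention(nn.Module):
    """Attend over the topic vectors given the current decoder state."""
    def __init__(self, dec_dim: int, topic_dim: int):
        super().__init__()
        self.query_proj = nn.Linear(dec_dim, topic_dim)

    def forward(self, dec_state: torch.Tensor, topics: torch.Tensor) -> torch.Tensor:
        # dec_state: (batch, dec_dim), topics: (batch, K, topic_dim)
        q = self.query_proj(dec_state).unsqueeze(1)                         # (B, 1, topic_dim)
        scores = torch.bmm(q, topics.transpose(1, 2)) / topics.size(-1) ** 0.5
        weights = F.softmax(scores, dim=-1)                                 # (B, 1, K)
        return torch.bmm(weights, topics).squeeze(1)                        # (B, topic_dim)


if __name__ == "__main__":
    B, L, E, D, K = 2, 7, 64, 128, 4
    topic_cnn = TopicCNN(emb_dim=E, topic_dim=D, num_topics=K)
    topic_attn = TopicAttention(dec_dim=D, topic_dim=D)
    topics = topic_cnn(torch.randn(B, L, E))     # sentence-level topic representations
    ctx = topic_attn(torch.randn(B, D), topics)  # topic context for one decoder step
    print(topics.shape, ctx.shape)               # (2, 4, 128) and (2, 128)
```

In an attention-based or Transformer-based NMT model, a context vector like `ctx` would typically be concatenated with (or gated into) the decoder's word-level context before predicting the next target word; how exactly the paper fuses the two contexts is not specified in the abstract.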

