
Publication:

Selecting Informative Context Sentence by Forced Back-Translation

Ryuichiro Kimura, Shohei Iida, Hongyi Cui, Po-Hsuan Hung, T. Utsuro, M. Nagata • Machine Translation Summit • 2019

TLDR: It is found that, if one could appropriately select the most informative context sentence for a given input source sentence, translation accuracy could be boosted by approximately 10 BLEU points.

Citations: 5
Abstract: This paper first explores the upper bound of context-based neural machine translation and attempts to utilize previously unused context information. We found that, if we could appropriately select the most informative context sentence for a given input source sentence, we could boost translation accuracy by approximately 10 BLEU points. The paper then explores a criterion for selecting the most informative context sentences, i.e., those that give the highest BLEU score. Under the proposed criterion, the context sentences that yield the highest forced back-translation probability when back-translating into the source sentence are selected. Experimental results with Japanese and English parallel sentences from the OpenSubtitles2018 corpus demonstrate that, when a context window of five preceding and five subsequent sentences is examined, the proposed approach achieves significant improvements of 0.74 (Japanese to English) and 1.14 (English to Japanese) BLEU points over the baseline 2-to-2 model, while the oracle translation achieves upper-bound improvements of 5.88 (Japanese to English) and 9.10 (English to Japanese) BLEU points.
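The selection criterion in the abstract can be sketched as a simple argmax: score each candidate context sentence by the back-translation model's forced-decoding probability of regenerating the source sentence, and pick the highest-scoring one. The sketch below is a minimal illustration, not the paper's implementation; `token_logprob` is a hypothetical stand-in for a real back-translation NMT model's per-token scores.

```python
import math

def forced_backtranslation_logprob(context, source, token_logprob):
    """Stand-in for forced decoding with a back-translation model:
    sum the log-probabilities of regenerating each token of `source`
    given the candidate `context`. `token_logprob` is a hypothetical
    callable replacing the real NMT model."""
    return sum(token_logprob(context, source, i)
               for i in range(len(source.split())))

def select_context(candidates, source, token_logprob):
    """Select the candidate context sentence with the highest forced
    back-translation probability of the source sentence (argmax)."""
    return max(candidates,
               key=lambda c: forced_backtranslation_logprob(c, source, token_logprob))

# Toy scoring function for illustration only: tokens that also appear
# in the context are treated as more probable under "back-translation".
def toy_logprob(context, source, i):
    tok = source.split()[i]
    return math.log(0.9) if tok in context.split() else math.log(0.1)

candidates = ["he went home", "the cat sat", "she ate dinner"]
best = select_context(candidates, "he went home early", toy_logprob)
# best == "he went home"
```

In the paper's setting, the candidates would be the five preceding and five subsequent sentences of the source, and the scorer a trained target-to-source NMT model.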
