
Document-Level Neural Machine Translation With Recurrent Context States


Integrating contextual information into sentence-level neural machine translation (NMT) systems has proven effective in generating fluent and coherent translations. However, taking too much context into account slows these systems down, especially when context-aware models are applied on the decoder side. To improve efficiency, we propose a simple and fast method that encodes all sentences in an arbitrarily large context window. It builds contextual representations during the translation of each sentence, so the overhead introduced by the context model is almost negligible. We evaluate our method on three widely used English-German document-level translation datasets and obtain substantial improvements over the sentence-level baseline with almost no loss in efficiency. Moreover, our method achieves performance comparable to previous strong context-aware baselines while speeding up inference by 1.53×, and the speed-up grows as more context is taken into account. On the ContraPro pronoun translation dataset, it significantly outperforms the strong baseline.
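The core efficiency idea in the abstract — maintaining a recurrent context state that is updated once per sentence, so that translating each new sentence reuses a cached summary of the document rather than re-encoding the full context window — can be sketched as follows. The exponential-moving-average recurrence and all names here are illustrative assumptions for a toy setting, not the paper's actual learned model:

```python
def update_context_state(state, sentence_repr, alpha=0.5):
    """Fold one sentence representation into the running context state.

    A hypothetical recurrence (exponential moving average); the paper's
    method uses a learned recurrent context model instead.
    """
    return [alpha * s + (1.0 - alpha) * x for s, x in zip(state, sentence_repr)]

d = 4                                   # toy hidden dimension
state = [0.0] * d                       # context state before any sentence
document = [[1.0] * d, [2.0] * d]       # toy per-sentence representations
cached_context = []

for sent_repr in document:
    # The context available when translating this sentence is the cached
    # state, so no earlier sentence is ever re-encoded.
    cached_context.append(list(state))
    state = update_context_state(state, sent_repr)
```

Because each sentence triggers only one constant-cost state update, the per-sentence overhead stays flat no matter how large the context window grows, which is why the abstract reports a larger speed-up with more context.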

Keywords: document-level neural machine translation; context; translation

Journal Title: IEEE Access
Year Published: 2023



