Title

A Comprehensive Survey of Multilingual Neural Machine Translation

Authors

Raj Dabre, Chenhui Chu, Anoop Kunchukuttan

Abstract

We present a survey on multilingual neural machine translation (MNMT), which has gained a lot of traction in the recent years. MNMT has been useful in improving translation quality as a result of translation knowledge transfer (transfer learning). MNMT is more promising and interesting than its statistical machine translation counterpart because end-to-end modeling and distributed representations open new avenues for research on machine translation. Many approaches have been proposed in order to exploit multilingual parallel corpora for improving translation quality. However, the lack of a comprehensive survey makes it difficult to determine which approaches are promising and hence deserve further exploration. In this paper, we present an in-depth survey of existing literature on MNMT. We first categorize various approaches based on their central use-case and then further categorize them based on resource scenarios, underlying modeling principles, core-issues and challenges. Wherever possible we address the strengths and weaknesses of several techniques by comparing them with each other. We also discuss the future directions that MNMT research might take. This paper is aimed towards both, beginners and experts in NMT. We hope this paper will serve as a starting point as well as a source of new ideas for researchers and engineers interested in MNMT.
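As a concrete illustration of the kind of multilingual setup the abstract alludes to, below is a minimal Python sketch (not taken from the paper) of one widely used approach the survey covers: training a single shared NMT model on several language pairs by prepending a target-language token to each source sentence, so that transfer can occur across languages. The corpora, token format, and helper names here are illustrative assumptions.

# Minimal sketch (assumption, not the paper's method): preparing multilingual
# training data for one shared encoder-decoder by prepending a target-language
# token to every source sentence.

def tag_example(src_sentence: str, tgt_lang: str) -> str:
    """Prepend a target-language token, e.g. '<2fr> Hello world' for English->French."""
    return f"<2{tgt_lang}> {src_sentence}"

# Hypothetical parallel data: (source sentence, target sentence, target language code).
corpora = [
    ("Hello world", "Bonjour le monde", "fr"),
    ("Hello world", "Hallo Welt", "de"),
    ("Good morning", "Guten Morgen", "de"),
]

# Mixing all pairs into one training set is what allows the knowledge transfer
# (transfer learning) across languages that the abstract refers to.
training_set = [(tag_example(src, lang), tgt) for src, tgt, lang in corpora]

for src, tgt in training_set:
    print(src, "->", tgt)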
