
Mirror-Generative Neural Machine Translation | NLP Journal Club

Paper link: https://openreview.net/forum?id=HkxQRTNYPH

Abstract: Training neural machine translation (NMT) models requires large parallel corpora, which are scarce for many language pairs. However, raw non-parallel corpora are often easy to obtain. Existing approaches have not exploited the full potential of non-parallel bilingual data, either in training or in decoding. In this paper, we propose mirror-generative NMT (MGNMT), a single unified architecture that simultaneously integrates the source-to-target translation model, the target-to-source translation model, and two language models. The translation models and language models share the same latent semantic space, so both translation directions can learn from non-parallel data more effectively. Moreover, the translation models and language models can collaborate during decoding. Our experiments show that MGNMT consistently outperforms existing approaches across a variety of scenarios and language pairs, including both resource-rich and low-resource languages.
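The "mirror" design described in the abstract can be sketched with one equation. The following is a reading of that description, not a quote from the paper, and the notation is assumed: with z the shared latent semantic variable, x a source sentence, and y a target sentence, the four components are tied together symmetrically as

\log p(x, y \mid z) = \frac{1}{2}\left[\, \log p(y \mid x, z) + \log p(x \mid y, z) + \log p(y \mid z) + \log p(x \mid z) \,\right]

where p(y|x,z) and p(x|y,z) are the source-to-target and target-to-source translation models, and p(y|z) and p(x|z) are the target and source language models. Under this reading, the "collaboration during decoding" amounts to scoring candidate translations with both the relevant translation model and the matching language model, conditioned on the same shared z.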

Keywords: neural machine translation, generative model, mirror

Subscribe for more upcoming NLP videos!

Video "Mirror-Generative Neural Machine Translation | NLP Journal Club" from the channel The NLP Lab.
Video information: published March 12, 2020, 19:10:02; duration 00:09:58.