Prompting Neural Machine Translation with Translation Memories

Authors

Abstract

Improving machine translation (MT) systems with translation memories (TMs) is of great interest to practitioners in the MT community. However, previous approaches require either a significant update of the model architecture and/or additional training efforts to make the models well-behaved when TMs are taken as input. In this paper, we present a simple but effective method to introduce TMs into neural machine translation (NMT) systems. Specifically, we treat TMs as prompts to the NMT model at test time, but leave the training process unchanged. The result is a slight update of an existing NMT system, which can be implemented in a few hours by anyone who is familiar with NMT. Experimental results on several datasets demonstrate that our system significantly outperforms strong baselines.
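As a rough illustration of the test-time idea described in the abstract (not the authors' implementation), the sketch below seeds an off-the-shelf Marian NMT model's decoder with a retrieved TM target and lets beam search continue from that prompt, with no change to training. The model name, the prompt format, and the translate_with_tm_prompt helper are assumptions made for this example only; the paper's exact prompting scheme may differ.

import torch
from transformers import MarianMTModel, MarianTokenizer

# Illustrative checkpoint (assumption): any Marian-style NMT model works the same way.
model_name = "Helsinki-NLP/opus-mt-de-en"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

def translate_with_tm_prompt(source: str, tm_target: str) -> str:
    """Translate `source`, seeding the decoder with a retrieved TM target as a prompt."""
    enc = tokenizer(source, return_tensors="pt")

    # Decoder prefix = <decoder_start> + TM target tokens (no EOS),
    # so beam search continues from the TM translation instead of from scratch.
    prefix = tokenizer(tm_target, add_special_tokens=False, return_tensors="pt").input_ids
    start = torch.tensor([[model.config.decoder_start_token_id]])
    decoder_input_ids = torch.cat([start, prefix], dim=-1)

    out = model.generate(**enc, decoder_input_ids=decoder_input_ids,
                         num_beams=4, max_new_tokens=64)
    return tokenizer.decode(out[0], skip_special_tokens=True)

# Example call with a hypothetical fuzzy TM match retrieved for the input sentence.
print(translate_with_tm_prompt("Der Vertrag tritt am 1. Januar in Kraft.",
                               "The contract enters into force on"))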


Similar Articles

Combining Statistical Machine Translation and Translation Memories with Domain Adaptation

Since the emergence of translation memory software, translation companies and freelance translators have been accumulating translated text for various languages and domains. This data has the potential of being used for training domain-specific machine translation systems for corporate or even personal use. But while the resulting systems usually perform well in translating domain-specific lang...


Linking Translation Memories with Example-Based Machine Translation

The paper reports on experiments which compare the translation outcome of three corpus-based MT systems, a string-based translation memory (STM), a lexeme-based translation memory (LTM) and the example-based machine translation (EBMT) system EDGAR. We use a fully automatic evaluation method to compare the outcome of each MT system and discuss the results. We investigate the benefits for the link...


Neural Name Translation Improves Neural Machine Translation

In order to control computational complexity, neural machine translation (NMT) systems convert all rare words outside the vocabulary into a single unk symbol. A previous solution (Luong et al., 2015) resorts to using multiple numbered unks to learn the correspondence between source and target rare words. However, test words unseen in the training corpus cannot be handled by this method. And it a...


Neural Machine Translation with Reconstruction

Although end-to-end Neural Machine Translation (NMT) has achieved remarkable progress in the past two years, it suffers from a major drawback: translations generated by NMT systems often lack adequacy. It has been widely observed that NMT tends to repeatedly translate some source words while mistakenly ignoring other words. To alleviate this problem, we propose a novel encoder-decoder-recons...


Can neural machine translation do simultaneous translation?

We investigate the potential of attention-based neural machine translation in simultaneous translation. We introduce a novel decoding algorithm, called simultaneous greedy decoding, that allows an existing neural machine translation model to begin translating before a full source sentence is received. This approach is unique from previous works on simultaneous translation in that segmentation a...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i11.26585