Target-Side Augmentation for Document-Level Machine Translation

Guangsheng Bao, Zhiyang Teng, Yue Zhang

Abstract
Document-level machine translation faces the challenge of data sparsity: inputs are long and training data is scarce, which increases the risk of learning spurious patterns. To address this challenge, we propose a target-side augmentation method, introducing a data augmentation (DA) model that generates many potential translations for each source document. By learning on this wider range of translations, an MT model can fit a smoothed distribution and thereby reduce the risk of data sparsity. We demonstrate that the DA model, which estimates the posterior distribution over translations, substantially improves MT performance, outperforming the previous best system by 2.30 s-BLEU on News and achieving a new state of the art on the News and Europarl benchmarks.
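The recipe described in the abstract can be sketched in a few lines: sample many candidate translations per source document from a DA model, then train the MT model on the enlarged set. The sketch below is a minimal illustration, not the authors' implementation; the Hugging Face checkpoint, the augment helper, K, and the sampling settings are all assumptions, and the paper's DA model additionally conditions on the observed reference translation to estimate the posterior.

# Minimal sketch of target-side augmentation (illustrative, not the
# authors' code). A generic seq2seq checkpoint stands in for the paper's
# DA model: for each source document we sample K candidate translations,
# then pair every sample with its source to enlarge the training set.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

K = 8  # assumed number of sampled translations per source document
CKPT = "Helsinki-NLP/opus-mt-en-de"  # placeholder checkpoint, not the paper's

tokenizer = AutoTokenizer.from_pretrained(CKPT)
da_model = AutoModelForSeq2SeqLM.from_pretrained(CKPT)

def augment(source_docs):
    """Return (source, sampled_translation) training pairs."""
    pairs = []
    for src in source_docs:
        inputs = tokenizer(src, return_tensors="pt", truncation=True)
        # Ancestral sampling yields diverse targets; the paper's DA model
        # instead samples from a posterior conditioned on both the source
        # and the observed reference translation.
        outputs = da_model.generate(
            **inputs,
            do_sample=True,
            top_p=0.95,
            num_return_sequences=K,
            max_new_tokens=256,
        )
        for hyp in tokenizer.batch_decode(outputs, skip_special_tokens=True):
            pairs.append((src, hyp))
    return pairs

# The MT model is then trained on the union of the original pairs and the
# sampled pairs, so it fits a smoothed target distribution.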
Anthology ID: 2023.acl-long.599
Original: 2023.acl-long.599v1
Version 2: 2023.acl-long.599v2
Version 3: 2023.acl-long.599v3
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 10725–10742
URL: https://aclanthology.org/2023.acl-long.599
DOI: 10.18653/v1/2023.acl-long.599
Cite (ACL): Guangsheng Bao, Zhiyang Teng, and Yue Zhang. 2023. Target-Side Augmentation for Document-Level Machine Translation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 10725–10742, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Target-Side Augmentation for Document-Level Machine Translation (Bao et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-long.599.pdf
Video: https://aclanthology.org/2023.acl-long.599.mp4