Paper Title

Multi-Granularity Optimization for Non-Autoregressive Translation

Authors

Yafu Li, Leyang Cui, Yongjing Yin, Yue Zhang

Abstract

Despite low latency, non-autoregressive machine translation (NAT) suffers severe performance deterioration due to the naive independence assumption. This assumption is further strengthened by cross-entropy loss, which encourages a strict match between the hypothesis and the reference token by token. To alleviate this issue, we propose multi-granularity optimization for NAT, which collects model behaviors on translation segments of various granularities and integrates feedback for backpropagation. Experiments on four WMT benchmarks show that the proposed method significantly outperforms the baseline models trained with cross-entropy loss, and achieves the best performance on WMT'16 En-Ro and highly competitive results on WMT'14 En-De for fully non-autoregressive translation.
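The abstract describes an objective that evaluates translation segments at several granularities and integrates the aggregated feedback through backpropagation, in contrast to token-by-token cross-entropy. The sketch below is a minimal illustration of that general idea, not the paper's implementation: the granularity set (1, 2, 4), the unigram-overlap segment_score function, and the (1 - similarity) weighting scheme are all assumptions chosen only to make the example runnable.

```python
# Minimal, illustrative sketch (not the authors' algorithm): contrasts
# token-level cross-entropy with a loss that aggregates log-probabilities over
# segments of several granularities and weights each segment by a similarity
# score against the reference. Granularities, scorer, and shapes are assumed.
import torch
import torch.nn.functional as F

def token_cross_entropy(logits, reference):
    # Standard NAT training signal: strict token-by-token matching.
    # logits: (T, V) per-position distributions; reference: (T,) token ids.
    return F.cross_entropy(logits, reference)

def segment_score(hyp_ids, ref_ids):
    # Toy segment-level similarity (unigram overlap); a stand-in for whatever
    # metric is actually used to judge a translation segment.
    hyp, ref = hyp_ids.tolist(), ref_ids.tolist()
    return len(set(hyp) & set(ref)) / max(len(ref), 1)

def multi_granularity_loss(logits, reference, granularities=(1, 2, 4)):
    # Collect feedback on segments of several sizes and fold it into one
    # differentiable objective: each segment's log-likelihood is weighted by
    # (1 - similarity), so poorly matched segments contribute larger gradients.
    T = logits.size(0)
    log_probs = F.log_softmax(logits, dim=-1)     # (T, V)
    pred = logits.argmax(dim=-1)                  # greedy hypothesis tokens
    total, count = logits.new_zeros(()), 0
    for g in granularities:
        for start in range(0, T, g):
            end = min(start + g, T)
            seg_lp = log_probs[start:end].gather(
                1, reference[start:end].unsqueeze(1)).sum()
            weight = 1.0 - segment_score(pred[start:end], reference[start:end])
            total = total - weight * seg_lp
            count += 1
    return total / count

# Tiny usage example with random "model outputs".
torch.manual_seed(0)
logits = torch.randn(8, 100, requires_grad=True)  # 8 positions, vocab of 100
reference = torch.randint(0, 100, (8,))
print(token_cross_entropy(logits, reference).item())
loss = multi_granularity_loss(logits, reference)
loss.backward()  # segment-level feedback flows back through the model
```

The design point the sketch tries to convey is only that supervision need not be tied to exact per-token matches; grouping positions into segments and scoring each segment as a unit relaxes the strict alignment that cross-entropy imposes.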
