Paper Title


Towards Faithful Neural Table-to-Text Generation with Content-Matching Constraints

Authors

Zhenyi Wang, Xiaoyang Wang, Bang An, Dong Yu, Changyou Chen

Abstract


Text generation from a knowledge base aims to translate knowledge triples into natural language descriptions. Most existing methods ignore the faithfulness between a generated text description and the original table, leading to generated information that goes beyond the content of the table. In this paper, for the first time, we propose a novel Transformer-based generation framework to achieve faithful generation. The core techniques in our method to enforce faithfulness include a new table-text optimal-transport matching loss and a table-text embedding similarity loss, both built on the Transformer model. Furthermore, to evaluate faithfulness, we propose a new automatic metric specialized to the table-to-text generation problem. We also provide a detailed analysis of each component of our model in our experiments. Automatic and human evaluations show that our framework outperforms state-of-the-art methods by a large margin.
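The abstract's table-text optimal-transport matching loss can be illustrated with a small sketch. The paper's exact formulation is not reproduced here; the version below is an assumption-laden stand-in that uses entropy-regularized OT solved by Sinkhorn iterations, with a cosine-distance cost between table-slot embeddings and generated-token embeddings and uniform marginals. The function name, hyperparameters (`eps`, `n_iters`), and the choice of cost are all illustrative, not taken from the paper.

```python
import numpy as np

def sinkhorn_ot_loss(table_emb, text_emb, eps=0.1, n_iters=50):
    """Entropy-regularized OT cost between two embedding sets,
    computed with Sinkhorn fixed-point iterations (a sketch, not
    the paper's exact loss).

    table_emb: (n, d) array of table-slot embeddings
    text_emb:  (m, d) array of generated-token embeddings
    Returns <T, C>, where T is the transport plan and C is the
    pairwise cosine-distance cost matrix.
    """
    # Cosine-distance cost: C[i, j] = 1 - cos(table_i, text_j)
    ta = table_emb / np.linalg.norm(table_emb, axis=1, keepdims=True)
    tx = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    C = 1.0 - ta @ tx.T

    # Uniform marginals over table slots and text tokens
    a = np.full(C.shape[0], 1.0 / C.shape[0])
    b = np.full(C.shape[1], 1.0 / C.shape[1])

    K = np.exp(-C / eps)               # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):           # Sinkhorn scaling updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    T = u[:, None] * K * v[None, :]    # transport plan
    return float((T * C).sum())
```

Minimizing this quantity encourages every table slot to be "covered" by some span of the generated text (and vice versa), which is the intuition behind penalizing content that goes beyond the table.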
