Paper Title


Joint Extraction of Entity and Relation with Information Redundancy Elimination

Authors

Yuanhao Shen, Jungang Han

Abstract


To address the problems of redundant information and overlapping relations in entity and relation extraction models, we propose a joint extraction model. This model can directly extract multiple pairs of related entities without generating unrelated redundant information. We also propose a recurrent neural network named Encoder-LSTM that enhances the ability of recurrent units to model sentences. Specifically, the joint model consists of three sub-modules: the Named Entity Recognition sub-module, which combines a pre-trained language model with an LSTM decoder layer; the Entity Pair Extraction sub-module, which uses the Encoder-LSTM network to model the order relationship between related entity pairs; and the Relation Classification sub-module, which incorporates an attention mechanism. We conducted experiments on the public datasets ADE and CoNLL04 to evaluate the effectiveness of our model. The results show that the proposed model achieves good performance on the entity and relation extraction task and greatly reduces the amount of redundant information.
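The abstract only names the three sub-modules, so the following is a minimal PyTorch sketch of how such a pipeline could be wired together. It is not the paper's implementation: the pre-trained language model and the proposed Encoder-LSTM cell are stood in for by a plain embedding layer and a standard nn.LSTM, and all class names, dimensions, and hyperparameters (JointExtractionSketch, max_pairs, etc.) are assumptions for illustration only.

```python
# Hypothetical sketch of the three-sub-module joint model described in the abstract.
# The real paper uses a pre-trained language model and a custom Encoder-LSTM cell;
# both are replaced here by standard PyTorch layers as placeholders.
import torch
import torch.nn as nn


class JointExtractionSketch(nn.Module):
    def __init__(self, vocab_size=10000, num_entity_tags=9, num_relations=6,
                 emb_dim=128, hidden_dim=256, max_pairs=8):
        super().__init__()
        self.max_pairs = max_pairs
        # Stand-in for the pre-trained language model encoder.
        self.token_encoder = nn.Embedding(vocab_size, emb_dim)
        # 1) Named Entity Recognition sub-module: LSTM decoder layer + tag projection.
        self.ner_decoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.ner_tagger = nn.Linear(hidden_dim, num_entity_tags)
        # 2) Entity Pair Extraction sub-module: placeholder for the Encoder-LSTM
        #    network that models the order relationship between related entity pairs.
        self.pair_lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.pair_scorer = nn.Linear(hidden_dim, 2)  # emit another pair / stop
        # 3) Relation Classification sub-module with an attention mechanism.
        self.attention = nn.MultiheadAttention(hidden_dim, num_heads=4,
                                               batch_first=True)
        self.rel_classifier = nn.Linear(hidden_dim, num_relations)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        emb = self.token_encoder(token_ids)
        ner_states, _ = self.ner_decoder(emb)                 # (B, T, H)
        entity_tag_logits = self.ner_tagger(ner_states)       # (B, T, num_tags)
        # Summarise the sentence and unroll a fixed number of pair-extraction steps,
        # so only related pairs are emitted instead of scoring every entity combination.
        sentence_repr = ner_states.mean(dim=1, keepdim=True)  # (B, 1, H)
        pair_inputs = sentence_repr.repeat(1, self.max_pairs, 1)
        pair_states, _ = self.pair_lstm(pair_inputs)          # (B, P, H)
        pair_logits = self.pair_scorer(pair_states)           # (B, P, 2)
        # Attend from each candidate pair back over the token states
        # before classifying its relation type.
        attended, _ = self.attention(pair_states, ner_states, ner_states)
        relation_logits = self.rel_classifier(attended)       # (B, P, num_relations)
        return entity_tag_logits, pair_logits, relation_logits


if __name__ == "__main__":
    model = JointExtractionSketch()
    tokens = torch.randint(0, 10000, (2, 20))  # toy batch of two 20-token sentences
    tags, pairs, rels = model(tokens)
    print(tags.shape, pairs.shape, rels.shape)
```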
