Paper Title
From Discrimination to Generation: Knowledge Graph Completion with Generative Transformer
Paper Authors
Paper Abstract
Knowledge graph completion aims to address the problem of extending a KG with missing triples. In this paper, we present GenKGC, an approach that converts knowledge graph completion into a sequence-to-sequence generation task with a pre-trained language model. We further introduce relation-guided demonstration and entity-aware hierarchical decoding for better representation learning and faster inference. Experimental results on three datasets show that our approach obtains performance better than or comparable to baselines and achieves faster inference than previous methods based on pre-trained language models. We also release AliopenKG500, a new large-scale Chinese knowledge graph dataset, for research purposes. Code and datasets are available at https://github.com/zjunlp/PromptKG/tree/main/GenKGC.
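The following is a minimal sketch (not the authors' implementation) of the seq2seq formulation the abstract describes: a tail-prediction query (h, r, ?) is verbalized as text and the missing entity is generated by a pre-trained encoder-decoder model. The BART checkpoint, the "|"/"[SEP]" verbalization format, and the demonstration triple are assumptions for illustration; GenKGC's relation-guided demonstrations and entity-aware hierarchical decoding are only indicated in comments.

    # Sketch: knowledge graph completion as sequence-to-sequence generation,
    # using a Hugging Face BART checkpoint (assumed; not the paper's exact setup).
    from transformers import BartTokenizer, BartForConditionalGeneration

    tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
    model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

    # Verbalize a query (h, r, ?). A relation-guided demonstration would prepend
    # known triples sharing the same relation; one hypothetical example is shown.
    source = (
        "Paris | capital of | France [SEP] "  # demonstration triple (assumed format)
        "Berlin | capital of |"               # query: predict the missing tail
    )
    inputs = tokenizer(source, return_tensors="pt")

    # Generate the missing tail entity as text with beam search. GenKGC further
    # constrains decoding to valid entity names (entity-aware hierarchical
    # decoding), which is omitted in this sketch.
    output_ids = model.generate(**inputs, num_beams=5, max_length=16)
    print(tokenizer.batch_decode(output_ids, skip_special_tokens=True)[0])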