Paper Title

Time-aware Graph Neural Networks for Entity Alignment between Temporal Knowledge Graphs

Paper Authors

Chengjin Xu, Fenglong Su, Jens Lehmann

Paper Abstract

Entity alignment aims to identify equivalent entity pairs between different knowledge graphs (KGs). Recently, the availability of temporal KGs (TKGs) that contain time information created the need for reasoning over time in such TKGs. Existing embedding-based entity alignment approaches disregard time information that commonly exists in many large-scale KGs, leaving much room for improvement. In this paper, we focus on the task of aligning entity pairs between TKGs and propose a novel Time-aware Entity Alignment approach based on Graph Neural Networks (TEA-GNN). We embed entities, relations and timestamps of different KGs into a vector space and use GNNs to learn entity representations. To incorporate both relation and time information into the GNN structure of our model, we use a time-aware attention mechanism which assigns different weights to different nodes with orthogonal transformation matrices computed from embeddings of the relevant relations and timestamps in a neighborhood. Experimental results on multiple real-world TKG datasets show that our method significantly outperforms the state-of-the-art methods due to the inclusion of time information.
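To make the described mechanism more concrete, below is a minimal PyTorch sketch of a time-aware attention layer in the spirit of TEA-GNN. This is an illustrative assumption rather than the authors' released implementation: the class and argument names (TimeAwareAttentionLayer, edge_rel, edge_time, etc.) are hypothetical, and the orthogonal transformation derived from each relation and timestamp embedding is realized here as a Householder reflection.

```python
# Hedged sketch of a time-aware attention layer in the spirit of TEA-GNN.
# Names and design details below are illustrative assumptions, not the
# authors' code; the orthogonal transform is built as a Householder
# reflection from each relation / timestamp embedding.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeAwareAttentionLayer(nn.Module):
    def __init__(self, dim: int, num_relations: int, num_timestamps: int):
        super().__init__()
        self.ent_proj = nn.Linear(dim, dim, bias=False)
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.time_emb = nn.Embedding(num_timestamps, dim)
        self.att = nn.Linear(2 * dim, 1, bias=False)

    @staticmethod
    def householder(v: torch.Tensor) -> torch.Tensor:
        # Orthogonal matrix H = I - 2 v v^T (with v normalized), one per edge.
        v = F.normalize(v, dim=-1)
        eye = torch.eye(v.size(-1), device=v.device)
        return eye - 2.0 * v.unsqueeze(-1) @ v.unsqueeze(-2)

    def forward(self, x, edge_index, edge_rel, edge_time):
        # x: (N, dim) entity embeddings; edge_index: (2, E) source/target ids;
        # edge_rel, edge_time: (E,) relation / timestamp ids for each edge.
        src, dst = edge_index
        h = self.ent_proj(x)

        # Transform neighbor messages with relation- and time-specific
        # orthogonal matrices before aggregation.
        H_rel = self.householder(self.rel_emb(edge_rel))
        H_time = self.householder(self.time_emb(edge_time))
        msg = (H_time @ (H_rel @ h[src].unsqueeze(-1))).squeeze(-1)

        # Attention score per edge from the transformed neighbor and target.
        scores = F.leaky_relu(self.att(torch.cat([msg, h[dst]], dim=-1))).squeeze(-1)
        alpha = torch.exp(scores - scores.max())
        # Normalize over each target node's incoming edges (softmax per node).
        denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, alpha)
        alpha = alpha / (denom[dst] + 1e-16)

        # Weighted sum of transformed neighbor messages per target node.
        out = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * msg)
        return F.relu(out)
```

As a usage sketch, stacking two such layers over each KG and comparing the resulting entity embeddings (e.g., by cosine similarity) would give alignment scores between candidate entity pairs; the exact training objective and number of layers in the paper may differ.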
