Paper Title

T2-GNN: Graph Neural Networks for Graphs with Incomplete Features and Structure via Teacher-Student Distillation

Paper Authors

Cuiying Huo, Di Jin, Yawen Li, Dongxiao He, Yu-Bin Yang, Lingfei Wu

Paper Abstract

Graph Neural Networks (GNNs) have become a prevailing technique for tackling various analysis tasks on graph data. A key premise behind the remarkable performance of GNNs is a complete and trustworthy initial graph description (i.e., node features and graph structure), which is often not satisfied: real-world graphs are frequently incomplete due to various unavoidable factors. In particular, GNNs face greater challenges when both node features and graph structure are incomplete at the same time. Existing methods focus on either feature completion or structure completion. They usually rely on the matching relationship between features and structure, or employ joint learning of node representation and feature (or structure) completion in the hope of achieving mutual benefit. However, recent studies confirm that mutual interference between features and structure degrades GNN performance. When both features and structure are incomplete, the mismatch between them caused by the randomness of the missing entries exacerbates this interference, which may trigger incorrect completions that negatively affect node representations. To this end, in this paper we propose a general GNN framework based on teacher-student distillation, namely T2-GNN, to improve the performance of GNNs on incomplete graphs. To avoid interference between features and structure, we separately design feature-level and structure-level teacher models that provide targeted guidance to the student model (a base GNN, such as GCN) through distillation. We then design two personalized methods to obtain well-trained feature and structure teachers. To ensure that the teachers' knowledge is comprehensively and effectively distilled into the student model, we further propose a dual distillation mode that enables the student to acquire as much expert knowledge as possible.

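The abstract describes the framework only at a high level. Below is a minimal PyTorch sketch of the dual-teacher distillation idea, not the authors' implementation: the class names (`FeatureTeacher`, `StructureTeacher`, `GCN`, `dual_distill_loss`), the teacher architectures (an MLP over features, a propagated embedding table over structure), the interpretation of "dual distillation" as soft-label KL plus hidden-representation MSE, and all hyperparameters are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One dense GCN layer: A_hat @ H @ W (activation applied by the caller)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, adj_norm, h):
        return adj_norm @ self.lin(h)


class GCN(nn.Module):
    """Two-layer GCN student; returns (hidden states, logits) for distillation."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.l1 = GCNLayer(in_dim, hid_dim)
        self.l2 = GCNLayer(hid_dim, n_classes)

    def forward(self, adj_norm, x):
        h = F.relu(self.l1(adj_norm, x))
        return h, self.l2(adj_norm, h)


class FeatureTeacher(nn.Module):
    """Feature-level teacher (assumed form): an MLP that never touches the
    possibly incomplete structure."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.l1 = nn.Linear(in_dim, hid_dim)
        self.l2 = nn.Linear(hid_dim, n_classes)

    def forward(self, x):
        h = F.relu(self.l1(x))
        return h, self.l2(h)


class StructureTeacher(nn.Module):
    """Structure-level teacher (assumed form): learnable node embeddings
    propagated over A_hat, never touching the possibly incomplete features."""
    def __init__(self, n_nodes, hid_dim, n_classes):
        super().__init__()
        self.emb = nn.Embedding(n_nodes, hid_dim)
        self.out = GCNLayer(hid_dim, n_classes)

    def forward(self, adj_norm):
        h = F.relu(adj_norm @ self.emb.weight)
        return h, self.out(adj_norm, h)


def dual_distill_loss(s_hid, s_logit, t_hid, t_logit, tau=2.0):
    """Assumed 'dual' distillation: KL on temperature-softened logits
    (soft labels) plus MSE on hidden representations."""
    soft = F.kl_div(F.log_softmax(s_logit / tau, dim=-1),
                    F.softmax(t_logit / tau, dim=-1),
                    reduction="batchmean") * tau * tau
    return soft + F.mse_loss(s_hid, t_hid)


# Toy usage: random graph with incomplete features (missing entries zeroed).
n, d, hid, c = 100, 32, 16, 4
x = torch.randn(n, d) * (torch.rand(n, d) > 0.5).float()  # ~50% features missing
adj = (torch.rand(n, n) < 0.05).float()
adj = ((adj + adj.t()) > 0).float() + torch.eye(n)        # symmetrize + self-loops
deg = adj.sum(1)
adj_norm = adj / torch.outer(deg.sqrt(), deg.sqrt())      # D^-1/2 A D^-1/2
y = torch.randint(0, c, (n,))

student = GCN(d, hid, c)
f_teacher, s_teacher = FeatureTeacher(d, hid, c), StructureTeacher(n, hid, c)
# The paper pre-trains the teachers with two personalized methods; that step
# is omitted here, so the frozen teachers below are untrained stubs.
opt = torch.optim.Adam(student.parameters(), lr=1e-2)
for _ in range(10):
    h_s, z_s = student(adj_norm, x)
    with torch.no_grad():
        h_f, z_f = f_teacher(x)            # targeted guidance from features
        h_t, z_t = s_teacher(adj_norm)     # targeted guidance from structure
    loss = (F.cross_entropy(z_s, y)
            + dual_distill_loss(h_s, z_s, h_f, z_f)
            + dual_distill_loss(h_s, z_s, h_t, z_t))
    opt.zero_grad(); loss.backward(); opt.step()
```

Keeping the two teachers in separate forward passes, each seeing only features or only structure, mirrors the paper's motivation: the student receives guidance from both views without the two incomplete inputs directly interfering with each other.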