Paper Title
EdgeNets:Edge Varying Graph Neural Networks
Paper Authors
Paper Abstract
Driven by the outstanding performance of neural networks in the structured Euclidean domain, recent years have seen a surge of interest in developing neural networks for graphs and for data supported on graphs. The graph is leveraged at each layer of the neural network as a parameterization to capture detail at the node level with a reduced number of parameters and computational complexity. Following this rationale, this paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet. An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors. By extrapolating this strategy to more iterations between neighboring nodes, the EdgeNet learns edge- and neighbor-dependent weights to capture local detail. This is a general linear and local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs). By writing different GNN architectures in a common language, EdgeNets highlight the advantages and limitations of specific architectures, while providing guidelines to improve their capacity without compromising their local implementation. An interesting conclusion is the unification of GCNNs and GATs -- approaches that have so far been perceived as separate. In particular, we show that GATs are GCNNs on a graph that is learned from the features. This particularization opens the door to developing alternative attention mechanisms for improving discriminatory power.
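The edge-varying operation the abstract describes -- each node weighing each neighbor with its own learnable coefficient, applied over several local exchanges -- can be sketched as follows. This is a minimal NumPy illustration under my own assumptions (the toy graph, variable names, and filter order are invented for exposition, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4-node graph: adjacency with self-loops defines which
# node pairs may exchange information (the local support).
A = np.array([[1, 1, 0, 1],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [1, 0, 1, 1]], dtype=float)

def edge_varying_step(Phi, x, support):
    """One edge-varying shift: node i weighs neighbor j with its own
    coefficient Phi[i, j]; masking by the support keeps the operation
    local, so coefficients on non-edges have no effect."""
    return (Phi * support) @ x

def edge_varying_filter(Phis, x, support):
    """Order-K edge-varying filter: accumulate K successive local
    exchanges, each with its own edge-dependent parameter matrix.
    Special cases: Phi^(k) = h_k * S (one scalar per exchange)
    recovers a classical graph convolution; computing Phi's entries
    from node features yields attention-style weights."""
    y = np.zeros_like(x)
    z = x
    for Phi in Phis:
        z = edge_varying_step(Phi, z, support)
        y = y + z
    return y

K = 2  # illustrative filter order
Phis = [rng.standard_normal(A.shape) for _ in range(K)]
x = rng.standard_normal(4)  # one feature per node
y = edge_varying_filter(Phis, x, A)
```

Because the parameter matrices are masked by the graph support, the operation stays as local as an ordinary graph convolution while carrying a separate weight per edge and per exchange.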