Paper Title
Not too little, not too much: a theoretical analysis of graph (over)smoothing
Paper Authors
Paper Abstract
We analyze graph smoothing with \emph{mean aggregation}, where each node successively receives the average of the features of its neighbors. Indeed, it has quickly been observed that Graph Neural Networks (GNNs), which generally follow some variant of Message-Passing (MP) with repeated aggregation, may be subject to the oversmoothing phenomenon: by performing too many rounds of MP, the node features tend to converge to a non-informative limit. In the case of mean aggregation, for connected graphs, the node features become constant across the whole graph. At the other end of the spectrum, it is intuitively obvious that some MP rounds are necessary, but existing analyses do not exhibit both phenomena at once: beneficial ``finite'' smoothing and oversmoothing in the limit. In this paper, we consider simplified linear GNNs, and rigorously analyze two examples for which a finite number of mean aggregation steps provably improves the learning performance, before oversmoothing kicks in. We consider a latent space random graph model, where node features are partial observations of the latent variables and the graph contains pairwise relationships between them. We show that graph smoothing restores some of the lost information, up to a certain point, through two phenomena: graph smoothing shrinks non-principal directions in the data faster than principal ones, which is useful for regression, and shrinks nodes within communities faster than they collapse together, which improves classification.
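The mean-aggregation operator described above can be written as $W = D^{-1}A$, where $A$ is the adjacency matrix and $D$ the diagonal degree matrix. The following is a minimal sketch, not from the paper, illustrating the oversmoothing limit on a toy graph; the graph, features, and step counts are arbitrary choices for demonstration:

```python
# Minimal illustration of mean aggregation and oversmoothing (NumPy).
# Assumption: a small hand-picked connected, non-bipartite graph.
import numpy as np

# Adjacency matrix of a 4-node undirected graph (triangle 0-1-2, plus
# node 3 attached to node 2).
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# Mean-aggregation operator W = D^{-1} A: one application replaces each
# node's features by the average of its neighbors' features.
D_inv = np.diag(1.0 / A.sum(axis=1))
W = D_inv @ A

# Random 2-dimensional node features.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))

# Apply repeated rounds of mean aggregation and track how far the node
# features are from being constant across the graph.
for step in range(1, 51):
    X = W @ X
    if step in (1, 5, 50):
        spread = np.abs(X - X.mean(axis=0)).max()
        print(f"after {step:2d} steps, max deviation across nodes: {spread:.2e}")
```

On a connected, non-bipartite graph the deviation decays geometrically, so the features collapse to a constant vector: the oversmoothing limit. The paper's point is that a small, finite number of such steps can instead be beneficial before this collapse dominates.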