Paper Title
Mutual Teaching for Graph Convolutional Networks
Paper Authors
Paper Abstract
Graph convolutional networks produce good predictions for unlabeled samples due to their transductive label propagation. Since samples have different prediction confidences, we take high-confidence predictions as pseudo-labels to expand the label set, so that more samples are selected for updating the models. We propose a new training method named mutual teaching: we train dual models and let them teach each other during each batch. First, each network feeds forward all samples and selects those with high-confidence predictions. Second, each model is updated with the samples selected by its peer network. We view high-confidence predictions as useful knowledge, and one network's useful knowledge teaches the peer network through the model update in each batch. In mutual teaching, the pseudo-label set of each network comes from its peer. This new training strategy improves performance significantly. Extensive experimental results demonstrate that our method outperforms state-of-the-art methods under very low label rates.
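The two-step procedure described in the abstract (each network selects its high-confidence predictions; each model is then updated by the samples its peer selected) can be sketched as follows. This is a minimal illustration, not the paper's GCN implementation: plain linear softmax classifiers on toy 2-D blobs stand in for the dual GCNs, and the names `LinearClassifier`, `select_confident`, `mutual_teaching`, and the confidence threshold `tau` are assumptions for illustration.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with a numerical-stability shift."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class LinearClassifier:
    """Stand-in for a GCN: a linear softmax classifier (assumption for illustration)."""
    def __init__(self, n_features, n_classes, seed):
        rng = np.random.default_rng(seed)
        self.W = 0.01 * rng.standard_normal((n_features, n_classes))

    def predict_proba(self, X):
        return softmax(X @ self.W)

    def step(self, X, y, lr=0.5):
        # One gradient step on the cross-entropy loss.
        P = self.predict_proba(X)
        Y = np.eye(self.W.shape[1])[y]
        self.W -= lr * X.T @ (P - Y) / len(X)

def select_confident(model, X, tau):
    """Step 1: feed forward all samples, keep predictions with confidence >= tau."""
    P = model.predict_proba(X)
    return P.max(axis=1) >= tau, P.argmax(axis=1)

def mutual_teaching(model_a, model_b, X_lab, y_lab, X_unlab, epochs=50, tau=0.9):
    for _ in range(epochs):
        mask_a, pseudo_a = select_confident(model_a, X_unlab, tau)
        mask_b, pseudo_b = select_confident(model_b, X_unlab, tau)
        # Step 2: each model is updated with its *peer's* confident
        # pseudo-labels, alongside the original labeled set.
        model_a.step(np.vstack([X_lab, X_unlab[mask_b]]),
                     np.concatenate([y_lab, pseudo_b[mask_b]]))
        model_b.step(np.vstack([X_lab, X_unlab[mask_a]]),
                     np.concatenate([y_lab, pseudo_a[mask_a]]))

# Toy demo at a very low label rate: 2 labeled points, 100 unlabeled.
rng = np.random.default_rng(0)
X_unlab = np.vstack([rng.normal(-2.0, 0.5, size=(50, 2)),
                     rng.normal(+2.0, 0.5, size=(50, 2))])
y_true = np.array([0] * 50 + [1] * 50)
X_lab = np.array([[-2.0, -2.0], [2.0, 2.0]])
y_lab = np.array([0, 1])

model_a = LinearClassifier(2, 2, seed=1)
model_b = LinearClassifier(2, 2, seed=2)
mutual_teaching(model_a, model_b, X_lab, y_lab, X_unlab)
acc = (model_a.predict_proba(X_unlab).argmax(axis=1) == y_true).mean()
```

Early in training both models predict with near-uniform confidence, so few pseudo-labels pass the threshold and updates rely on the labeled set; as confidence grows, each model increasingly teaches its peer through the selected pseudo-labels.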