Paper Title

PTN: A Poisson Transfer Network for Semi-supervised Few-shot Learning

Authors

Huaxi Huang, Junjie Zhang, Jian Zhang, Qiang Wu, Chang Xu

Abstract

The predicament in semi-supervised few-shot learning (SSFSL) is to maximize the value of the extra unlabeled data to boost the few-shot learner. In this paper, we propose a Poisson Transfer Network (PTN) to mine the unlabeled information for SSFSL from two aspects. First, the Poisson Merriman Bence Osher (MBO) model builds a bridge for communication between labeled and unlabeled examples. This model serves as a more stable and informative classifier than traditional graph-based SSFSL methods during the label message-passing process. Second, the extra unlabeled samples are employed to transfer knowledge from base classes to novel classes through contrastive learning. Specifically, we force the augmented positive pairs to stay close while pushing the negative ones apart. Our contrastive transfer scheme implicitly learns the novel-class embeddings to alleviate the over-fitting problem on the few labeled data. Thus, we can mitigate the degeneration of embedding generality in novel classes. Extensive experiments indicate that PTN outperforms state-of-the-art few-shot and SSFSL models on the miniImageNet and tieredImageNet benchmark datasets.
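
To give a concrete picture of the first component, below is a minimal sketch of graph-based Poisson label propagation over embedded labeled and unlabeled samples. The graph construction, the Jacobi-style solver, and all parameter values (k, n_iters) are illustrative assumptions and not the authors' implementation, which additionally applies the MBO thresholding scheme on top of the Poisson solution.

```python
# A sketch of Poisson label propagation on a k-NN graph of embeddings.
# labels uses -1 for unlabeled samples; shapes and defaults are hypothetical.
import numpy as np

def knn_graph(features, k=10):
    """Build a symmetric k-NN affinity matrix from L2-normalized embeddings."""
    sim = features @ features.T                       # cosine similarities
    np.fill_diagonal(sim, -np.inf)                    # exclude self-edges
    W = np.zeros_like(sim)
    idx = np.argsort(-sim, axis=1)[:, :k]             # k nearest neighbours per node
    rows = np.repeat(np.arange(len(sim)), k)
    W[rows, idx.ravel()] = np.exp(sim[rows, idx.ravel()])
    return np.maximum(W, W.T)                         # symmetrize

def poisson_propagate(W, labels, num_classes, n_iters=200):
    """Approximately solve L U = F, with point sources F at labeled nodes."""
    n = W.shape[0]
    D = W.sum(axis=1)                                 # node degrees
    labeled = labels >= 0
    F = np.zeros((n, num_classes))
    F[labeled, labels[labeled]] = 1.0                 # one-hot sources at labeled nodes
    F[labeled] -= F[labeled].mean(axis=0, keepdims=True)  # centre so sources sum to zero
    U = np.zeros((n, num_classes))
    for _ in range(n_iters):
        U = (W @ U + F) / D[:, None]                  # Jacobi-style update of (D - W) U = F
    return U.argmax(axis=1)                           # predicted class per node
```

In an SSFSL episode, `features` would stack the embeddings of the few labeled support samples and the extra unlabeled samples, with `labels = -1` marking the unlabeled ones; the returned assignments are the propagated predictions.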
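
The second component, the contrastive transfer scheme, pulls two augmented views of the same unlabeled image together while pushing apart views of different images. A SimCLR-style NT-Xent loss is one common way to express this idea; the function below is a hedged sketch with illustrative hyperparameters, not the paper's exact objective.

```python
# A sketch of a contrastive loss over augmented positive/negative pairs.
# z1 and z2 are the embeddings of two augmented views of the same batch of
# unlabeled images; the temperature value is an assumption.
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.1):
    """z1, z2: [batch, dim] embeddings of two augmented views of the same images."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                    # [2B, dim]
    sim = z @ z.T / temperature                       # pairwise cosine similarities
    n = z.shape[0]
    sim.fill_diagonal_(float('-inf'))                 # drop self-similarity
    # The positive for sample i is its other augmented view at index (i + B) mod 2B.
    targets = torch.arange(n, device=z.device).roll(n // 2)
    return F.cross_entropy(sim, targets)              # pull positives, push negatives
```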
