Paper Title
Self-Supervised Prototypical Transfer Learning for Few-Shot Classification
Paper Authors
Paper Abstract
Most approaches in few-shot learning rely on costly annotated data related to the goal task domain during (pre-)training. Recently, unsupervised meta-learning methods have exchanged the annotation requirement for a reduction in few-shot classification performance. Simultaneously, in settings with realistic domain shift, common transfer learning has been shown to outperform supervised meta-learning. Building on these insights and on advances in self-supervised learning, we propose a transfer learning approach which constructs a metric embedding that clusters unlabeled prototypical samples and their augmentations closely together. This pre-trained embedding is a starting point for few-shot classification by summarizing class clusters and fine-tuning. We demonstrate that our self-supervised prototypical transfer learning approach ProtoTransfer outperforms state-of-the-art unsupervised meta-learning methods on few-shot tasks from the mini-ImageNet dataset. In few-shot experiments with domain shift, our approach even has comparable performance to supervised methods, but requires orders of magnitude fewer labels.
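To make the pre-training objective concrete, below is a minimal PyTorch sketch of the kind of prototypical self-supervised loss the abstract describes: each unlabeled image acts as its own one-sample prototype, and its augmented views are pulled toward it and pushed away from the other prototypes. The encoder `f`, the function name `proto_clr_loss`, and the tensor shapes are illustrative assumptions, not the authors' released code.

```python
# Sketch of a prototypical self-supervised pre-training loss (assumptions noted above).
import torch
import torch.nn.functional as F

def proto_clr_loss(f, x, x_aug):
    """f: encoder mapping images to D-dim embeddings (assumed).
    x: (N, C, H, W) unlabeled images, each treated as a 1-shot prototype.
    x_aug: (N, Q, C, H, W) Q augmented views (queries) per image."""
    N, Q = x_aug.shape[:2]
    prototypes = f(x)                                  # (N, D)
    queries = f(x_aug.flatten(0, 1))                   # (N*Q, D)
    # Logits: negative squared Euclidean distance to every prototype,
    # so a query is classified among the N prototypes by proximity.
    logits = -torch.cdist(queries, prototypes).pow(2)  # (N*Q, N)
    # Each query's positive class is the prototype of its source image.
    labels = torch.arange(N, device=logits.device).repeat_interleave(Q)
    return F.cross_entropy(logits, labels)
```

At few-shot evaluation time, per the abstract, the same embedding is reused by averaging the labeled support embeddings into class prototypes ("summarizing class clusters") and then fine-tuning.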