Paper Title
PLATINUM: Semi-Supervised Model Agnostic Meta-Learning using Submodular Mutual Information
Paper Authors
Paper Abstract
Few-shot classification (FSC) requires training models using a few (typically one to five) data points per class. Meta-learning has proven able to learn a parametrized model for FSC by training on a variety of other classification tasks. In this work, we propose PLATINUM (semi-suPervised modeL Agnostic meTa-learnIng usiNg sUbmodular Mutual information), a novel semi-supervised model-agnostic meta-learning framework that uses submodular mutual information (SMI) functions to boost the performance of FSC. PLATINUM leverages unlabeled data in the inner and outer loops via SMI functions during meta-training and obtains richer meta-learned parameterizations for meta-testing. We study the performance of PLATINUM in two scenarios: 1) the unlabeled data points belong to the same set of classes as the labeled set of a given episode, and 2) there exist out-of-distribution classes that do not belong to the labeled set. We evaluate our method in various settings on the miniImageNet, tieredImageNet, and Fewshot-CIFAR100 datasets. Our experiments show that PLATINUM outperforms MAML and semi-supervised approaches such as pseudo-labeling for semi-supervised FSC, especially when the ratio of labeled examples per class is small.
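To illustrate the kind of SMI-based selection the abstract describes, the following is a minimal sketch, not the authors' implementation: it greedily picks unlabeled points whose similarity to an episode's labeled set maximizes a simple facility-location-style mutual information objective. The cosine-similarity kernel, the `eta` weighting, and the function names are illustrative assumptions, and the random features stand in for backbone embeddings.

```python
# Minimal sketch (assumed, not the paper's code): greedy selection of unlabeled
# points with a facility-location-style submodular mutual information objective.
import numpy as np

def cosine_sim(X, Y):
    """Pairwise cosine similarity between rows of X and rows of Y."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    Yn = Y / np.linalg.norm(Y, axis=1, keepdims=True)
    return Xn @ Yn.T

def smi_value(sim_uq, selected, eta=1.0):
    """Illustrative SMI between a selected subset A of the unlabeled pool and
    the episode's labeled set Q:
        f(A; Q) = sum_q max_{a in A} s(q, a) + eta * sum_{a in A} max_q s(a, q)
    sim_uq[i, j] is the similarity between unlabeled point i and labeled point j.
    """
    if not selected:
        return 0.0
    sub = sim_uq[selected]                   # |A| x |Q| similarity block
    coverage = sub.max(axis=0).sum()         # how well A covers the labeled set
    relevance = eta * sub.max(axis=1).sum()  # how relevant each pick is to the labeled set
    return coverage + relevance

def greedy_smi_select(unlabeled_feats, labeled_feats, budget, eta=1.0):
    """Greedily pick `budget` unlabeled points with the largest SMI gain."""
    sim_uq = cosine_sim(unlabeled_feats, labeled_feats)
    selected, current = [], 0.0
    remaining = set(range(len(unlabeled_feats)))
    for _ in range(budget):
        gains = {i: smi_value(sim_uq, selected + [i], eta) - current for i in remaining}
        best = max(gains, key=gains.get)
        selected.append(best)
        remaining.remove(best)
        current += gains[best]
    return selected

# Usage example with random stand-in embeddings for one episode.
rng = np.random.default_rng(0)
unlabeled = rng.normal(size=(100, 64))  # embeddings of the unlabeled pool
labeled = rng.normal(size=(5, 64))      # embeddings of the labeled support set
print(greedy_smi_select(unlabeled, labeled, budget=5))
```

In a semi-supervised meta-learning loop, the selected indices would augment the episode's support or query set before the inner- and outer-loop updates; this sketch only shows the subset-selection step under the stated assumptions.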