Paper Title
Boosting Few-Shot Learning With Adaptive Margin Loss
Paper Authors
Paper Abstract
Few-shot learning (FSL) has attracted increasing attention in recent years but remains challenging, due to the intrinsic difficulty in learning to generalize from a few examples. This paper proposes an adaptive margin principle to improve the generalization ability of metric-based meta-learning approaches for few-shot learning problems. Specifically, we first develop a class-relevant additive margin loss, where semantic similarity between each pair of classes is considered to separate samples in the feature embedding space from similar classes. Further, we incorporate the semantic context among all classes in a sampled training task and develop a task-relevant additive margin loss to better distinguish samples from different classes. Our adaptive margin method can be easily extended to a more realistic generalized FSL setting. Extensive experiments demonstrate that the proposed method can boost the performance of current metric-based meta-learning approaches, under both the standard FSL and generalized FSL settings.
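The abstract describes an additive margin applied between classes, where the margin on a negative class grows with its semantic similarity to the true class. Below is a minimal illustrative sketch of that idea, not the paper's actual implementation: the linear margin generator (`alpha * sim + beta`), the scale factor, and all parameter names are assumptions for illustration only.

```python
import numpy as np

def adaptive_margin_loss(logits, labels, class_sims, scale=10.0, alpha=0.5, beta=0.1):
    """Cross-entropy with class-relevant additive margins (illustrative sketch).

    logits:     (N, C) similarity scores between query embeddings and class prototypes
    labels:     (N,)   ground-truth class indices
    class_sims: (C, C) semantic similarity between each pair of classes, in [0, 1]

    A margin is added to each negative class's logit; semantically similar
    classes receive larger margins, pushing their samples further apart in
    the embedding space. The linear form alpha * sim + beta is hypothetical.
    """
    n, c = logits.shape
    # Margin for sample i against class j, based on similarity to its true class.
    margins = alpha * class_sims[labels] + beta       # (N, C)
    margins[np.arange(n), labels] = 0.0               # no margin on the true class
    adjusted = scale * (logits + margins)             # enlarge negative-class logits
    # Standard softmax cross-entropy on the margin-adjusted logits.
    adjusted -= adjusted.max(axis=1, keepdims=True)   # numerical stability
    log_probs = adjusted - np.log(np.exp(adjusted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(n), labels].mean()
```

With `alpha = beta = 0` this reduces to plain scaled cross-entropy; increasing either makes confusable (semantically similar) classes contribute a larger penalty, which is the intuition behind the class-relevant margin described above.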