Paper Title

Circle Loss: A Unified Perspective of Pair Similarity Optimization

Authors

Yifan Sun, Changmao Cheng, Yuhan Zhang, Chi Zhang, Liang Zheng, Zhongdao Wang, Yichen Wei

Abstract

This paper provides a pair similarity optimization viewpoint on deep feature learning, aiming to maximize the within-class similarity $s_p$ and minimize the between-class similarity $s_n$. We find a majority of loss functions, including the triplet loss and the softmax plus cross-entropy loss, embed $s_n$ and $s_p$ into similarity pairs and seek to reduce $(s_n-s_p)$. Such an optimization manner is inflexible, because the penalty strength on every single similarity score is restricted to be equal. Our intuition is that if a similarity score deviates far from the optimum, it should be emphasized. To this end, we simply re-weight each similarity to highlight the less-optimized similarity scores. It results in a Circle loss, which is named due to its circular decision boundary. The Circle loss has a unified formula for two elemental deep feature learning approaches, i.e. learning with class-level labels and pair-wise labels. Analytically, we show that the Circle loss offers a more flexible optimization approach towards a more definite convergence target, compared with the loss functions optimizing $(s_n-s_p)$. Experimentally, we demonstrate the superiority of the Circle loss on a variety of deep feature learning tasks. On face recognition, person re-identification, as well as several fine-grained image retrieval datasets, the achieved performance is on par with the state of the art.
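For concreteness, the unified formula the abstract refers to is $\mathcal{L}_{circle} = \log\bigl[1 + \sum_j \exp(\gamma\,\alpha_n^j (s_n^j - \Delta_n)) \sum_i \exp(-\gamma\,\alpha_p^i (s_p^i - \Delta_p))\bigr]$, where the weights $\alpha_p^i = [O_p - s_p^i]_+$ and $\alpha_n^j = [s_n^j - O_n]_+$ perform the re-weighting that emphasizes less-optimized scores. Below is a minimal PyTorch sketch of this formula; the function name `circle_loss`, the 1-D tensor layout, and the default hyper-parameters ($m = 0.25$, $\gamma = 256$) are illustrative assumptions, not the authors' reference code.

```python
import torch
import torch.nn.functional as F


def circle_loss(sp: torch.Tensor, sn: torch.Tensor,
                m: float = 0.25, gamma: float = 256.0) -> torch.Tensor:
    """Unified Circle loss over within-class similarities `sp` and
    between-class similarities `sn` (1-D tensors of scores in [0, 1])."""
    # Self-paced re-weighting: a score far from its optimum
    # (O_p = 1 + m for s_p, O_n = -m for s_n) receives a larger weight,
    # which is the "emphasize the less-optimized similarity" idea from
    # the abstract. The weights are treated as constants (detached)
    # here, as is common in reference implementations.
    ap = torch.clamp_min((1 + m) - sp.detach(), 0.0)  # alpha_p = [O_p - s_p]_+
    an = torch.clamp_min(sn.detach() + m, 0.0)        # alpha_n = [s_n - O_n]_+

    # Margins Delta_p = 1 - m and Delta_n = m yield the circular
    # decision boundary s_n^2 + (s_p - 1)^2 = 2 m^2 in the (s_n, s_p) plane.
    logit_p = -gamma * ap * (sp - (1 - m))
    logit_n = gamma * an * (sn - m)

    # log(1 + [sum_j exp(logit_n_j)] * [sum_i exp(logit_p_i)])
    return F.softplus(torch.logsumexp(logit_n, dim=0)
                      + torch.logsumexp(logit_p, dim=0))
```

Under this sketch, feeding `circle_loss` the cosine similarities of one anchor against its positive and negative samples corresponds to learning with pair-wise labels, while feeding it the target versus non-target classifier similarities corresponds to learning with class-level labels, which is the unified view the abstract describes.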
