Paper Title

Ranking Info Noise Contrastive Estimation: Boosting Contrastive Learning via Ranked Positives

Authors

Hoffmann, David T., Behrmann, Nadine, Gall, Juergen, Brox, Thomas, Noroozi, Mehdi

Abstract

This paper introduces Ranking Info Noise Contrastive Estimation (RINCE), a new member in the family of InfoNCE losses that preserves a ranked ordering of positive samples. In contrast to the standard InfoNCE loss, which requires a strict binary separation of the training pairs into similar and dissimilar samples, RINCE can exploit information about a similarity ranking for learning a corresponding embedding space. We show that the proposed loss function learns favorable embeddings compared to the standard InfoNCE whenever at least noisy ranking information can be obtained or when the definition of positives and negatives is blurry. We demonstrate this for a supervised classification task with additional superclass labels and noisy similarity scores. Furthermore, we show that RINCE can also be applied to unsupervised training with experiments on unsupervised representation learning from videos. In particular, the embedding yields higher classification accuracy, retrieval rates and performs better in out-of-distribution detection than the standard InfoNCE loss.
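
To make the idea concrete, below is a minimal PyTorch sketch of a ranked InfoNCE-style objective: each positive rank level contributes an InfoNCE-like term in which that level's samples act as positives, and all lower-ranked samples plus the true negatives act as negatives. The function name `ranked_info_nce`, the per-rank temperatures, and the exact grouping of ranks are illustrative assumptions based on the abstract, not the paper's definitive RINCE formulation.

```python
# A minimal sketch of a ranked InfoNCE-style objective, written only to illustrate
# the idea described in the abstract. The rank-to-positive-set assignment, per-rank
# temperatures, and weighting below are assumptions and may differ from the paper.
import torch


def ranked_info_nce(scores: torch.Tensor, ranks: torch.Tensor, temperatures: list) -> torch.Tensor:
    """Sum of InfoNCE-style terms, one per positive rank level.

    scores:       (N,) similarity scores between an anchor and N candidate samples.
    ranks:        (N,) integers; 1 = most similar positives, larger values = less
                  similar positives, 0 = true negatives.
    temperatures: one temperature per rank level (hypothetical hyperparameters).
    """
    loss = scores.new_zeros(())
    num_levels = int(ranks.max().item())
    for level in range(1, num_levels + 1):
        tau = temperatures[level - 1]
        # Positives for this term: candidates ranked at this similarity level.
        pos = scores[ranks == level]
        # Negatives for this term: everything ranked strictly lower, plus true negatives.
        neg = scores[(ranks > level) | (ranks == 0)]
        if pos.numel() == 0:
            continue
        all_logits = torch.cat([pos, neg]) / tau
        # -log( sum_pos exp(s/tau) / sum_all exp(s/tau) ): the probability mass assigned
        # to this rank's positives against lower-ranked samples and negatives.
        loss = loss - (torch.logsumexp(pos / tau, dim=0) - torch.logsumexp(all_logits, dim=0))
    return loss
```

With a single positive rank and one temperature, this sketch reduces to a standard InfoNCE term, consistent with the abstract's description of RINCE as a member of the InfoNCE family that additionally preserves the ordering of ranked positives.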
