Paper Title

Learning with Free Object Segments for Long-Tailed Instance Segmentation

Authors

Cheng Zhang, Tai-Yu Pan, Tianle Chen, Jike Zhong, Wenjin Fu, Wei-Lun Chao

Abstract

One fundamental challenge in building an instance segmentation model for a large number of classes in complex scenes is the lack of training examples, especially for rare objects. In this paper, we explore the possibility to increase the training examples without laborious data collection and annotation. We find that an abundance of instance segments can potentially be obtained freely from object-centric images, according to two insights: (i) an object-centric image usually contains one salient object in a simple background; (ii) objects from the same class often share similar appearances or similar contrasts to the background. Motivated by these insights, we propose a simple and scalable framework FreeSeg for extracting and leveraging these "free" object foreground segments to facilitate model training in long-tailed instance segmentation. Concretely, we investigate the similarity among object-centric images of the same class to propose candidate segments of foreground instances, followed by a novel ranking of segment quality. The resulting high-quality object segments can then be used to augment the existing long-tailed datasets, e.g., by copying and pasting the segments onto the original training images. Extensive experiments show that FreeSeg yields substantial improvements on top of strong baselines and achieves state-of-the-art accuracy for segmenting rare object categories.
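The copy-and-paste augmentation described above can be illustrated with a minimal sketch. This is not the authors' implementation: `paste_segment`, its arguments, and the direct pixel-overwrite blending are hypothetical simplifications of the general idea of compositing an extracted foreground segment onto an existing training image.

```python
import numpy as np

def paste_segment(image, segment, mask, top_left):
    """Composite a foreground object segment onto a training image.

    A hypothetical, simplified version of copy-paste augmentation:
      image:    (H, W, 3) uint8 background training image
      segment:  (h, w, 3) uint8 cropped object pixels
      mask:     (h, w) bool foreground mask for the segment
      top_left: (row, col) paste position; the segment is clipped
                at the image border
    Returns a new image; the input image is left unmodified.
    """
    out = image.copy()
    H, W = image.shape[:2]
    h, w = mask.shape
    r0, c0 = top_left
    r1, c1 = min(r0 + h, H), min(c0 + w, W)
    if r1 <= r0 or c1 <= c0:
        return out  # paste position falls entirely outside the image

    # Overwrite only the foreground pixels inside the clipped region.
    sub_mask = mask[: r1 - r0, : c1 - c0]
    region = out[r0:r1, c0:c1]  # view into `out`
    region[sub_mask] = segment[: r1 - r0, : c1 - c0][sub_mask]
    return out
```

In practice the paper also ranks candidate segments by quality before pasting; this sketch only shows the final compositing step, and real pipelines would additionally randomize scale, position, and the corresponding instance annotations.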
