Paper Title

Revisiting Few-shot Activity Detection with Class Similarity Control

Paper Authors

Huijuan Xu, Ximeng Sun, Eric Tzeng, Abir Das, Kate Saenko, Trevor Darrell

Paper Abstract

Many interesting events in the real world are rare, making pre-annotated, machine-learning-ready videos a rarity as a consequence. Thus, temporal activity detection models that are able to learn from a few examples are desirable. In this paper, we present a conceptually simple, general, yet novel framework for few-shot temporal activity detection based on proposal regression, which detects the start and end times of activities in untrimmed videos. Our model is end-to-end trainable, takes into account the frame rate differences between few-shot activities and untrimmed test videos, and can benefit from additional few-shot examples. We experiment on three large-scale benchmarks for temporal activity detection (the ActivityNet1.2, ActivityNet1.3, and THUMOS14 datasets) in a few-shot setting. We also study the effect on performance of different amounts of overlap with the activities used to pretrain the video classification backbone and propose corrective measures for future work in this domain. Our code will be made available.
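To make the proposal-regression idea in the abstract concrete, below is a minimal illustrative sketch (not the authors' released code): it assumes proposal features pooled from an untrimmed video are scored against class prototypes averaged from the few-shot support clips, while a small head regresses start/end offsets per proposal. The class FewShotProposalRegressor, the feature dimension, and the toy tensors are all hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FewShotProposalRegressor(nn.Module):
    """Sketch: match proposals to few-shot class prototypes and
    regress temporal (start, end) offsets for each proposal."""

    def __init__(self, feat_dim=512):
        super().__init__()
        # Predicts (start_offset, end_offset) relative to each proposal window.
        self.boundary_head = nn.Linear(feat_dim, 2)

    def forward(self, proposal_feats, support_feats):
        # proposal_feats: (num_proposals, feat_dim), pooled from the untrimmed video
        # support_feats:  (num_classes, num_shots, feat_dim), from few-shot example clips
        prototypes = support_feats.mean(dim=1)  # average shots -> (num_classes, feat_dim)
        # Cosine similarity between every proposal and every class prototype.
        sim = F.normalize(proposal_feats, dim=-1) @ F.normalize(prototypes, dim=-1).T
        offsets = self.boundary_head(proposal_feats)  # (num_proposals, 2)
        return sim, offsets

# Toy usage with random features (hypothetical shapes): 100 proposals, 5-way 1-shot.
model = FewShotProposalRegressor(feat_dim=512)
proposals = torch.randn(100, 512)
support = torch.randn(5, 1, 512)
scores, offsets = model(proposals, support)
print(scores.shape, offsets.shape)  # torch.Size([100, 5]) torch.Size([100, 2])

This sketch only illustrates the two outputs the abstract describes (class similarity scores and regressed boundaries); the paper's actual architecture, frame-rate handling, and training losses are not reproduced here.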
