Paper Title

Enhancing Local Feature Learning for 3D Point Cloud Processing using Unary-Pairwise Attention

Authors

Xiu, Haoyi; Liu, Xin; Wang, Weimin; Kim, Kyoung-Sook; Shinohara, Takayuki; Chang, Qiong; Matsuoka, Masashi

Abstract

We present a simple but effective attention mechanism, named unary-pairwise attention (UPA), for modeling relationships within 3D point clouds. Our idea is motivated by the analysis that standard self-attention (SA), which operates globally, tends to produce almost identical attention maps for different query positions, revealing the difficulty of jointly learning query-independent and query-dependent information. Therefore, we reformulate SA and propose query-independent (unary) and query-dependent (pairwise) components to facilitate the learning of both terms. In contrast to SA, UPA ensures query dependence by operating locally. Extensive experiments show that UPA consistently outperforms SA on various point cloud understanding tasks, including shape classification, part segmentation, and scene segmentation. Moreover, the popular PointNet++ method, simply equipped with UPA, outperforms or is on par with state-of-the-art attention-based approaches. In addition, UPA systematically boosts the performance of both standard and modern networks when integrated into them as a compositional module.
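To make the decomposition concrete, the following is a minimal numpy sketch of a UPA-style layer as the abstract describes it: attention is restricted to each query point's k-nearest neighbors (the local operation that ensures query dependence), and two sets of attention logits are computed, a unary branch that depends only on the neighbor (key) features and a pairwise branch that depends on the query-key pair. The exact projections, scoring functions, and how the two branches are combined are assumptions for illustration (the random weights stand in for learned parameters), not the paper's precise formulation.

```python
import numpy as np

def knn_indices(points, k):
    """Indices of the k nearest neighbors of each point (brute force)."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    return np.argsort(d2, axis=1)[:, :k]

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def unary_pairwise_attention(feats, points, k=8, rng=None):
    """UPA-style sketch: local attention with a query-independent (unary)
    branch and a query-dependent (pairwise) branch, normalized separately
    and summed. Weights are random placeholders for learned projections."""
    rng = np.random.default_rng(0) if rng is None else rng
    n, c = feats.shape
    wq, wk, wv = (rng.standard_normal((c, c)) / np.sqrt(c) for _ in range(3))
    wu = rng.standard_normal((c, 1)) / np.sqrt(c)   # unary scoring vector

    q, kf, v = feats @ wq, feats @ wk, feats @ wv
    idx = knn_indices(points, k)                    # (n, k) local neighborhoods
    kn, vn = kf[idx], v[idx]                        # (n, k, c) neighbor keys/values

    # pairwise logits depend on the query; unary logits depend only on the keys
    pairwise = np.einsum('nc,nkc->nk', q, kn) / np.sqrt(c)
    unary = (kn @ wu).squeeze(-1)

    out_pair = (softmax(pairwise)[..., None] * vn).sum(1)
    out_unary = (softmax(unary)[..., None] * vn).sum(1)
    return out_pair + out_unary
```

Because both branches attend only within a k-neighborhood, even the unary (query-independent) scores produce different aggregations per query point, which is the property the abstract contrasts against global SA.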
