Paper Title

3D Localization of a Sound Source Using Mobile Microphone Arrays Referenced by SLAM

Paper Authors

Simon Michaud, Samuel Faucher, François Grondin, Jean-Samuel Lauzon, Mathieu Labbé, Dominic Létourneau, François Ferland, François Michaud

Paper Abstract

A microphone array can provide a mobile robot with the capability of localizing, tracking and separating distant sound sources in 2D, i.e., estimating their relative elevation and azimuth. To combine acoustic data with visual information in real world settings, spatial correlation must be established. The approach explored in this paper consists of having two robots, each equipped with a microphone array, localize themselves in a shared reference map using SLAM. Based on their locations, data from the microphone arrays are used to triangulate in 3D the location of a sound source in relation to the same map. This strategy results in a novel cooperative sound mapping approach using mobile microphone arrays. Trials are conducted using two mobile robots localizing a static or a moving sound source to examine under which conditions this is possible. Results suggest that errors under 0.3 m are observed when the relative angle between the two robots is above 30 degrees for a static sound source, while errors under 0.3 m are observed for angles between 40 degrees and 140 degrees with a moving sound source.
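The core geometric step described in the abstract, triangulating a 3D source position from two direction-of-arrival estimates anchored at SLAM-referenced robot poses, can be illustrated with a minimal sketch. This is an assumption-based illustration, not the paper's implementation: the function names, the closest-point-between-rays formulation, and the numeric example are hypothetical and chosen only to show the idea.

```python
import numpy as np

def doa_to_unit_vector(azimuth, elevation):
    """Convert a direction of arrival (radians, expressed in the shared map frame)
    into a 3D unit vector pointing from the robot toward the sound source."""
    return np.array([
        np.cos(elevation) * np.cos(azimuth),
        np.cos(elevation) * np.sin(azimuth),
        np.sin(elevation),
    ])

def triangulate_source(p1, d1, p2, d2, eps=1e-6):
    """Estimate the source position as the midpoint of the shortest segment
    between the two bearing rays (p1 + t1*d1) and (p2 + t2*d2).

    p1, p2: robot/array positions in the shared SLAM map frame (hypothetical inputs).
    d1, d2: unit direction vectors toward the source.
    Returns None when the rays are near-parallel, i.e., the relative angle
    between the robots is too small for a reliable triangulation.
    """
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b          # approaches 0 as the rays become parallel
    if denom < eps:
        return None
    t1 = (b * e - c * d) / denom   # parameter of the closest point on ray 1
    t2 = (a * e - b * d) / denom   # parameter of the closest point on ray 2
    q1 = p1 + t1 * d1
    q2 = p2 + t2 * d2
    return 0.5 * (q1 + q2)

# Illustrative example: two robots 2 m apart, each hearing the source
# up and between them (all values made up for demonstration).
p1, p2 = np.array([0.0, 0.0, 0.3]), np.array([2.0, 0.0, 0.3])
d1 = doa_to_unit_vector(np.deg2rad(45.0), np.deg2rad(20.0))
d2 = doa_to_unit_vector(np.deg2rad(135.0), np.deg2rad(20.0))
print(triangulate_source(p1, d1, p2, d2))  # roughly [1.0, 1.0, 0.82]
```

Taking the midpoint of the shortest segment degrades gracefully when the two rays do not intersect exactly, which is the usual case with noisy azimuth and elevation estimates; the near-parallel check mirrors the abstract's observation that accuracy drops when the relative angle between the two robots is small.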
