Paper Title


Bio-inspired Gait Imitation of Hexapod Robot Using Event-Based Vision Sensor and Spiking Neural Network

Authors

Justin Ting, Yan Fang, Ashwin Sanjay Lele, Arijit Raychowdhury

Abstract


Learning how to walk is a sophisticated neurological task for most animals. In order to walk, the brain must synthesize multiple cortices, neural circuits, and diverse sensory inputs. Some animals, like humans, imitate surrounding individuals to speed up their learning. When humans watch their peers, visual data is processed through a visual cortex in the brain. This complex problem of imitation-based learning forms associations between visual data and muscle actuation through Central Pattern Generation (CPG). Reproducing this imitation phenomenon on low power, energy-constrained robots that are learning to walk remains challenging and unexplored. We propose a bio-inspired feed-forward approach based on neuromorphic computing and event-based vision to address the gait imitation problem. The proposed method trains a "student" hexapod to walk by watching an "expert" hexapod moving its legs. The student processes the flow of Dynamic Vision Sensor (DVS) data with a one-layer Spiking Neural Network (SNN). The SNN of the student successfully imitates the expert within a small convergence time of ten iterations and exhibits energy efficiency at the sub-microjoule level.
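The abstract describes the student hexapod passing a stream of DVS events through a one-layer spiking neural network. The sketch below is a minimal illustration of that idea, not the paper's implementation: a single layer of leaky integrate-and-fire (LIF) neurons integrating binned DVS-style event frames. All names, sizes, and constants (time constant, threshold, six outputs standing in for leg channels) are assumptions for illustration.

```python
import numpy as np

def lif_layer(event_frames, weights, tau=20.0, v_thresh=1.0, dt=1.0):
    """Run a single leaky integrate-and-fire layer over binned event frames.

    event_frames: (T, n_in) array of per-timestep event counts per pixel.
    weights:      (n_in, n_out) synaptic weight matrix.
    Returns a (T, n_out) array of output spikes (0/1).
    """
    n_out = weights.shape[1]
    v = np.zeros(n_out)                    # membrane potentials
    out_spikes = np.zeros((len(event_frames), n_out))
    decay = np.exp(-dt / tau)              # leak factor per timestep
    for t, frame in enumerate(event_frames):
        v = v * decay + frame @ weights    # leak, then integrate input current
        fired = v >= v_thresh
        out_spikes[t, fired] = 1.0
        v[fired] = 0.0                     # reset neurons that spiked
    return out_spikes

# Toy input: 100 timesteps of sparse events over 64 "pixels",
# mapped to 6 output neurons (e.g., one per leg of a hexapod).
rng = np.random.default_rng(0)
frames = rng.poisson(0.1, size=(100, 64))
w = rng.normal(0.0, 0.5, size=(64, 6))
spikes = lif_layer(frames, w)
print(spikes.shape)  # (100, 6)
```

In an imitation setting like the one described, the weights of such a layer would be trained so that observed leg motion in the event stream drives the matching actuation outputs; the single-layer, event-driven structure is what keeps the energy cost low.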
