Paper Title

Time Perception: A Review on Psychological, Computational and Robotic Models

Authors

Hamit Basgol, Inci Ayhan, Emre Ugur

Abstract

Animals exploit time to survive in the world. Temporal information is required for higher-level cognitive abilities such as planning, decision making, communication, and effective cooperation. Since time is an inseparable part of cognition, there is growing interest in artificial intelligence approaches to subjective time, which have the potential to advance the field. The current survey aims to provide researchers with an interdisciplinary perspective on time perception. First, we introduce a brief background from the psychology and neuroscience literature, covering the characteristics and models of time perception and related abilities. Second, we summarize emerging computational and robotic models of time perception. A general overview of the literature reveals that a substantial number of timing models are based on dedicated time processing, such as the emergence of a clock-like mechanism from neural network dynamics, and that a relationship exists between embodiment and time perception. We also note that most timing models are developed for either sensory timing (i.e., the ability to assess an interval) or motor timing (i.e., the ability to reproduce an interval). Timing models capable of retrospective timing, the ability to track time without paying attention to it, remain scarce. In this light, we discuss possible research directions to promote interdisciplinary collaboration in the field of time perception.
