Paper Title
RoboMem: Giving Long Term Memory to Robots
Paper Authors
Paper Abstract
Robots have the potential to improve health monitoring outcomes for the elderly by providing doctors and caregivers with information about the person's behavior, health activities, and surrounding environment. Over the years, little work has been done to enable robots to preserve information over longer periods of time, on the order of months and years of data, and to use this contextual information to answer queries. The time complexity of processing this massive sensor data in a timely fashion, the inability to anticipate future queries in advance, and the imprecision involved in the results have been the main impediments to progress in this area. We contribute RoboMem, a query-answering system for long-term healthcare assistance of the elderly over continuous data feeds, which aims to overcome the challenges of giving robots long-term memory. Our framework preprocesses the sensor data and stores this preprocessed data in a database. The stored data is then updated through successive refinements, improving the accuracy of responses to queries. If the data in the database is insufficient to answer a query, a small set of relevant frames (also obtained from the database) is reprocessed to obtain the answer. Our initial prototype of RoboMem stores 3.5 MB of data in the database, compared to 535.8 MB of raw video frames, and even with this minimal data it can fetch the information needed to respond to queries in 0.0002 seconds on average.
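The abstract outlines a store-then-refine query loop: preprocessed observations are kept in a database, refined over time, and only a small set of relevant frames is reprocessed when the stored data cannot answer a query confidently. The sketch below is a minimal, hypothetical illustration of that loop in Python; the schema, the preprocess_frame / reprocess_frames helpers, and the confidence threshold are assumptions for illustration and not taken from RoboMem's actual implementation.

```python
# Minimal sketch of a store-then-refine query loop, assuming a simple
# relational store. All names and values here are illustrative only.
import sqlite3
from typing import Optional

def preprocess_frame(frame_id: int, raw_frame: bytes) -> dict:
    """Stand-in for lightweight perception: summarize a raw frame."""
    # A real system would run detectors/trackers; we fabricate a record
    # so the sketch stays self-contained and runnable.
    return {"frame_id": frame_id, "activity": "walking", "confidence": 0.6}

def reprocess_frames(frame_ids: list[int]) -> dict:
    """Stand-in for heavier re-analysis of a few relevant frames."""
    return {"activity": "walking with cane", "confidence": 0.9}

class MemoryStore:
    def __init__(self) -> None:
        self.db = sqlite3.connect(":memory:")
        self.db.execute(
            "CREATE TABLE observations ("
            " frame_id INTEGER PRIMARY KEY, activity TEXT, confidence REAL)"
        )

    def ingest(self, frame_id: int, raw_frame: bytes) -> None:
        # Store only the compact, preprocessed record, not the raw frame.
        rec = preprocess_frame(frame_id, raw_frame)
        self.db.execute(
            "INSERT OR REPLACE INTO observations VALUES (?, ?, ?)",
            (rec["frame_id"], rec["activity"], rec["confidence"]),
        )

    def refine(self, frame_id: int, activity: str, confidence: float) -> None:
        # Successive refinement: replace an observation with a better one.
        self.db.execute(
            "UPDATE observations SET activity = ?, confidence = ?"
            " WHERE frame_id = ? AND confidence < ?",
            (activity, confidence, frame_id, confidence),
        )

    def answer(self, min_confidence: float = 0.8) -> Optional[str]:
        # Try to answer from stored data; fall back to reprocessing.
        row = self.db.execute(
            "SELECT frame_id, activity, confidence FROM observations"
            " ORDER BY confidence DESC LIMIT 1"
        ).fetchone()
        if row is None:
            return None
        frame_id, activity, confidence = row
        if confidence >= min_confidence:
            return activity
        # Stored data is not confident enough: reprocess the relevant
        # frame(s) and write the refined result back into the database.
        refined = reprocess_frames([frame_id])
        self.refine(frame_id, refined["activity"], refined["confidence"])
        return refined["activity"]

if __name__ == "__main__":
    store = MemoryStore()
    store.ingest(1, b"raw-frame-bytes")
    # First answer triggers reprocessing; later queries hit the refined record.
    print(store.answer())
```

The key design point the sketch tries to capture is that the expensive perception step runs only on demand and its result is written back, so repeated queries are served from the compact database rather than from raw video.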