Paper Title

Bilevel Continual Learning

Authors

Quang Pham, Doyen Sahoo, Chenghao Liu, Steven C. H. Hoi

Abstract

Continual learning aims to learn continuously from a stream of tasks and data in an online-learning fashion, exploiting what was learned previously to improve current and future tasks while still performing well on previous tasks. One common limitation of many existing continual learning methods is that, due to the nature of continual learning, they often train a model directly on all available training data without validation, and thus suffer from poor generalization at test time. In this work, we present a novel continual learning framework named "Bilevel Continual Learning" (BCL), which unifies a bilevel optimization objective and a dual memory management strategy comprising both episodic memory and generalization memory, to achieve effective knowledge transfer to future tasks and alleviate catastrophic forgetting on old tasks simultaneously. Our extensive experiments on continual learning benchmarks demonstrate the efficacy of the proposed BCL compared to many state-of-the-art methods. Our implementation is available at https://github.com/phquang/bilevel-continual-learning.
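
The abstract does not spell out the optimization details, so the following is only a minimal PyTorch sketch of the bilevel idea it describes, assuming a first-order approximation: an inner loop adapts the model on the current batch plus an episodic memory, and an outer loop updates the original weights using the adapted model's loss on a held-out generalization memory. All names here (bilevel_step, episodic_mem, gen_mem, the learning rates and step counts) are illustrative assumptions, not the authors' API; see the linked repository for the actual implementation.

```python
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


def bilevel_step(model, task_batch, episodic_mem, gen_mem,
                 inner_lr=0.01, outer_lr=0.001, inner_steps=3):
    """One hypothetical bilevel update (first-order approximation)."""
    # Inner loop: adapt a copy of the model on the current task data
    # plus samples replayed from the episodic memory.
    fast = copy.deepcopy(model)
    inner_opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
    x, y = task_batch
    ex, ey = episodic_mem
    for _ in range(inner_steps):
        inner_opt.zero_grad()
        loss = F.cross_entropy(fast(torch.cat([x, ex])), torch.cat([y, ey]))
        loss.backward()
        inner_opt.step()

    # Outer loop: the held-out generalization memory drives the slow
    # update; the gradient of the adapted model's loss is applied to
    # the original weights (first-order, no second derivatives).
    gx, gy = gen_mem
    outer_loss = F.cross_entropy(fast(gx), gy)
    grads = torch.autograd.grad(outer_loss, list(fast.parameters()))
    with torch.no_grad():
        for p, g in zip(model.parameters(), grads):
            p -= outer_lr * g
    return outer_loss.item()


if __name__ == "__main__":
    # Toy usage with random data: 20-dim inputs, 5 classes.
    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 5))
    batch = (torch.randn(32, 20), torch.randint(0, 5, (32,)))
    episodic = (torch.randn(16, 20), torch.randint(0, 5, (16,)))
    generalization = (torch.randn(16, 20), torch.randint(0, 5, (16,)))
    print(bilevel_step(model, batch, episodic, generalization))
```

The point of the split is that the inner loop never sees the generalization memory, so the outer update plays the role of validation-driven training, which is how the abstract's concern about training "without validation" would be addressed.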
