Paper Title

R-DFCIL: Relation-Guided Representation Learning for Data-Free Class Incremental Learning

Paper Authors

Qiankun Gao, Chen Zhao, Bernard Ghanem, Jian Zhang

Paper Abstract

Class-Incremental Learning (CIL) struggles with catastrophic forgetting when learning new knowledge, and Data-Free CIL (DFCIL) is even more challenging without access to the training data of previously learned classes. Though recent DFCIL works introduce techniques such as model inversion to synthesize data for previous classes, they fail to overcome forgetting due to the severe domain gap between the synthetic and real data. To address this issue, this paper proposes relation-guided representation learning (RRL) for DFCIL, dubbed R-DFCIL. In RRL, we introduce relational knowledge distillation to flexibly transfer the structural relations of new data from the old model to the current model. Our RRL-boosted DFCIL can guide the current model to learn representations of new classes that are more compatible with the representations of previous classes, which greatly reduces forgetting while improving plasticity. To avoid mutual interference between representation and classifier learning, we employ a local rather than a global classification loss during RRL. After RRL, the classification head is refined with a global class-balanced classification loss to address the data imbalance issue and learn the decision boundaries between new and previous classes. Extensive experiments on CIFAR100, Tiny-ImageNet200, and ImageNet100 demonstrate that our R-DFCIL significantly surpasses previous approaches and achieves new state-of-the-art performance for DFCIL. Code is available at https://github.com/jianzhangcs/R-DFCIL.
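To make the core idea in the abstract concrete, below is a minimal sketch of a distance-wise relational knowledge distillation loss on new-class data, written in PyTorch. It follows the standard distance-based RKD formulation from the knowledge-distillation literature rather than the paper's exact loss; the names old_model, new_model, local_ce_loss, and lambda_rkd are illustrative placeholders, and the actual R-DFCIL objective and weighting should be taken from the linked code.

```python
# Illustrative sketch only (not the paper's exact formulation): the pairwise-distance
# structure of the old model's features on new-class data is transferred to the
# current model, so new-class representations stay compatible with previous ones.
import torch
import torch.nn.functional as F


def pairwise_distances(feats: torch.Tensor) -> torch.Tensor:
    """L2 distances between all pairs of feature vectors, normalized by their mean."""
    d = torch.cdist(feats, feats, p=2)
    mean_d = d[d > 0].mean()
    return d / (mean_d + 1e-8)


def rkd_distance_loss(student_feats: torch.Tensor, teacher_feats: torch.Tensor) -> torch.Tensor:
    """Match the relational (pairwise-distance) structure of the student to the teacher."""
    with torch.no_grad():
        t = pairwise_distances(teacher_feats)  # old (frozen) model's relations
    s = pairwise_distances(student_feats)      # current model's relations
    return F.smooth_l1_loss(s, t)


# Hypothetical usage on a batch of new-class images x:
# feats_old = old_model.backbone(x)   # frozen model from the previous task
# feats_new = new_model.backbone(x)   # current model being trained
# loss = local_ce_loss + lambda_rkd * rkd_distance_loss(feats_new, feats_old)
```

Consistent with the abstract, the classification term combined with this distillation loss would be a local one (computed only over the new classes) during representation learning, with the class-balanced global classification loss applied afterwards to refine the classification head.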
