Paper Title

NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search

Authors

Zhichao Lu, Kalyanmoy Deb, Erik Goodman, Wolfgang Banzhaf, Vishnu Naresh Boddeti

Abstract

In this paper, we propose an efficient NAS algorithm for generating task-specific models that are competitive under multiple competing objectives. It comprises two surrogates, one at the architecture level to improve sample efficiency and one at the weights level, through a supernet, to improve gradient descent training efficiency. On standard benchmark datasets (C10, C100, ImageNet), the resulting models, dubbed NSGANetV2, either match or outperform models from existing approaches, with the search being orders of magnitude more sample efficient. Furthermore, we demonstrate the effectiveness and versatility of the proposed method on six diverse non-standard datasets, e.g., STL-10, Flowers102, Oxford Pets, and FGVC Aircraft. In all cases, NSGANetV2s improve the state-of-the-art (in the mobile setting), suggesting that NAS can be a viable alternative to conventional transfer learning approaches in handling diverse scenarios such as small-scale or fine-grained datasets. Code is available at https://github.com/mikelzc1990/nsganetv2.
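The abstract's architecture-level surrogate can be illustrated with a toy sketch: evaluate a small pool of architectures the expensive way, fit a cheap predictor on them, then use the predictor to screen a large candidate set and keep the non-dominated trade-off front over (predicted error, model cost). Everything below (the width-vector encoding, the FLOPs proxy, the least-squares predictor) is a hypothetical stand-in for illustration, not the NSGANetV2 implementation.

```python
import random

random.seed(0)

# Hypothetical architecture encoding: a tuple of layer widths.
def sample_arch(n_layers=4):
    return tuple(random.choice([16, 32, 64]) for _ in range(n_layers))

def flops_proxy(arch):
    # Toy stand-in for the model-size/FLOPs objective.
    return sum(arch)

def measure_accuracy(arch):
    # Stand-in for expensive (e.g., supernet-based) evaluation; costly in
    # practice, so the search only calls it on a small pool.
    return 0.6 + 0.004 * sum(arch) / len(arch) + random.gauss(0, 0.01)

# Architecture-level surrogate: 1-D least-squares fit mapping a cheap
# feature (mean layer width) to measured accuracy.
def fit_surrogate(archs, accs):
    xs = [sum(a) / len(a) for a in archs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(accs) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, accs))
    var = sum((x - mx) ** 2 for x in xs) or 1.0
    slope = cov / var
    return lambda a: my + slope * (sum(a) / len(a) - mx)

def non_dominated(objs):
    # Indices of points not dominated in (minimize, minimize) space.
    def dominates(q, p):
        return all(qi <= pi for qi, pi in zip(q, p)) and q != p
    return [i for i, p in enumerate(objs)
            if not any(dominates(q, p) for q in objs)]

# 1) Evaluate a small pool the expensive way to train the surrogate.
pool = [sample_arch() for _ in range(8)]
predictor = fit_surrogate(pool, [measure_accuracy(a) for a in pool])

# 2) Screen many candidates cheaply with the surrogate, then keep the
#    predicted trade-off front over (error, FLOPs proxy).
candidates = list({sample_arch() for _ in range(200)})
objs = [(1.0 - predictor(a), flops_proxy(a)) for a in candidates]
front = [candidates[i] for i in non_dominated(objs)]
print(len(front), "architectures on the predicted trade-off front")
```

In the paper's setting, the front returned by the surrogate would be the small set of architectures actually sent for (more) expensive evaluation, which is where the sample-efficiency gain comes from.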
