Paper Title

Towards Self-supervised and Weight-preserving Neural Architecture Search

Paper Authors

Zhuowei Li, Yibo Gao, Zhenzhou Zha, Zhiqiang Hu, Qing Xia, Shaoting Zhang, Dimitris N. Metaxas

Paper Abstract

Neural architecture search (NAS) algorithms save human experts tremendous labor. Recent advances further reduce the computational overhead to an affordable level. However, deploying NAS techniques in real-world applications remains cumbersome due to the fussy procedures and the supervised learning paradigm. In this work, we propose self-supervised and weight-preserving neural architecture search (SSWP-NAS) as an extension of the current NAS framework that allows self-supervision and retains the concomitant weights discovered during the search stage. As such, we simplify the NAS workflow to a one-stage, proxy-free procedure. Experiments show that the architectures searched by the proposed framework achieve state-of-the-art accuracy on the CIFAR-10, CIFAR-100, and ImageNet datasets without using manual labels. Moreover, we show that using the concomitant weights as initialization consistently outperforms both random initialization and the two-stage weight pre-training method by a clear margin under semi-supervised learning scenarios. Code is publicly available at https://github.com/LzVv123456/SSWP-NAS.
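
The abstract compresses the method into two ideas: a one-stage, proxy-free search driven by a self-supervised objective on unlabeled data, and reuse of the weights produced during that search (the "concomitant weights") as downstream initialization. Below is a minimal PyTorch sketch of that flow, assuming a DARTS-style mixed-op supernet; `SuperNet`, `self_supervised_loss`, and the pretext task are illustrative assumptions, not the authors' implementation (see the linked repository for the actual code).

```python
import torch
import torch.nn as nn

class SuperNet(nn.Module):
    """Toy over-parameterized search space: two candidate ops mixed by
    learnable architecture parameters (a DARTS-style assumption)."""
    def __init__(self, channels=16):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
        ])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = torch.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

def self_supervised_loss(model, x):
    # Stand-in pretext task: match features of a noisy view to the clean
    # view. The paper's actual self-supervision objective may differ.
    noisy = x + 0.1 * torch.randn_like(x)
    return nn.functional.mse_loss(model(noisy), model(x).detach())

# One-stage, proxy-free search on unlabeled data: architecture parameters
# and operation weights are trained together here for brevity (the real
# method may optimize them separately), so the weights found during the
# search (the "concomitant weights") already exist when the search ends.
model = SuperNet()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
unlabeled = [torch.randn(4, 16, 32, 32) for _ in range(10)]
for x in unlabeled:
    opt.zero_grad()
    self_supervised_loss(model, x).backward()
    opt.step()

# "Weight-preserving": keep the strongest op together with its searched
# weights as downstream initialization, instead of re-initializing it.
best_op = model.ops[int(model.alpha.argmax())]
```

The distinguishing step is the last line: in a conventional NAS pipeline the derived architecture would be retrained from scratch, whereas here the searched weights double as the initialization that the abstract reports outperforming random initialization and two-stage pre-training.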
