


SCARLET-NAS: Bridging the gap between Stability and Scalability in Weight-sharing Neural Architecture Search

Discovering powerful yet compact models is an important goal of neural architecture search. Previous two-stage one-shot approaches are limited by a search space with a fixed depth. It seems handy to include an additional skip connection in the search space to make depths variable. However, this creates a large range of perturbation during supernet training, making it difficult to produce a confident ranking of subnetworks. In this paper, we discover that skip connections bring about significant feature inconsistency compared with other operations, which potentially degrades the supernet performance. Based on this observation, we tackle the problem by imposing an equivariant learnable stabilizer to homogenize such disparities. Experiments show that our proposed stabilizer improves both the supernet's convergence and its ranking performance. With an evolutionary search backend that incorporates the stabilized supernet as an evaluator, we derive a family of state-of-the-art architectures, the SCARLET series of several depths; in particular, SCARLET-A achieves 76.9% top-1 accuracy on ImageNet. The models and evaluation code are released at https://github.com/xiaomi-automl/ScarletNAS.
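To illustrate the core idea, here is a minimal NumPy sketch of replacing a bare identity skip with a learnable linear map initialized to the identity. This is an assumption-laden toy (the class name `LinearStabilizer`, the 2-D feature shape, and the stand-in ops are all hypothetical), not the paper's actual implementation, which operates on convolutional feature maps:

```python
import numpy as np

class LinearStabilizer:
    """Hypothetical sketch of a learnable stabilizer: instead of a raw
    identity skip, features pass through a learnable linear map that is
    initialized to the identity, so the skip path can adapt its output
    statistics to match those of the other candidate operations."""
    def __init__(self, channels):
        # Identity initialization: at the start of supernet training the
        # stabilized skip behaves exactly like a plain skip connection.
        self.weight = np.eye(channels)

    def __call__(self, x):
        # x: (batch, channels). During training, self.weight would be
        # updated by gradient descent like any other supernet parameter.
        return x @ self.weight

def choice_block(x, ops, index):
    # A one-shot choice block activates a single candidate op per pass.
    return ops[index](x)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
conv_like = lambda z: np.maximum(z, 0.0)  # stand-in for a real conv op
stabilized_skip = LinearStabilizer(8)     # replaces the bare identity

# At initialization the stabilized skip is exactly the identity map.
assert np.allclose(choice_block(x, [conv_like, stabilized_skip], 1), x)
```

Because the stabilizer starts as the identity and is free to learn a different transform, it can reduce the feature-statistics gap between the skip path and heavier candidate operations without changing the search-space semantics.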
