
Fine-Grained Stochastic Architecture Search

State-of-the-art deep networks are often too large to deploy on mobile devices and embedded systems. Mobile neural architecture search (NAS) methods automate the design of small models, but state-of-the-art NAS methods are expensive to run. Differentiable neural architecture search (DNAS) methods reduce the search cost but explore a limited subspace of candidate architectures. In this paper, we introduce Fine-Grained Stochastic Architecture Search (FiGS), a differentiable search method that searches over a much larger set of candidate architectures. FiGS simultaneously selects and modifies operators in the search space by applying a structured sparse regularization penalty based on the Logistic-Sigmoid distribution. We show results across 3 existing search spaces, matching or outperforming the original search algorithms and producing state-of-the-art parameter-efficient models on ImageNet (e.g., 75.4% top-1 with 2.6M params). Using our architectures as backbones for object detection with SSDLite, we achieve significantly higher mAP on COCO (e.g., 25.8 with 3.0M params) than MobileNetV3 and MnasNet.
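The abstract mentions a structured sparse regularization penalty based on the Logistic-Sigmoid distribution but gives no implementation details. The following is a minimal, hypothetical PyTorch sketch of one common way such a stochastic gate is realized (a hard-concrete-style relaxation with an expected-L0 penalty); the class name, hyperparameter values, and channel-gating layout are assumptions for illustration, not the paper's exact formulation.

    import math

    import torch
    import torch.nn as nn


    class LogisticSigmoidGate(nn.Module):
        # Per-unit stochastic gate using a Logistic-Sigmoid (hard-concrete-style)
        # relaxation, with an expected-L0 penalty that encourages structured sparsity.
        # Hyperparameters beta/gamma/zeta are common defaults, not values from the paper.

        def __init__(self, num_units, beta=2.0 / 3.0, gamma=-0.1, zeta=1.1):
            super().__init__()
            self.log_alpha = nn.Parameter(torch.zeros(num_units))  # gate logits, learned jointly
            self.beta, self.gamma, self.zeta = beta, gamma, zeta

        def forward(self, x):
            if self.training:
                # Reparameterized sample: logistic noise pushed through a sigmoid.
                u = torch.rand_like(self.log_alpha).clamp(1e-6, 1.0 - 1e-6)
                s = torch.sigmoid((u.log() - (1.0 - u).log() + self.log_alpha) / self.beta)
            else:
                s = torch.sigmoid(self.log_alpha)
            # Stretch and clip so gates can saturate at exactly 0 (pruned) or 1 (kept).
            z = torch.clamp(s * (self.zeta - self.gamma) + self.gamma, 0.0, 1.0)
            return x * z.view(1, -1, 1, 1)  # gate channels of an NCHW feature map

        def expected_l0(self):
            # Expected number of active gates; summed over all gates in the network,
            # it forms the sparsity term added to the task loss.
            return torch.sigmoid(self.log_alpha - self.beta * math.log(-self.gamma / self.zeta)).sum()

In a setup like this, the total training objective would be the task loss plus a weighted sum of expected_l0() over all gated operators; operators whose gates collapse to zero are removed from the final architecture.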
