
Efficient Scale-Permuted Backbone with Learned Resource Distribution

Uploaded: 2021-01-24 07:40:38 · PDF · 489.71 KB


Recently, SpineNet has demonstrated promising results on object detection and image classification over ResNet models. However, it is unclear whether the improvement carries over when combining a scale-permuted backbone with advanced efficient operations and compound scaling. Furthermore, SpineNet is built with a uniform resource distribution over operations. While this strategy seems to be prevalent for scale-decreased models, it may not be an optimal design for scale-permuted models. In this work, we propose a simple technique to combine efficient operations and compound scaling with a previously learned scale-permuted architecture. We demonstrate that the efficiency of scale-permuted models can be further improved by learning a resource distribution over the entire network. The resulting efficient scale-permuted models outperform state-of-the-art EfficientNet-based models on object detection and achieve competitive performance on image classification and semantic segmentation. Code and models will be open-sourced soon.
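The two ingredients the abstract names can be sketched in a few lines. Below is a minimal, illustrative Python sketch: `compound_scale` implements the EfficientNet-style compound-scaling rule (depth, width, and resolution grow as powers of a single coefficient φ, here using the published EfficientNet-B0 constants), and `redistribute` is a hypothetical stand-in for a learned, non-uniform resource distribution that reweights per-block widths while preserving the total budget. Neither function is from the paper's actual code; the block names and weights are assumptions for illustration.

```python
def compound_scale(phi, alpha=1.2, beta=1.1, gamma=1.15):
    """EfficientNet-style compound scaling: return (depth, width,
    resolution) multipliers for compound coefficient phi.
    alpha/beta/gamma defaults are the EfficientNet-B0 constants."""
    return alpha ** phi, beta ** phi, gamma ** phi


def redistribute(base_widths, learned_weights):
    """Illustrative non-uniform resource distribution: rescale each
    block's width by a learned weight, then renormalize so the total
    width budget of the network is unchanged."""
    total = sum(base_widths)
    weighted = [w * b for w, b in zip(learned_weights, base_widths)]
    scale = total / sum(weighted)
    return [x * scale for x in weighted]


# Uniform weights leave the distribution unchanged; non-uniform weights
# shift capacity between blocks at constant total cost.
depth_mult, width_mult, res_mult = compound_scale(phi=1)
widths = redistribute([64, 128, 256], [1.0, 1.5, 0.8])
```

The renormalization step is what makes the comparison fair: the network's overall budget stays fixed, and only *where* the capacity is spent changes, which is the design question the paper raises for scale-permuted models.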

