
DyNet: Dynamic Convolution for Accelerating Convolutional Neural Networks

The convolution operator is the core of convolutional neural networks (CNNs) and accounts for most of their computation cost. To make CNNs more efficient, many methods have been proposed to either design lightweight networks or compress models. Although some efficient network structures have been proposed, such as MobileNet or ShuffleNet, we find that redundant information still exists between convolution kernels. To address this issue, we propose a novel dynamic convolution method that adaptively generates convolution kernels based on image content. To demonstrate its effectiveness, we apply dynamic convolution to multiple state-of-the-art CNNs. On one hand, the computation cost can be reduced remarkably while maintaining performance: for ShuffleNetV2/MobileNetV2/ResNet18/ResNet50, DyNet reduces FLOPs by 37.0%/54.7%/67.2%/71.3% without loss of accuracy. On the other hand, performance can be boosted substantially if the computation cost is kept fixed: based on the MobileNetV3-Small/Large architectures, DyNet achieves 70.3%/77.1% Top-1 accuracy on ImageNet, an improvement of 2.9/1.9 points. To verify scalability, we also apply DyNet to a segmentation task; the results show that it reduces FLOPs by 69.3% while maintaining Mean IoU.
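The central idea — generating a convolution kernel from the image content instead of using a fixed one — can be sketched as follows. This is a minimal NumPy illustration of the general dynamic-convolution recipe (global pooling → coefficient prediction → weighted fusion of candidate kernels), not the authors' implementation; all names (`dynamic_conv_kernel`, `w_fc`, `b_fc`, the number of candidate kernels) are hypothetical.

```python
import numpy as np

def dynamic_conv_kernel(x, fixed_kernels, w_fc, b_fc):
    """Generate a content-adaptive convolution kernel (illustrative sketch).

    x             : input feature map, shape (C, H, W)
    fixed_kernels : K candidate kernels, shape (K, C_out, C, kh, kw)
    w_fc, b_fc    : hypothetical coefficient-prediction head, shapes (K, C), (K,)
    """
    # 1. Summarize the image content with global average pooling.
    context = x.mean(axis=(1, 2))                    # (C,)
    # 2. Predict one mixing coefficient per candidate kernel.
    logits = w_fc @ context + b_fc                   # (K,)
    coeffs = np.exp(logits - logits.max())
    coeffs /= coeffs.sum()                           # softmax over candidates
    # 3. Fuse the candidates into a single kernel, so only one
    #    convolution needs to be executed at inference time.
    dynamic_kernel = np.tensordot(coeffs, fixed_kernels, axes=1)
    return dynamic_kernel                            # (C_out, C, kh, kw)
```

Because the coefficients collapse the K candidates into one kernel before the convolution is applied, the per-image cost is that of a single convolution plus a tiny pooling/FC head, which is what allows the FLOP reductions reported above.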
