
Exponential discretization of weights of neural network connections in pre-trained neural networks

Uploaded 2021-01-22 15:31:38 · .PDF file, 1.59 MB · 27 views


To reduce random access memory (RAM) requirements and to increase the speed of recognition algorithms, we consider the weight discretization problem for trained neural networks. We show that exponential discretization is preferable to linear discretization, since it achieves the same accuracy with 1 or 2 fewer bits. The quality of the VGG-16 neural network is already satisfactory (top5 accuracy 69%) with 3-bit exponential discretization. The ResNet50 neural network shows 84% top5 accuracy at 4 bits. Other neural networks perform fairly well at 5 bits (the top5 accuracies of Xception, Inception-v3, and MobileNet-v2 were 87%, 90%, and 77%, respectively). With fewer bits, the accuracy decreases rapidly.
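For orientation, here is a minimal NumPy sketch of the two discretization schemes the abstract compares. It is not the paper's code: the geometric base q, the per-tensor max-|w| scaling, and the sign-bit layout are illustrative assumptions.

```python
import numpy as np

def linear_discretize(w, bits):
    """Snap each weight to one of roughly 2**bits evenly spaced levels
    covering [-max|w|, +max|w|]."""
    scale = np.max(np.abs(w))
    step = 2.0 * scale / (2 ** bits - 1)
    return np.round(w / step) * step

def exponential_discretize(w, bits, q=2.0):
    """Snap each weight to the nearest level of the form ±scale * q**(-k).
    One bit stores the sign; the remaining bits index the exponent k, so
    the magnitude levels are spaced geometrically rather than evenly.
    The base q and the max-|w| scaling are assumptions of this sketch."""
    n_levels = 2 ** (bits - 1)                  # magnitude levels per sign
    scale = np.max(np.abs(w))
    mags = scale * q ** (-np.arange(n_levels))  # scale, scale/q, scale/q**2, ...
    # nearest magnitude in log space (a geometric grid is uniform there)
    log_w = np.log(np.abs(w[..., None]) + 1e-12)
    idx = np.argmin(np.abs(log_w - np.log(mags)), axis=-1)
    return np.sign(w) * mags[idx]

# Stand-in for a trained weight tensor: small values dominate, which is
# where the geometric grid places most of its levels.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.05, size=10_000)
for bits in (3, 4, 5):
    print(bits, "bits:",
          "linear MSE %.2e" % np.mean((w - linear_discretize(w, bits)) ** 2),
          "exponential MSE %.2e" % np.mean((w - exponential_discretize(w, bits)) ** 2))
```

The sketch only illustrates how the two level grids differ; the paper's stated comparison is end-to-end top5 accuracy of the discretized networks, not per-weight reconstruction error.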
