
Know Where To Drop Your Weights: Towards Faster Uncertainty Estimation

Estimating the epistemic uncertainty of models used in low-latency applications and for Out-Of-Distribution sample detection is challenging because uncertainty estimation techniques are computationally demanding. Estimating model uncertainty with approximation techniques such as Monte Carlo Dropout (MCD) and Monte Carlo DropConnect (MCDC) requires a large number of forward passes through the network, rendering them unsuitable for low-latency applications. We propose Select-DC, which uses a subset of layers in a neural network to model epistemic uncertainty with MCDC. Through our experiments, we show a significant reduction in the GFLOPS required to model uncertainty, compared to Monte Carlo DropConnect, with a marginal trade-off in performance. We perform a suite of experiments on the CIFAR 10, CIFAR 100, and SVHN datasets with ResNet and VGG models. We further show how applying DropConnect to various layers in the network, with different drop probabilities, affects the network's performance and the entropy of the predictive distribution.
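The abstract does not include an implementation, so the PyTorch sketch below is only a rough, hypothetical illustration of the idea rather than the authors' code: DropConnect is applied to a single chosen layer while the rest of the network stays deterministic, and predictive entropy is estimated from several stochastic forward passes. The class name DropConnectLinear, the drop_prob parameter, the toy model, and n_samples are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropConnectLinear(nn.Linear):
    """Linear layer that re-samples a Bernoulli mask over its weights
    on every forward pass (Monte Carlo DropConnect)."""
    def __init__(self, in_features, out_features, drop_prob=0.5):
        super().__init__(in_features, out_features)
        self.drop_prob = drop_prob

    def forward(self, x):
        # A fresh weight mask per call makes repeated passes stochastic,
        # which is what the Monte Carlo estimate relies on.
        keep = torch.bernoulli(torch.full_like(self.weight, 1.0 - self.drop_prob))
        return F.linear(x, self.weight * keep, self.bias)

# Select-DC idea (as sketched here): only the classifier head is stochastic;
# the feature extractor stays deterministic, so its cost is independent of
# the number of Monte Carlo samples drawn from the cheap stochastic head.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 256),                 # deterministic stand-in backbone
    nn.ReLU(),
    DropConnectLinear(256, 10, drop_prob=0.3),   # stochastic head
)

@torch.no_grad()
def predictive_entropy(model, x, n_samples=20):
    """Average the softmax over n stochastic passes, then compute the
    entropy of the resulting predictive distribution."""
    probs = torch.stack(
        [F.softmax(model(x), dim=-1) for _ in range(n_samples)]
    ).mean(dim=0)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)

x = torch.randn(4, 3, 32, 32)          # a CIFAR-10-sized batch
print(predictive_entropy(model, x))    # one uncertainty score per input
```

Higher entropy under this sampling scheme suggests greater epistemic uncertainty, which is the signal the paper evaluates for Out-Of-Distribution detection; restricting the stochastic layers is what reduces the GFLOPS relative to full Monte Carlo DropConnect.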
