Energy-based Out-of-distribution Detection
Determining whether inputs are out-of-distribution (OOD) is an essential building block for safely deploying machine learning models in the open world. However, previous methods relying on the softmax confidence score suffer from overconfident posterior distributions for OOD data. We propose a unified framework for OOD detection that uses an energy score. We show that energy scores better distinguish in- and out-of-distribution samples than the traditional approach using softmax scores. Unlike softmax confidence scores, energy scores are theoretically aligned with the probability density of the inputs and are less susceptible to the overconfidence issue. Within this framework, energy can be flexibly used as a scoring function for any pre-trained neural classifier, as well as a trainable cost function that shapes the energy surface explicitly for OOD detection. On a CIFAR-10 pre-trained WideResNet, using the energy score reduces the average FPR (at TPR 95%) by 18.03% compared to the softmax confidence score. With energy-based training, our method outperforms the state-of-the-art on common benchmarks.
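As a minimal sketch of the scoring function described above: the energy score of an input is the negative temperature-scaled log-sum-exp of the classifier's logits, E(x) = -T · log Σᵢ exp(fᵢ(x)/T), so lower energy indicates a more in-distribution input. The function names, the example logits, and the temperature T = 1 below are illustrative assumptions, not the paper's exact implementation; the contrast with the maximum-softmax-probability baseline is shown for reference.

```python
import math

def energy_score(logits, T=1.0):
    """Energy E(x) = -T * logsumexp(f(x) / T); lower => more in-distribution.
    The negative energy -E(x) can be used directly as an OOD detection score."""
    m = max(logits)  # subtract the max logit for numerical stability
    return -T * (m / T + math.log(sum(math.exp((l - m) / T) for l in logits)))

def softmax_confidence(logits):
    """Baseline score: maximum softmax probability (MSP)."""
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    return max(exps) / sum(exps)

# Illustrative logits: a peaked (confident) prediction vs. a flat one.
peaked = [10.0, 0.0, 0.0]
flat = [1.0, 1.0, 1.0]

# The peaked prediction receives lower energy (more in-distribution).
print(energy_score(peaked), energy_score(flat))
print(softmax_confidence(peaked), softmax_confidence(flat))
```

In practice a threshold on the (negative) energy decides OOD membership; the paper's trainable variant further shapes this energy surface during fine-tuning.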