Multi-Representation Ensemble in Few-Shot Learning
Deep neural networks (DNNs) compute representations layer by layer, producing a final representation at the top of the pipeline, and classification or regression is performed on this final representation. A number of DNN architectures (e.g., ResNet, DenseNet) have shown that representations from earlier layers can be beneficial: they improve performance by aggregating representations from different layers. In this work, we ask whether, beyond forming such an aggregation, these representations can be fed directly to the classification layer(s) to obtain better performance. We began our investigation by examining classifiers built on the representations from different layers and observed that these classifiers were diverse and that many of their decisions were complementary to each other, hence having the potential to produce a better overall decision when combined. Following this observation, we propose an ensemble method that builds an ensemble of classifiers, each taking a representation from a different depth of a base DNN as input. We tested this ensemble method in the few-shot learning setting. Experiments were conducted on the mini-ImageNet and tieredImageNet datasets, which are commonly used to evaluate few-shot learning methods. Our ensemble achieves new state-of-the-art results on both datasets compared to previous regular and ensemble approaches.
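The core idea can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: it fabricates feature vectors at three hypothetical depths for a 5-way few-shot episode, fits a simple nearest-centroid (prototype) classifier per depth, and averages the per-depth class probabilities to form the ensemble decision. All names, dimensions, and the choice of prototype classifiers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


def prototype_logits(support, support_labels, query, n_classes):
    # Nearest-centroid (prototype) classifier for one depth's features:
    # the negative squared distance to each class mean serves as the logit.
    protos = np.stack([support[support_labels == c].mean(axis=0)
                       for c in range(n_classes)])
    dists = ((query[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return -dists


def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)


# Fabricated 5-way episode with 3 query images per class.
n_classes, n_depths, dim = 5, 3, 16
support_labels = np.arange(n_classes)               # one shot per class
query_labels = np.repeat(np.arange(n_classes), 3)

# One feature matrix per depth; in the real method these would come from
# intermediate layers of a trained backbone. Here each depth sees the same
# class structure plus independent noise, so the classifiers are diverse.
class_means = 3.0 * rng.normal(size=(n_classes, dim))
depth_feats = []
for _ in range(n_depths):
    support = class_means + rng.normal(size=(n_classes, dim))
    query = class_means[query_labels] + rng.normal(size=(len(query_labels), dim))
    depth_feats.append((support, query))

# Ensemble: average the per-depth classifiers' probability outputs.
probs = np.mean(
    [softmax(prototype_logits(s, support_labels, q, n_classes))
     for s, q in depth_feats],
    axis=0,
)
pred = probs.argmax(axis=1)
accuracy = (pred == query_labels).mean()
```

Averaging probabilities is only one plausible combination rule; the complementarity observed in the paper suggests that any aggregation exploiting disagreement among the per-depth classifiers could help.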