Layer Ensembles
Deep Ensembles, a type of Bayesian neural network, estimate uncertainty by
collecting the predictions of multiple independently trained networks and
measuring the disagreement among those predictions. In this paper, we
introduce a novel method for uncertainty estimation called Layer Ensembles,
which assigns an independent categorical distribution to each layer of the
network; because samples can share layers, this yields many more possible
network samples than regular Deep Ensembles. We
further introduce Optimized Layer Ensembles with an inference procedure that
reuses common layer outputs, achieving up to a 19x speed-up and quadratically
reducing memory usage. We also show that Layer Ensembles can be further
improved by ranking samples, resulting in models that require less memory and
time to run while achieving higher uncertainty quality than Deep Ensembles.
Comment: 5 pages, 4 figures. This work has been submitted to the IEEE for
possible publication. Copyright may be transferred without notice, after
which this version may no longer be accessible.
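To make the idea concrete, here is a minimal PyTorch sketch of per-layer
ensembling, assuming a small MLP; the layer sizes, the number of candidates
per layer, and the variance-based uncertainty readout are illustrative
choices, not the authors' exact setup.

    import itertools
    import torch
    import torch.nn as nn

    K = 3                                      # ensembled candidates per layer
    sizes = [(784, 128), (128, 64), (64, 10)]  # illustrative MLP dimensions

    # One independent set of K candidate layers per depth; a network "sample"
    # is one categorical choice per depth, so there are K ** len(sizes) = 27
    # possible samples instead of the 3 full networks a Deep Ensemble gives.
    layers = nn.ModuleList(
        [nn.ModuleList([nn.Linear(i, o) for _ in range(K)]) for i, o in sizes]
    )

    def forward_sample(x, choice):
        # choice[d] selects which candidate layer to use at depth d. Samples
        # sharing a prefix of choices could reuse those activations, which is
        # the intuition behind the Optimized Layer Ensembles inference.
        for d, k in enumerate(choice):
            x = layers[d][k](x)
            if d < len(choice) - 1:
                x = torch.relu(x)
        return x

    x = torch.randn(1, 784)
    preds = torch.stack([
        forward_sample(x, c).softmax(-1)
        for c in itertools.product(range(K), repeat=len(sizes))
    ])
    uncertainty = preds.var(dim=0)  # disagreement across samples as uncertainty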
Deep Convolutional Neural Network Ensembles using ECOC
Deep neural networks have enhanced the performance of decision making systems
in many applications including image understanding, and further gains can be
achieved by constructing ensembles. However, designing an ensemble of deep
networks is often not very beneficial since the time needed to train the
networks is very high or the performance gain obtained is not very significant.
In this paper, we analyse the error correcting output coding (ECOC) framework
as an ensemble technique for deep networks and propose different design
strategies to address the accuracy-complexity trade-off. We carry out an
extensive comparative study between the introduced ECOC designs and the
state-of-the-art ensemble techniques such as ensemble averaging and gradient
boosting decision trees. Furthermore, we propose a combinatory technique which
is shown to achieve the highest classification performance among all methods
compared.
Comment: 13 pages, double-column, IEEE Transactions style.
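As a concrete illustration of the ECOC construction, the sketch below assigns
each class a binary codeword, trains one binary learner per code column, and
decodes by nearest codeword in Hamming distance. It is a hedged toy example:
the paper's base learners are deep networks, whereas a scikit-learn logistic
regression on synthetic data stands in here, and the code matrix is
hand-picked.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hand-picked 4-class, 8-bit code matrix: row c is class c's codeword.
    codes = np.array([
        [0, 0, 0, 1, 1, 1, 0, 1],
        [0, 1, 1, 0, 0, 1, 1, 0],
        [1, 0, 1, 0, 1, 0, 1, 1],
        [1, 1, 0, 1, 0, 0, 0, 0],
    ])

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))              # synthetic features
    y = rng.integers(0, len(codes), size=200)  # synthetic labels

    # One binary classifier per code column, trained on the relabelled problem
    # "does this sample's class have bit b set?".
    learners = [LogisticRegression().fit(X, codes[y, b])
                for b in range(codes.shape[1])]

    def predict(X):
        bits = np.column_stack([clf.predict(X) for clf in learners])
        # Decode: the class whose codeword is nearest in Hamming distance
        # wins, so a few per-bit mistakes can still be corrected.
        dists = (bits[:, None, :] != codes[None, :, :]).sum(-1)
        return dists.argmin(axis=1)

    print(predict(X[:5]))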
Hierarchical Pruning of Deep Ensembles with Focal Diversity
Deep neural network ensembles combine the wisdom of multiple deep neural
networks to improve the generalizability and robustness over individual
networks. Deep ensemble techniques have gained increasing popularity in the
deep learning community. Some mission-critical applications utilize a
large number of deep neural networks to form deep ensembles to achieve desired
accuracy and resilience, which introduces high time and space costs for
ensemble execution. However, it remains an open challenge whether a small
subset of the entire deep ensemble can achieve the same or better
generalizability, and how to effectively identify such small deep ensembles to
improve the space and time efficiency of ensemble execution. This paper
presents a novel deep ensemble pruning approach, which can efficiently identify
smaller deep ensembles that provide higher ensemble accuracy than the entire
deep ensemble of a large number of member networks. Our hierarchical ensemble
pruning approach (HQ) leverages three novel ensemble pruning techniques. First,
we show that the focal diversity metrics can accurately capture the
complementary capacity of the member networks of an ensemble, which can guide
ensemble pruning. Second, we design a focal diversity based hierarchical
pruning approach, which iteratively finds high-quality deep ensembles with
low cost and high accuracy. Third, we develop a focal diversity consensus
method to integrate multiple focal diversity metrics to refine ensemble pruning
results, where smaller deep ensembles can be effectively identified to offer
high accuracy, high robustness and high efficiency. Evaluated using popular
benchmark datasets, we demonstrate that the proposed hierarchical ensemble
pruning approach can effectively identify high quality deep ensembles with
better generalizability while being more time and space efficient in ensemble
decision making.
Comment: To appear in ACM Transactions on Intelligent Systems and Technology.
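The abstract does not spell out the focal diversity metrics, so the sketch
below uses plain pairwise disagreement as a stand-in diversity score and an
exhaustive search over small subsets in place of the paper's hierarchical HQ
procedure; member predictions are simulated. It only illustrates the pruning
idea that a small, diverse subset can match or beat the full ensemble.

    import itertools
    import numpy as np

    rng = np.random.default_rng(1)
    n_members, n_samples, n_classes = 6, 500, 10
    labels = rng.integers(0, n_classes, n_samples)
    # Simulated validation-set predictions: each member is right ~70% of the
    # time, otherwise it guesses a random class.
    noise = rng.integers(0, n_classes, (n_members, n_samples))
    preds = np.where(rng.random((n_members, n_samples)) < 0.7, labels, noise)

    def ensemble_acc(subset):
        # Majority vote over the subset's predictions.
        votes = preds[list(subset)]
        maj = np.array([np.bincount(col, minlength=n_classes).argmax()
                        for col in votes.T])
        return (maj == labels).mean()

    def diversity(subset):
        # Stand-in for focal diversity: mean pairwise disagreement rate.
        pairs = itertools.combinations(subset, 2)
        return np.mean([(preds[i] != preds[j]).mean() for i, j in pairs])

    # Score every 2- and 3-member subset, trading accuracy against diversity;
    # the 0.1 weight is arbitrary.
    candidates = [s for k in (2, 3)
                  for s in itertools.combinations(range(n_members), k)]
    best = max(candidates, key=lambda s: ensemble_acc(s) + 0.1 * diversity(s))
    print(best, ensemble_acc(best), ensemble_acc(tuple(range(n_members))))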