Evolving Ensemble Fuzzy Classifier
The concept of ensemble learning offers a promising avenue for learning from
data streams in complex environments because it addresses the bias-variance
dilemma better than a single-model counterpart and features a reconfigurable
structure, which is well suited to this setting. While various extensions of
ensemble learning for mining non-stationary data streams can be found in the
literature, most of them are built on a static base classifier and revisit
preceding samples in a sliding window during retraining. This design incurs
prohibitive computational complexity and is not flexible enough to cope with
rapidly changing environments. Their complexity is often high because they
maintain a large collection of offline classifiers, lacking both a mechanism
to reduce structural complexity and an online feature selection mechanism. A novel evolving
ensemble classifier, namely Parsimonious Ensemble pENsemble, is proposed in
this paper. pENsemble differs from existing architectures in that it is built
upon an evolving classifier for data streams, termed Parsimonious Classifier
(pClass). pENsemble is equipped with an ensemble pruning mechanism, which
estimates the localized generalization error of each base classifier. A
dynamic online feature selection scenario is integrated into pENsemble,
allowing input features to be selected and deselected on the fly. pENsemble
adopts a dynamic ensemble structure to output a final classification decision
and features a novel drift detection scenario to grow the ensemble structure.
The efficacy of pENsemble has been demonstrated through rigorous numerical
studies with dynamic and evolving data streams, where it delivers the most
encouraging performance in attaining a tradeoff between accuracy and complexity.
Comment: this paper has been published in IEEE Transactions on Fuzzy Systems
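The pruning idea described above can be illustrated with a minimal sketch:
base classifiers whose estimated error on recent samples exceeds a threshold
are dropped from the ensemble. The function names, error estimate, and
threshold below are illustrative stand-ins, not the paper's actual localized
generalization error measure.

```python
# Hypothetical sketch of ensemble pruning by a recent-error estimate,
# in the spirit of pENsemble (names and threshold are illustrative).

def localized_error(errors):
    """Mean 0/1 error of a base classifier on recent (local) samples."""
    return sum(errors) / len(errors)

def prune_ensemble(ensemble, recent_errors, threshold=0.4):
    """Keep base classifiers whose recent error stays below the threshold,
    but never prune the ensemble down to zero members."""
    kept = [clf for clf, errs in zip(ensemble, recent_errors)
            if localized_error(errs) < threshold]
    # Fall back to the single best classifier if everything was pruned.
    return kept or [min(zip(ensemble, recent_errors),
                        key=lambda pair: localized_error(pair[1]))[0]]

ensemble = ["clf_a", "clf_b", "clf_c"]
recent = [[0, 0, 1, 0], [1, 1, 1, 0], [0, 1, 0, 0]]
print(prune_ensemble(ensemble, recent))  # ['clf_a', 'clf_c']
```

Here clf_b is pruned because its recent error (0.75) exceeds the threshold,
while the other two members (error 0.25 each) are retained.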
Energy-efficient Amortized Inference with Cascaded Deep Classifiers
Deep neural networks have been remarkably successful in various AI tasks but
often incur high computation and energy costs, which is problematic for
energy-constrained applications such as mobile sensing. We address this
problem by proposing a novel framework that optimizes prediction accuracy and
energy cost simultaneously, thus enabling an effective cost-accuracy trade-off
at test time. In our framework, each
data instance is pushed into a cascade of deep neural networks with increasing
sizes, and a selection module is used to sequentially determine when a
sufficiently accurate classifier can be used for this data instance. The
cascade of neural networks and the selection module are jointly trained in an
end-to-end fashion by the REINFORCE algorithm to optimize a trade-off between
the computational cost and the predictive accuracy. Our method simultaneously
improves accuracy and efficiency by learning to assign easy instances to fast
yet sufficiently accurate classifiers to save computation and energy cost,
while assigning harder instances to deeper and more powerful classifiers to
ensure satisfactory accuracy. Through extensive experiments on several image
classification datasets with cascaded ResNet classifiers, we demonstrate that
our method outperforms standard well-trained ResNets in accuracy while
requiring less than 20% and 50% of the FLOPs cost on the CIFAR-10 and
CIFAR-100 datasets, respectively, and 66% on the ImageNet dataset.
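The cascade described above can be sketched as follows: an instance is passed
through models of increasing size, stopping as soon as a prediction is
confident enough. This sketch uses a fixed confidence threshold as the
stopping rule purely for illustration; the paper instead learns the selection
module jointly with the cascade via REINFORCE. All model names and costs
below are made up.

```python
# Hedged sketch of cascaded inference with an early-exit rule.
# models: list of (cost, predict_fn) pairs ordered small -> large,
# where predict_fn(x) returns (label, confidence).

def cascade_predict(models, x, threshold=0.9):
    """Return the first sufficiently confident prediction and the
    total compute cost spent so far."""
    spent = 0
    label = None
    for cost, predict in models:
        spent += cost
        label, confidence = predict(x)
        if confidence >= threshold:
            break  # easy instance: stop early, saving the larger models
    return label, spent

# Toy cascade: a cheap model confident on "easy" inputs, a large fallback.
small = (1, lambda x: ("cat", 0.95) if x == "easy" else ("cat", 0.6))
large = (10, lambda x: ("dog", 0.99))

print(cascade_predict([small, large], "easy"))  # ('cat', 1)
print(cascade_predict([small, large], "hard"))  # ('dog', 11)
```

Easy instances exit after the cheap model (cost 1), while hard instances pay
for the full cascade (cost 11), which is exactly the amortization the
abstract describes.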
An Overview of Classifier Fusion Methods
A number of classifier fusion methods have recently been developed, opening
an alternative approach that can potentially improve classification
performance. As there is little theory of information fusion itself, we
currently face different methods designed for different problems and
producing different results. This paper gives an overview of classifier
fusion methods and attempts to identify new trends that may dominate this
area of research in the future. A taxonomy of fusion methods, aiming to bring
some order into the existing “pudding of diversities”, is also provided.
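One of the simplest fusion rules that such overviews typically cover is
majority voting over the base classifiers' outputs; weighted voting and
probability-product rules follow the same shape. A minimal sketch, not tied
to any particular taxonomy in the paper:

```python
# Minimal sketch of classifier fusion by simple majority voting.
from collections import Counter

def majority_vote(predictions):
    """predictions: one label per base classifier.
    Returns the most common label (ties broken by first-reached count)."""
    return Counter(predictions).most_common(1)[0][0]

print(majority_vote(["spam", "ham", "spam"]))  # spam
```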