Optimal AdaBoost Converges
The following work is a preprint collection of formal proofs concerning the
convergence properties of the AdaBoost machine learning algorithm's combined
classifier and margins. Numerous papers in mathematics and computer science
have addressed conjectures and special cases of these convergence properties,
and the margins of AdaBoost feature prominently in the research surrounding
the algorithm. The culmination of this paper shows that AdaBoost's classifier
and margins converge to a value consistent with decades of prior research.
We then show how various quantities associated with the combined classifier
converge.
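The central quantity here, the normalized margin of a training example under AdaBoost's combined classifier, can be computed directly. A minimal sketch (not the paper's proofs), assuming scikit-learn is available; the dataset and hyperparameters are purely illustrative:

```python
# Normalized AdaBoost margin of example i:
#   margin_i = y_i * sum_t(alpha_t * h_t(x_i)) / sum_t(alpha_t),  in [-1, 1].
# As boosting rounds grow, the margin distribution is what the theory tracks.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=200, random_state=0)
y_pm = 2 * y - 1  # map labels {0, 1} -> {-1, +1}

clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X, y)

# Weighted vote of the base learners, normalized by the total weight,
# so every margin lies in [-1, 1].
vote = np.zeros(len(X))
for w, est in zip(clf.estimator_weights_, clf.estimators_):
    vote += w * (2 * est.predict(X) - 1)
margins = y_pm * vote / clf.estimator_weights_.sum()
print("min margin:", margins.min(), "mean margin:", margins.mean())
```

Plotting the empirical distribution of `margins` across boosting rounds is the standard way to visualize the convergence behavior the abstract refers to.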
Exploiting diversity for optimizing margin distribution in ensemble learning
Margin distribution is acknowledged as an important factor for improving the generalization performance of classifiers. In this paper, we propose a novel ensemble learning algorithm named Double Rotation Margin Forest (DRMF), which aims to improve the margin distribution of the combined system over the training set. We utilise random rotation to produce diverse base classifiers, and optimize the margin distribution to exploit the diversity for producing an optimal ensemble. We demonstrate that diverse base classifiers are beneficial in deriving large-margin ensembles, and that therefore our proposed technique will lead to good generalization performance. We examine our method on an extensive set of benchmark classification tasks. The experimental results confirm that DRMF outperforms other classical ensemble algorithms such as Bagging, AdaBoostM1 and Rotation Forest. The success of DRMF is explained from the viewpoints of margin distribution and diversity.
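The full DRMF algorithm uses a double-rotation construction and explicitly optimizes the margin distribution; the sketch below illustrates only its first ingredient, random orthogonal rotations as a source of base-classifier diversity, combined with a plain majority vote. Everything here (dataset, ensemble size, majority voting) is an illustrative simplification, not the authors' method:

```python
# Each base tree sees the data under a different random orthogonal rotation.
# Rotations decorrelate the trees' axis-aligned splits, increasing diversity.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

def random_rotation(d, rng):
    # QR decomposition of a Gaussian matrix gives a random orthogonal matrix
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return Q

rotations, trees = [], []
for _ in range(25):
    R = random_rotation(X.shape[1], rng)
    trees.append(DecisionTreeClassifier(random_state=0).fit(Xtr @ R, ytr))
    rotations.append(R)

# Majority vote over the rotated-view trees.
votes = np.mean([t.predict(Xte @ R) for t, R in zip(trees, rotations)], axis=0)
pred = (votes >= 0.5).astype(int)
print("ensemble accuracy:", (pred == yte).mean())
```

Because each rotation is orthogonal, distances are preserved, so diversity comes purely from how the trees' axis-aligned split boundaries interact with the rotated coordinate frame.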