
Bagging and boosting classification trees to predict churn.

Abstract

In this paper, bagging and boosting techniques are proposed as effective tools for churn prediction. These methods consist of sequentially applying a classification algorithm to resampled or reweighted versions of the data set. We apply these algorithms to a customer database of an anonymous U.S. wireless telecom company. Bagging is easy to put into practice and, like boosting, leads to a significant increase in classification performance when applied to the customer database. Furthermore, we compare bagged and boosted classifiers estimated, respectively, from a balanced versus a proportional sample to predict a rare event (here, churn), and propose a simple correction method for classifiers constructed from balanced training samples.

Keywords: Algorithms; Bagging; Boosting; Churn; Classification; Classifiers; Companies; Data; Gini coefficient; Methods; Performance; Rare events; Sampling; Top decile; Training
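The sketch below is not the authors' implementation; it is a minimal illustration of the general idea, assuming a binary churn label and using scikit-learn's bagging and AdaBoost ensembles of classification trees, evaluated with AUC and a hypothetical top-decile lift helper on synthetic data standing in for the proprietary customer database.

```python
# Minimal sketch (not the paper's code): bagging and boosting of classification
# trees for churn prediction, with churn treated as a rare event.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the customer database: roughly 5% churners.
X, y = make_classification(n_samples=20_000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0)

base_tree = DecisionTreeClassifier(max_depth=4, random_state=0)
models = {
    "single tree": base_tree,
    # Bagging: trees fit on bootstrap resamples of the training set.
    "bagging": BaggingClassifier(estimator=base_tree, n_estimators=100,
                                 random_state=0),
    # Boosting: trees fit sequentially on reweighted versions of the data.
    "boosting": AdaBoostClassifier(estimator=base_tree, n_estimators=100,
                                   random_state=0),
}

def top_decile_lift(y_true, scores):
    """Churn rate among the top-scored 10% divided by the overall churn rate."""
    n_top = max(1, len(scores) // 10)
    top = np.argsort(scores)[::-1][:n_top]
    return y_true[top].mean() / y_true.mean()

for name, model in models.items():
    scores = model.fit(X_train, y_train).predict_proba(X_test)[:, 1]
    print(f"{name:12s}  AUC={roc_auc_score(y_test, scores):.3f}  "
          f"top-decile lift={top_decile_lift(y_test, scores):.2f}")
```

On real churn data one would expect the bagged and boosted ensembles to outperform the single tree, which is the pattern the paper reports; training on a balanced sample additionally requires re-adjusting the predicted churn probabilities to the true class prior before they can be interpreted at the population level.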
