
    Ensemble of Example-Dependent Cost-Sensitive Decision Trees

    Several real-world classification problems are example-dependent cost-sensitive in nature, where misclassification costs vary between examples and not only between classes. However, standard classification methods do not take these costs into account and assume a constant cost for all misclassification errors. Previous work has proposed methods that incorporate the financial costs into the training of different algorithms, with the example-dependent cost-sensitive decision tree algorithm yielding the highest savings. In this paper we propose a new framework of ensembles of example-dependent cost-sensitive decision trees. The framework consists of creating different example-dependent cost-sensitive decision trees on random subsamples of the training set and then combining them using three different combination approaches. Moreover, we propose two new cost-sensitive combination approaches: cost-sensitive weighted voting and cost-sensitive stacking, the latter based on the cost-sensitive logistic regression method. Finally, using five databases from four real-world applications: credit card fraud detection, churn modeling, credit scoring and direct marketing, we evaluate the proposed method against state-of-the-art example-dependent cost-sensitive techniques, namely cost-proportionate sampling, Bayes minimum risk and cost-sensitive decision trees. The results show that the proposed algorithms perform better on all databases, in the sense of higher savings. Comment: 13 pages, 6 figures, submitted for possible publication.
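    To make the combination scheme concrete, here is a minimal sketch of subsampled trees combined by cost-sensitive weighted voting. Plain scikit-learn decision trees stand in for the paper's example-dependent cost-sensitive trees, the synthetic data and cost columns are invented for illustration, and the savings-based out-of-bag weighting is an assumption about the general idea, not the authors' exact formulation.

```python
# Sketch: bagged decision trees combined by cost-sensitive weighted voting.
# Stand-in trees, synthetic data, and illustrative costs; not the paper's code.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def savings(y_true, y_pred, fp_cost, fn_cost):
    """Fraction of the cheapest trivial classifier's cost that is avoided."""
    cost = np.sum(fp_cost * ((y_pred == 1) & (y_true == 0))
                  + fn_cost * ((y_pred == 0) & (y_true == 1)))
    base = min(np.sum(fn_cost[y_true == 1]),   # cost of predicting all negative
               np.sum(fp_cost[y_true == 0]))   # cost of predicting all positive
    return 1.0 - cost / base

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + rng.normal(scale=0.5, size=1000) > 0).astype(int)
fp_cost = np.full(1000, 1.0)                   # constant false-positive cost
fn_cost = rng.uniform(1, 20, size=1000)        # example-dependent false-negative cost

trees, weights = [], []
for _ in range(25):
    idx = rng.choice(1000, size=500, replace=True)   # random subsample of training set
    tree = DecisionTreeClassifier(max_depth=5).fit(X[idx], y[idx])
    oob = np.setdiff1d(np.arange(1000), idx)         # out-of-bag examples
    s = savings(y[oob], tree.predict(X[oob]), fp_cost[oob], fn_cost[oob])
    trees.append(tree)
    weights.append(max(s, 0.0))                      # weight each tree by its OOB savings

w = np.asarray(weights) / np.sum(weights)
votes = np.stack([t.predict(X) for t in trees])      # shape: (n_trees, n_samples)
y_hat = (w @ votes > 0.5).astype(int)                # cost-sensitive weighted vote
print("ensemble savings:", round(savings(y, y_hat, fp_cost, fn_cost), 3))
```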

    Driver Distraction Identification with an Ensemble of Convolutional Neural Networks

    The World Health Organization (WHO) reports 1.25 million deaths yearly due to road traffic accidents worldwide, and the number has been continuously increasing over the last few years. Nearly a fifth of these accidents are caused by distracted drivers. Existing work on distracted-driver detection is concerned with a small set of distractions (mostly cell phone usage), and unreliable ad-hoc methods are often used. In this paper, we present the first publicly available dataset for driver distraction identification with more distraction postures than existing alternatives. In addition, we propose a reliable deep learning-based solution that achieves 90% accuracy. The system consists of a genetically weighted ensemble of convolutional neural networks; we show that weighting an ensemble of classifiers with a genetic algorithm yields better classification confidence. We also study the effect of different visual elements on distraction detection by means of face and hand localization and skin segmentation. Finally, we present a thinned version of our ensemble that achieves 84.64% classification accuracy and operates in a real-time environment. Comment: arXiv admin note: substantial text overlap with arXiv:1706.0949
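    The genetic weighting step can be sketched independently of the CNNs themselves: given each network's softmax outputs on a validation set, a small genetic algorithm evolves one weight per model so that the weighted average of probabilities maximizes validation accuracy. The random stand-in probabilities, population size, and mutation scale below are illustrative assumptions, not the paper's settings.

```python
# Sketch: genetic-algorithm search for ensemble weights over fixed model outputs.
# The "softmax outputs" are random stand-ins; plug in real CNN predictions instead.
import numpy as np

rng = np.random.default_rng(1)
n_models, n_val, n_classes = 5, 400, 10
probs = rng.dirichlet(np.ones(n_classes), size=(n_models, n_val))  # stand-in softmax outputs
y_val = rng.integers(0, n_classes, size=n_val)                     # stand-in validation labels

def fitness(w):
    fused = np.tensordot(w / w.sum(), probs, axes=1)  # weighted average of probabilities
    return np.mean(fused.argmax(axis=1) == y_val)     # validation accuracy

pop = rng.uniform(0.01, 1.0, size=(30, n_models))     # initial population of weight vectors
for _ in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]           # elitism: keep the 10 fittest
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(10, size=2)]
        child = np.where(rng.random(n_models) < 0.5, a, b)   # uniform crossover
        child += rng.normal(scale=0.05, size=n_models)       # Gaussian mutation
        children.append(np.clip(child, 1e-6, None))          # keep weights positive
    pop = np.vstack([parents, np.array(children)])

best = max(pop, key=fitness)
print("evolved ensemble weights:", np.round(best / best.sum(), 3))
```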

    An improved multiple classifier combination scheme for pattern classification

    Combining multiple classifiers is considered a new direction in pattern recognition for improving classification performance. The main problem in multiple classifier combination is that there is no standard guideline for constructing an accurate and diverse classifier ensemble, owing to the difficulty of identifying the number of homogeneous classifiers and of deciding how to combine the classifier outputs. The most commonly used ensemble method is the random strategy, with the majority voting technique as the combiner. However, the random strategy cannot determine the number of classifiers, and majority voting does not consider the strength of each classifier, resulting in low classification accuracy. In this study, an improved multiple classifier combination scheme is proposed. The ant system (AS) algorithm is used to partition the feature set into feature subsets, which determine the number of classifiers. A compactness measure is introduced as a parameter for constructing an accurate and diverse classifier ensemble. A weighted voting technique combines the classifier outputs by considering the strength of each classifier prior to voting. Experiments were performed on benchmark datasets using four base classifiers: Nearest Mean Classifier (NMC), Naive Bayes Classifier (NBC), k-Nearest Neighbour (k-NN) and Linear Discriminant Analysis (LDA), to test the credibility of the proposed scheme. The average classification accuracies of the homogeneous NMC, NBC, k-NN and LDA ensembles are 97.91%, 98.06%, 98.09% and 98.12% respectively, higher than those obtained with other approaches to multiple classifier combination. The proposed scheme should help in developing other multiple classifier combinations for pattern recognition and classification.
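    A minimal sketch of the strength-weighted voting combiner follows. Random disjoint feature subsets stand in for the ant-system partition, and per-classifier training accuracy stands in for the paper's strength/compactness measure; both are assumptions made for illustration.

```python
# Sketch: homogeneous classifiers on disjoint feature subsets, combined by
# strength-weighted voting. Random subsets stand in for the AS partition.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
subsets = rng.permutation(X.shape[1]).reshape(2, 2)  # two disjoint 2-feature subsets

models, strengths = [], []
for subset in subsets:
    clf = GaussianNB().fit(X_tr[:, subset], y_tr)
    models.append((clf, subset))
    strengths.append(clf.score(X_tr[:, subset], y_tr))  # stand-in "strength" measure

def weighted_vote(Xq):
    tally = np.zeros((len(Xq), len(np.unique(y))))
    for (clf, subset), w in zip(models, strengths):
        # Each classifier casts a vote scaled by its strength.
        tally[np.arange(len(Xq)), clf.predict(Xq[:, subset])] += w
    return tally.argmax(axis=1)

print("ensemble accuracy:", round(float(np.mean(weighted_vote(X_te) == y_te)), 3))
```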