
    Feature Selection for Interpatient Supervised Heart Beat Classification

    Supervised, inter-patient classification of heart beats is essential in many applications requiring long-term monitoring of cardiac function. Several classification models able to cope with the strong class imbalance, and a large variety of feature sets, have been proposed for this task. In practice, over 200 features are often considered, and the features retained in the final model are chosen either from domain knowledge or by an exhaustive search over feature sets, without evaluating the relevance of each individual feature included in the classifier. As a consequence, the results obtained by these models can be suboptimal and difficult to interpret. In this work, feature selection techniques are used to extract optimal feature subsets for state-of-the-art ECG classification models. Performance is evaluated on real ambulatory recordings and compared to previously reported feature choices using the same models. The results indicate that only a small number of individual features actually serve the classification, and that better performance can be achieved by removing useless features.
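    The abstract does not name a specific selection algorithm, but the general idea (a wrapper that greedily keeps only features that help a class-weighted classifier, scored with balanced accuracy) can be sketched with scikit-learn. The data, classifier, and parameter choices below are illustrative assumptions, not the authors' pipeline.

```python
# Illustrative sketch (not the authors' exact pipeline): greedy forward
# feature selection for a class-imbalanced heart-beat classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a heart-beat feature matrix (RR intervals, morphology,
# wavelet coefficients, ...); real studies often start from 200+ features.
X, y = make_classification(n_samples=2000, n_features=50, n_informative=8,
                           weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Class weighting copes with the strong class imbalance mentioned in the abstract.
base = LogisticRegression(class_weight="balanced", max_iter=1000)

# Greedy forward selection keeps only features that actually help the classifier.
selector = SequentialFeatureSelector(base, n_features_to_select=10,
                                     scoring="balanced_accuracy", cv=3)
selector.fit(X_train, y_train)

clf = base.fit(selector.transform(X_train), y_train)
pred = clf.predict(selector.transform(X_test))
print("kept features:", np.flatnonzero(selector.get_support()))
print("balanced accuracy:", balanced_accuracy_score(y_test, pred))
```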

    Dynamic inferential NOx


    A Comparison of Multi-Label Feature Selection Methods Using the Random Forest Paradigm

    In this paper, we discuss three wrapper multi-label feature selection methods based on the Random Forest paradigm. These variants differ in the way they take label dependence into account within the feature selection process. To assess their performance, we conduct an extensive experimental comparison of these strategies against recently proposed approaches, using seven benchmark multi-label data sets from different domains. Random Forest handles feature selection accurately in the multi-label context. Surprisingly, taking the dependence between labels into account in ensemble multi-label feature selection was not found to be very effective.
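    As a rough illustration of Random-Forest-based multi-label feature selection in the Binary Relevance style (one possible variant; the paper's wrapper methods and their treatment of label dependence are not reproduced here), the following scikit-learn sketch ranks features by importance averaged over per-label forests. All data and parameters are invented for the example.

```python
# Rough sketch: rank features by Random Forest importance averaged across labels.
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.ensemble import RandomForestClassifier

X, Y = make_multilabel_classification(n_samples=500, n_features=40,
                                      n_classes=6, random_state=0)

# Binary-Relevance style: one forest per label, importances accumulated.
importances = np.zeros(X.shape[1])
for j in range(Y.shape[1]):
    rf = RandomForestClassifier(n_estimators=200, random_state=j)
    rf.fit(X, Y[:, j])
    importances += rf.feature_importances_
importances /= Y.shape[1]

# Keep the k most important features across all labels.
k = 15
selected = np.argsort(importances)[::-1][:k]
print("selected feature indices:", np.sort(selected))
```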

    Semantic Channel and Shannon’s Channel Mutually Match for Multi-label Classification

    A semantic channel consists of a set of membership functions, or truth functions, which indicate the denotations of a set of labels. In multi-label learning, we obtain a semantic channel from a sampling distribution or from a Shannon channel. If the sample is huge, we can directly convert a Shannon channel into a semantic channel by the third kind of Bayes' theorem; otherwise, we can optimize the membership functions with a generalized Kullback–Leibler formula. In multi-label classification, we partition an instance space with the maximum semantic information criterion, which is a special Regularized Least Squares (RLS) criterion and is equivalent to the maximum likelihood criterion. To simplify the learning, we may obtain only the truth functions of some atomic labels and use them to construct the truth functions of compound labels. In a label's learning, instances are divided into three kinds (positive, negative, and unclear) instead of two kinds as in the One-vs-Rest or Binary Relevance (BR) method. Every label's learning is independent, as in the BR method; however, a label may be trained without negative examples, and a number of binary classifications are not needed. In label selection, for a given instance, the classifier selects the compound label with the most semantic information. This classifier already takes the correlation between labels into consideration. As a predictive model, the semantic channel does not change with the prior probability distribution (the source) of instances; it still works when the source changes. The classifier, however, varies with the source and can therefore overcome the class-imbalance problem. It is shown that an increase in the old population will change the classifier for the label "Old person" and has been driving the evolution of the semantic meaning of "Old". The CM iteration algorithm for classifying unseen instances is introduced.
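    The following toy sketch illustrates the kind of classification the abstract describes: each label has a truth (membership) function, a label's logical probability is its average truth value under the instance prior, and an instance receives the label with the most semantic information, here taken as log2 of the truth value over the logical probability. The ages, prior, and truth values are invented, and the formula is an assumption based on the abstract rather than the paper's exact definitions.

```python
# Toy illustration (assumed formulation): semantic information of label theta_j
# for instance x is log2( T(theta_j | x) / T(theta_j) ), where the logical
# probability T(theta_j) = sum_x P(x) * T(theta_j | x). Numbers are invented.
import numpy as np

ages = np.array([20, 40, 60, 80])            # instance space (age in years)
p_x = np.array([0.3, 0.3, 0.25, 0.15])       # prior (source) distribution

# Truth (membership) functions of two labels over the ages.
truth = {
    "Young person": np.array([1.0, 0.6, 0.1, 0.0]),
    "Old person":   np.array([0.0, 0.1, 0.6, 1.0]),
}

def semantic_information(label, x_index):
    t = truth[label]
    logical_prob = np.dot(p_x, t)            # T(theta_j): average truth value
    return np.log2(t[x_index] / logical_prob) if t[x_index] > 0 else -np.inf

# Classify each age by the label carrying the most semantic information.
for i, age in enumerate(ages):
    best = max(truth, key=lambda lab: semantic_information(lab, i))
    print(age, "->", best)

# The truth functions (the semantic channel) stay fixed, but the classification
# boundary shifts when the prior p_x changes, e.g. as the old population grows.
```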