3 research outputs found

    Boosting en el modelo de aprendizaje PAC

    A review of the idea of Boosting in the PAC learning model is presented, together with a review of the first practical Boosting method, adaptive boosting (AdaBoost), giving details of the theoretical guarantees on error convergence and exploring the important concept of margin.
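The reweighting scheme at the heart of AdaBoost, as summarised in the abstract above, can be sketched in a few lines. The sketch below uses 1-D threshold stumps as weak learners; the function names and the exhaustive stump search are illustrative choices, not part of the reviewed work.

```python
import numpy as np

def adaboost_stumps(X, y, n_rounds=10):
    """Train AdaBoost on 1-D data with threshold stumps; labels y in {-1, +1}."""
    n = len(X)
    w = np.full(n, 1.0 / n)                      # D_t: distribution over samples
    ensemble = []                                # (threshold, polarity, alpha)
    for _ in range(n_rounds):
        best = None
        for thr in np.unique(X):                 # exhaustive weak-learner search
            for pol in (1, -1):
                pred = pol * np.where(X <= thr, 1, -1)
                err = w[pred != y].sum()         # weighted training error
                if best is None or err < best[0]:
                    best = (err, thr, pol, pred)
        err, thr, pol, pred = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)    # weak learner's vote weight
        w *= np.exp(-alpha * y * pred)           # up-weight the mistakes
        w /= w.sum()                             # renormalise the distribution
        ensemble.append((thr, pol, alpha))
    return ensemble

def adaboost_predict(ensemble, X):
    """Sign of the alpha-weighted vote of all weak learners."""
    votes = sum(a * p * np.where(X <= t, 1, -1) for t, p, a in ensemble)
    return np.sign(votes)
```

The classical guarantee reviewed in the paper is that the weighted error of each round being bounded away from 1/2 forces the training error of the combined vote to drop exponentially in the number of rounds.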

    Maximum Margin Decision Surfaces for Increased Generalisation in Evolutionary Decision Tree Learning

    No full text
    Abstract: Decision tree learning is one of the most widely used and practical methods for inductive inference. We present a novel method that increases the generalisation of genetically-induced classification trees, which employ linear discriminants as the partitioning function at each internal node. Genetic Programming is employed to search the space of oblique decision trees. At the end of the evolutionary run, a (1+1) Evolution Strategy is used to geometrically optimise the boundaries in the decision space, which are represented by the linear discriminant functions. The evolutionary optimisation concerns maximising the decision-surface margin, defined to be the smallest distance between the decision surface and any of the samples. Initial empirical results of the application of our method to a series of datasets from the UCI repository suggest that model generalisation benefits from the margin maximisation, and that the new method is a very competent approach to pattern classification as compared to other learning algorithms.
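The margin objective described in this abstract — the smallest distance between a linear decision surface and any sample — and its optimisation by a (1+1) Evolution Strategy can be sketched as follows. The step size, acceptance rule, and function names here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def geometric_margin(w, b, X):
    """Smallest distance from the hyperplane w.x + b = 0 to any sample in X."""
    return np.abs(X @ w + b).min() / np.linalg.norm(w)

def one_plus_one_es(w, b, X, y, sigma=0.1, iters=200, seed=0):
    """(1+1)-ES: mutate (w, b) with Gaussian noise; accept the child only if it
    still classifies every sample correctly and does not shrink the margin."""
    rng = np.random.default_rng(seed)
    best = geometric_margin(w, b, X)
    for _ in range(iters):
        w2 = w + sigma * rng.standard_normal(w.shape)
        b2 = b + sigma * rng.standard_normal()
        if np.all(y * (X @ w2 + b2) > 0):        # child still separates the data
            m2 = geometric_margin(w2, b2, X)
            if m2 >= best:                       # elitist: keep the better margin
                w, b, best = w2, b2, m2
    return w, b, best
```

Because the parent is only ever replaced by an equal-or-better child, the margin is monotonically non-decreasing over the run, which is the sense in which the strategy "geometrically optimises" each boundary after the Genetic Programming phase.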

    Maximum margin decision surfaces for increased generalisation in evolutionary decision tree learning

    No full text
    Paper presented at the 14th European Conference, EuroGP 2011, Torino, Italy, April 27-29, 2011. Funded by Science Foundation Ireland.