
    Efficient Decision Trees for Multi-class Support Vector Machines Using Large Centroid Distance Grouping

    We propose a new technique for building multiclass classifiers from support vector machines (SVMs) arranged in a tree structure. At each tree node, we select an appropriate binary classifier using the data class centroids and the distances between them, partition the training classes into a positive and a negative group, and train a new classifier on that split. The proposed technique is fast to train and classifies data with a complexity between O(log₂ N) and O(N), where N is the number of classes. Ten-fold cross-validation experiments show that the performance of our methods is comparable to that of traditional techniques while requiring less decision time. The proposed technique is well suited to problems with a large number of classes because of its lower training time and computational complexity.
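
    A minimal sketch of the grouping-and-training loop the abstract describes, assuming scikit-learn's SVC; the function names, the RBF kernel choice, and the "two most distant centroids as group seeds" heuristic are illustrative assumptions, not details confirmed by the paper.

    ```python
    # Hypothetical sketch of centroid-distance grouping for a tree of binary SVMs.
    import numpy as np
    from sklearn.svm import SVC

    def split_by_centroids(X, y):
        """Split classes into two groups around the two most distant centroids."""
        classes = np.unique(y)
        centroids = {c: X[y == c].mean(axis=0) for c in classes}
        # Find the pair of classes whose centroids are farthest apart (assumed seeds).
        best_pair, best_dist = None, -1.0
        for i, a in enumerate(classes):
            for b in classes[i + 1:]:
                d = np.linalg.norm(centroids[a] - centroids[b])
                if d > best_dist:
                    best_pair, best_dist = (a, b), d
        pos_seed, neg_seed = best_pair
        # Assign every class to the group whose seed centroid is nearer.
        pos, neg = [], []
        for c in classes:
            d_pos = np.linalg.norm(centroids[c] - centroids[pos_seed])
            d_neg = np.linalg.norm(centroids[c] - centroids[neg_seed])
            (pos if d_pos <= d_neg else neg).append(c)
        return pos, neg

    def build_tree(X, y):
        """Recursively train one binary SVM per node until a single class remains."""
        classes = np.unique(y)
        if len(classes) == 1:
            return classes[0]  # leaf: a single output class
        pos, _ = split_by_centroids(X, y)
        labels = np.isin(y, pos).astype(int)
        clf = SVC(kernel="rbf").fit(X, labels)
        return {
            "clf": clf,
            1: build_tree(X[labels == 1], y[labels == 1]),
            0: build_tree(X[labels == 0], y[labels == 0]),
        }

    def predict(node, x):
        """Descend the tree; between O(log2 N) and O(N) SVM evaluations per sample."""
        while isinstance(node, dict):
            node = node[int(node["clf"].predict(x.reshape(1, -1))[0])]
        return node
    ```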

    Error Bounds for Piecewise Smooth and Switching Regression

    The paper deals with regression problems in which the nonsmooth target is assumed to switch between different operating modes. Specifically, piecewise smooth (PWS) regression considers target functions that switch deterministically via a partition of the input space, while switching regression considers arbitrary switching laws. The paper derives generalization error bounds in these two settings by following the approach based on Rademacher complexities. For PWS regression, our derivation involves a chaining argument and a decomposition of the covering numbers of PWS classes in terms of those of their component functions and the capacity of the classifier partitioning the input space. This yields error bounds with a radical (square-root) dependency on the number of modes. For switching regression, the decomposition can be performed directly at the level of the Rademacher complexities, which yields bounds with a linear dependency on the number of modes. By using chaining once more together with a decomposition at the level of covering numbers, we show how to recover a radical dependency. Examples of applications are given, in particular for PWS and switching regression with linear and kernel-based component functions. Comment: This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
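
    To make the contrast concrete, here is a schematic of the two bound shapes the abstract compares; n denotes the number of modes, m the sample size, and the capacity terms C are placeholders standing in for the paper's actual complexity quantities, not its exact bounds.

    ```latex
    % Schematic only: constants, capacity terms, and confidence terms omitted.
    % PWS regression (chaining + covering-number decomposition):
    \[
      L(f) - \hat{L}_m(f) \;\lesssim\; \sqrt{n}\,\frac{C_{\mathrm{PWS}}}{\sqrt{m}}
      \qquad \text{(radical dependency on the number of modes)}
    \]
    % Switching regression (decomposition at the Rademacher-complexity level):
    \[
      L(f) - \hat{L}_m(f) \;\lesssim\; n\,\frac{C_{\mathrm{sw}}}{\sqrt{m}}
      \qquad \text{(linear dependency, improvable to radical via chaining)}
    \]
    ```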