60 research outputs found

    Rejoinder to "Support Vector Machines with Applications"

    Rejoinder to "Support Vector Machines with Applications" [math.ST/0612817]. Comment: Published at http://dx.doi.org/10.1214/088342306000000501 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).

    On the combination of kernels for support vector classifiers

    The problem of combining different sources of information arises in several situations, for instance, the classification of data with asymmetric similarity matrices or the construction of an optimal classifier from a collection of kernels. Often, each source of information can be expressed as a kernel (similarity) matrix and, therefore, a collection of kernels is available. In this paper we propose a new class of methods to produce a unique, optimal kernel for classification purposes. The constructed kernel is then used to train a Support Vector Machine (SVM). Two key ideas underlie the kernel construction: the quantification, relative to the classification labels, of the difference of information among the kernels; and the extension of the concept of a linear combination of kernels to a functional (matrix) combination of kernels. The proposed methods have been successfully evaluated and compared with other powerful classifiers and kernel combination techniques on a variety of artificial and real classification problems.
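As a rough illustration of the first idea, one can weight each kernel by how well it agrees with the class labels (a kernel-target alignment score) before forming a linear combination. The sketch below is a hypothetical simplification, not the paper's actual construction, which extends linear combinations to functional (matrix) combinations:

```python
import numpy as np

def alignment(K, y):
    """Kernel-target alignment: <K, y y^T> / (||K|| ||y y^T||)."""
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

def combine_kernels(kernels, y):
    """Weight each kernel by its (clipped) alignment with the labels and
    return the normalized linear combination plus the weights used."""
    w = np.array([max(alignment(K, y), 0.0) for K in kernels])
    w = w / w.sum() if w.sum() > 0 else np.full(len(kernels), 1.0 / len(kernels))
    return sum(wi * K for wi, K in zip(w, kernels)), w

# Example: a label-aligned kernel receives more weight than a generic one.
y = np.array([1, 1, -1, -1])
K_good = np.outer(y, y).astype(float)   # perfectly aligned with the labels
K_flat = np.eye(4)                      # carries no label information
K, w = combine_kernels([K_good, K_flat], y)
```

The combined matrix K could then be passed to any SVM solver that accepts a precomputed kernel.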

    An augmented Lagrangian interior-point method using directions of negative curvature

    We describe an efficient implementation of an interior-point algorithm for non-convex problems that uses directions of negative curvature. These directions should ensure convergence to second-order KKT points and improve the computational efficiency of the procedure. Some relevant aspects of the implementation are the strategy to combine a direction of negative curvature and a modified Newton direction, and the conditions to ensure feasibility of the iterates with respect to the simple bounds. The use of multivariate barrier and penalty parameters is also discussed, as well as the update rules for these parameters. Finally, numerical results on a set of test problems are presented.

    Academic quality measurement: A multivariate approach

    This paper applies a new quality measurement methodology to measure the quality of postgraduate courses. The methodology we propose is the Academic Quality Measurement (AQM). The model is applied to several simulated data sets where the true values of the model parameters are known. A nonparametric model, based on Nearest Neighbours combined with Restricted Least Squares methods, is developed in which students evaluate the overall academic programme quality and a set of dimensions or attributes that determine this quality. The database comes from a Spanish public university postgraduate programme. Among the most important conclusions, the methodology presented in this work has the following advantages: knowledge of the attribute weights allows the attributes to be ordered according to their relative importance to the student, revealing the key factors for improving quality; student weights can be related to student characteristics to produce a market segmentation directly linked to quality objectives; and the relative strengths and weaknesses of the service (higher education) can be determined by comparing the mean attribute values of the service with those of other institutions (benchmarking or SWOT analysis). Keywords: Quality Measurement, Postgraduate Programme, Nonparametric Model.
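A minimal sketch of the nearest-neighbours idea, under assumptions of our own: overall quality is modelled locally as a weighted sum of attribute scores, with the weights restricted to be non-negative and sum to one. The clipping-and-normalizing step is a crude stand-in for the paper's Restricted Least Squares fit, and all names here are hypothetical:

```python
import numpy as np

def local_attribute_weights(X, y, i, k=5):
    """For student i, fit overall quality y as a weighted sum of attribute
    scores X over the k nearest neighbours, then restrict the weights to be
    non-negative and sum to one (a stand-in for restricted least squares)."""
    dist = np.linalg.norm(X - X[i], axis=1)
    idx = np.argsort(dist)[:k]
    w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    w = np.clip(w, 0.0, None)
    return w / w.sum() if w.sum() > 0 else np.full(X.shape[1], 1.0 / X.shape[1])

# Simulated data with known weights, mirroring the paper's validation step.
rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(50, 2))   # two attribute scores per student
true_w = np.array([0.7, 0.3])
y = X @ true_w                         # overall quality, noise-free
w_hat = local_attribute_weights(X, y, 0, k=5)
```

On noise-free simulated data the local fit recovers the true weights exactly, which is the kind of check the abstract describes.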

    An augmented Lagrangian interior-point method using directions of negative curvature

    The original publication is available at www.springerlink.com. We describe an efficient implementation of an interior-point algorithm for non-convex problems that uses directions of negative curvature. These directions should ensure convergence to second-order KKT points and improve the computational efficiency of the procedure. Some relevant aspects of the implementation are the strategy to combine a direction of negative curvature and a modified Newton direction, and the conditions to ensure feasibility of the iterates with respect to the simple bounds. The use of multivariate barrier and penalty parameters is also discussed, as well as the update rules for these parameters. We analyze the convergence of the procedure; both the linesearch and the update rule for the barrier parameter behave appropriately. As the main goal of the paper is the practical usage of negative curvature, a set of numerical results on small test problems is presented. Based on these results, the relevance of using directions of negative curvature is discussed. Research supported by Spanish MEC grants TIC2000-1750-C06-04 and BEC2000-0167.
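To make the two ingredients concrete, the sketch below computes a modified Newton direction (eigenvalues shifted to be safely positive) and, when the Hessian is indefinite, a direction of negative curvature from its most negative eigenpair. This is only an eigendecomposition-based illustration under our own assumptions, not the authors' implementation:

```python
import numpy as np

def descent_pair(H, g, eps=1e-8):
    """For the quadratic model g'd + 0.5 d'H d, return a modified Newton
    direction and, if H is indefinite, a direction of negative curvature
    oriented so that it is a non-ascent direction."""
    lam, V = np.linalg.eigh(H)              # eigenvalues sorted ascending
    lam_mod = np.maximum(np.abs(lam), eps)  # force a positive definite model
    d_newton = -V @ ((V.T @ g) / lam_mod)
    d_curv = None
    if lam[0] < 0:                          # most negative eigenvalue
        d_curv = V[:, 0].copy()
        if g @ d_curv > 0:                  # orient for non-positive slope
            d_curv = -d_curv
    return d_newton, d_curv

H = np.diag([-1.0, 2.0])                    # indefinite Hessian
g = np.array([1.0, 1.0])
dn, dc = descent_pair(H, g)
```

A linesearch method can then step along dn, along dc, or along a curve combining both, which is the combination strategy the abstract refers to.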

    Combining search directions using gradient flows

    The original publication is available at www.springerlink.com. The efficient combination of directions is a significant problem in line search methods that either use negative curvature or wish to include additional information such as the gradient or different approximations to the Newton direction. In this paper we describe a new procedure to combine several of these directions within an interior-point primal-dual algorithm. Basically, we combine in an efficient manner a modified Newton direction with the gradient of a merit function and a direction of negative curvature, if it exists. We also show that the procedure is well-defined, and it has reasonable theoretical properties regarding the rate of convergence of the method. We also present numerical results from an implementation of the proposed algorithm on a set of small test problems from the CUTE collection. Research supported by Spanish MEC grants BEC2000-0167 and PB98-0728.
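A toy version of the combination problem: blend a modified Newton direction with the steepest-descent direction -g (and optionally a negative-curvature direction), backing off toward -g until the result is a descent direction. This safeguard is a hypothetical simplification, not the gradient-flow scheme the paper develops:

```python
import numpy as np

def combine_directions(g, d_newton, d_curv=None, theta=0.9):
    """Blend d_newton with -g, plus an optional negative-curvature direction
    assumed pre-oriented so that g @ d_curv <= 0.  Halve theta until the
    combined direction d satisfies the descent condition g @ d < 0; as
    theta -> 0, d -> -g + d_curv, whose slope is at most -||g||^2 < 0."""
    extra = d_curv if d_curv is not None else np.zeros_like(g)
    d = theta * d_newton + (1.0 - theta) * (-g) + extra
    while g @ d >= 0:
        theta *= 0.5
        d = theta * d_newton + (1.0 - theta) * (-g) + extra
    return d

g = np.array([1.0, 0.0])
d_newton = np.array([0.0, 1.0])   # orthogonal to g: not by itself a descent direction
d = combine_directions(g, d_newton)
```

The actual paper combines the directions through a gradient-flow construction with convergence-rate guarantees; the loop above only illustrates why some blending rule is needed at all.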

    A note on the use of vector barrier parameters for interior-point methods

    A key feature to ensure desirable convergence properties in an interior-point method is the appropriate choice of an updating rule for the barrier parameter. In this work we analyze and describe updating rules based on the use of a vector of barrier parameters. We show that these updating rules are well defined and satisfy sufficient conditions to ensure convergence to the correct limit points. We also present some numerical results that illustrate the improved performance of these strategies compared to the use of a scalar barrier parameter.
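One plausible form such a rule could take, sketched under our own assumptions (this is not the updating rule of the paper): let each component of the barrier vector track its own complementarity product x_i * z_i, damped by a centering factor and floored away from zero:

```python
import numpy as np

def update_barrier_vector(x, z, sigma=0.1, floor=1e-12):
    """Hypothetical per-component update: mu_i follows its own
    complementarity product x_i * z_i, scaled by a centering factor sigma
    and kept positive by a small floor."""
    return np.maximum(sigma * x * z, floor)

x = np.array([1.0, 2.0])   # primal variables
z = np.array([3.0, 0.0])   # dual slacks; the second pair is already complementary
mu = update_barrier_vector(x, z)
```

The point of a vector of parameters is visible even in this toy: the already-complementary pair gets a tiny mu_i while the other keeps a larger one, instead of a single scalar forcing both to the same target.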