
    Qualitative Robustness of Support Vector Machines

    Support vector machines have attracted much attention in theoretical and applied statistics. Main topics of recent interest are consistency, learning rates, and robustness. In this article, it is shown that support vector machines are qualitatively robust. Since support vector machines can be represented by a functional on the set of all probability measures, qualitative robustness is proven by showing that this functional is continuous with respect to the topology generated by weak convergence of probability measures. Combined with the existence and uniqueness of support vector machines, our results show that support vector machines are the solutions of a well-posed mathematical problem in Hadamard's sense.
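    The following is a minimal numerical sketch of what qualitative robustness means in practice, not taken from the article: the sampling distribution (a two-class Gaussian mixture), the contamination model, and the SVM hyperparameters are all assumptions for illustration. Refitting a kernel SVM on a sample drawn from a slightly perturbed distribution should change the learned decision function only slightly.

```python
# Illustrative sketch (assumed data model, not the article's setting): probe the
# stability of an SVM decision function under a small contamination of the
# sampling distribution, using scikit-learn's SVC as a stand-in.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def sample(n, eps=0.0):
    """Draw n points from a two-class Gaussian mixture; with probability eps,
    replace a point by a far-away outlier (a simple contamination model)."""
    y = rng.integers(0, 2, size=n)
    X = rng.normal(loc=np.where(y[:, None] == 1, 1.5, -1.5), scale=1.0, size=(n, 2))
    mask = rng.random(n) < eps
    X[mask] = rng.normal(loc=8.0, scale=0.5, size=(mask.sum(), 2))  # outliers
    return X, y

X0, y0 = sample(2000, eps=0.0)    # sample from the nominal distribution
X1, y1 = sample(2000, eps=0.02)   # sample from a slightly perturbed distribution

clf0 = SVC(kernel="rbf", C=1.0, gamma=0.5).fit(X0, y0)
clf1 = SVC(kernel="rbf", C=1.0, gamma=0.5).fit(X1, y1)

# Compare the two decision functions on a grid: a small sup-distance is the kind
# of insensitivity to small distributional perturbations discussed above.
grid = np.stack(np.meshgrid(np.linspace(-4, 4, 50),
                            np.linspace(-4, 4, 50)), -1).reshape(-1, 2)
diff = np.abs(clf0.decision_function(grid) - clf1.decision_function(grid))
print("sup |f0 - f1| on grid:", diff.max())
```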

    A robust morphological classification of high-redshift galaxies using support vector machines on seeing-limited images. I. Method description

    We present a new non-parametric method to quantify the morphologies of galaxies based on a particular family of learning machines called support vector machines. The method, which can be seen as a generalization of the classical CAS classification but with an unlimited number of dimensions and non-linear boundaries between decision regions, is fully automated and thus particularly well adapted to large cosmological surveys. The source code is available for download at http://www.lesia.obspm.fr/~huertas/galsvm.html. To test the method, we use a seeing-limited near-infrared ($K_s$ band, $2.16\,\mu m$) sample observed with WIRCam at CFHT at a median redshift of $z\sim0.8$. The machine is trained with a simulated sample built from a local visually classified sample from the SDSS, chosen in the high-redshift sample's rest-frame ($i$ band, $0.77\,\mu m$) and artificially redshifted to match the observing conditions. We use a 12-dimensional volume, including 5 morphological parameters and other characteristics of galaxies such as luminosity and redshift. We show that a qualitative separation into two main morphological types (late type and early type) can be obtained with an error lower than 20% up to the completeness limit of the sample ($K_{AB}\sim 22$), which is more than 2 times better than what would be obtained with a classical C/A classification on the same sample, and indeed comparable to space data. The method is optimized to solve a specific problem, offering an objective and automated estimate of errors that enables a straightforward comparison with other surveys. Comment: 11 pages, 7 figures, 3 tables. Submitted to A&A. High resolution images are available on request.
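    As an illustration of the classification step described above, here is a minimal sketch that trains a non-linear (RBF-kernel) SVM on a table of morphological parameters to separate early-type from late-type galaxies. The file names, feature set, and hyperparameters are assumptions for illustration; this is not the galSVM pipeline itself.

```python
# Minimal sketch (assumed inputs, not the galSVM code): a non-linear SVM trained
# on a vector of morphological parameters, in the spirit of the method above.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Hypothetical training table: rows are simulated galaxies with known (visual)
# morphology; columns are the measured parameters forming the classification
# volume (e.g. concentration, asymmetry, Gini, M20, magnitude, size, redshift).
X_train = np.load("train_parameters.npy")   # shape (n_galaxies, n_features), assumed file
y_train = np.load("train_labels.npy")       # 0 = early type, 1 = late type, assumed file

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))

# Cross-validated error on the simulated training set gives the kind of
# objective, automated error estimate the method advertises.
scores = cross_val_score(model, X_train, y_train, cv=5)
print("estimated classification error: %.1f%%" % (100 * (1 - scores.mean())))

model.fit(X_train, y_train)
# y_pred = model.predict(X_survey)  # then apply to the real seeing-limited sample
```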

    Qualitative Robustness in Bayesian Inference

    The practical implementation of Bayesian inference requires numerical approximation when closed-form expressions are not available. What types of accuracy (convergence) of the numerical approximations guarantee robustness, and what types do not? In particular, is the recursive application of Bayes' rule robust when subsequent data or posteriors are approximated? When the prior is the push-forward of a distribution by the map induced by the solution of a PDE, in which norm should that solution be approximated? Motivated by such questions, we investigate the sensitivity of the distribution of posterior distributions (i.e. posterior-distribution-valued random variables, randomized through the data) with respect to perturbations of the prior and data-generating distributions in the limit when the number of data points grows towards infinity.
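    As a concrete, hedged illustration of the recursive question raised above, the sketch below applies Bayes' rule sequentially to Bernoulli data with a Beta prior, once exactly (conjugate update) and once with the posterior re-approximated on a coarse grid after every step, so that the propagation of approximation error through the recursion can be observed. The toy model and grid size are assumptions, not the paper's setting.

```python
# Illustrative sketch (assumed toy model): exact vs. approximated recursive
# application of Bayes' rule for Bernoulli data with a Beta(1, 1) prior.
import numpy as np

rng = np.random.default_rng(1)
theta_true = 0.3
data = rng.random(500) < theta_true          # Bernoulli(theta_true) observations

# Exact conjugate recursion: Beta(a, b) -> Beta(a + x, b + 1 - x)
a, b = 1.0, 1.0
for x in data:
    a, b = a + x, b + (1 - x)

# Approximate recursion: the posterior density is held on a fixed coarse grid and
# renormalized after every update (a crude stand-in for a numerical scheme).
grid = np.linspace(1e-3, 1 - 1e-3, 50)
dx = grid[1] - grid[0]
dens = np.ones_like(grid)
for x in data:
    dens = dens * (grid if x else 1 - grid)  # multiply by the Bernoulli likelihood
    dens /= dens.sum() * dx                  # renormalize (the approximation step)

exact_mean = a / (a + b)
approx_mean = np.sum(grid * dens) * dx
print("exact posterior mean:  %.4f" % exact_mean)
print("approx posterior mean: %.4f" % approx_mean)
```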