    Parametric Local Metric Learning for Nearest Neighbor Classification

    We study the problem of learning local metrics for nearest neighbor classification. Most previous work on local metric learning learns a number of unrelated local metrics. While this "independence" approach delivers increased flexibility, its downside is a considerable risk of overfitting. We present a new parametric local metric learning method in which we learn a smooth metric matrix function over the data manifold. Using an approximation error bound for the metric matrix function, we learn local metrics as linear combinations of basis metrics defined on anchor points over different regions of the instance space. We constrain the metric matrix function by imposing manifold regularization on the linear combinations, which makes the learned metric matrix function vary smoothly along the geodesics of the data manifold. Our metric learning method has excellent performance in terms of both predictive power and scalability. We experimented with several large-scale classification problems with tens of thousands of instances, and compared our method with several state-of-the-art metric learning methods, both global and local, as well as with SVM with automatic kernel selection, all of which it outperforms significantly.
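    As a rough illustration of the parametric form described above, the sketch below evaluates a local metric M(x) as a convex combination of PSD basis metrics attached to anchor points. It is a minimal sketch only: the softmax weighting, the `tau` bandwidth, and the random anchors are illustrative assumptions, whereas the paper learns the combination weights subject to manifold regularization.

```python
import numpy as np

def local_metric(x, anchors, basis_metrics, tau=1.0):
    # Smoothly varying metric M(x) = sum_i w_i(x) * M_i: a convex
    # combination of PSD basis metrics attached to anchor points.
    # The softmax weights are an illustrative stand-in for the learned,
    # manifold-regularized weights of the paper.
    d2 = np.sum((anchors - x) ** 2, axis=1)  # squared distances to anchors
    w = np.exp(-d2 / tau)
    w /= w.sum()                             # nonnegative weights summing to 1
    return np.einsum('i,ijk->jk', w, basis_metrics)

def local_mahalanobis2(x, y, anchors, basis_metrics, tau=1.0):
    # Squared distance from x to y under the local metric at x,
    # the dissimilarity used for nearest neighbor classification.
    M = local_metric(x, anchors, basis_metrics, tau)
    d = x - y
    return float(d @ M @ d)

# Toy usage: 4 anchors in 2-D, each with a random PSD basis metric.
rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 2))
A = rng.normal(size=(4, 2, 2))
basis_metrics = np.einsum('ijk,ilk->ijl', A, A)  # each A_i @ A_i.T is PSD
print(local_mahalanobis2(np.zeros(2), np.ones(2), anchors, basis_metrics))
```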

    Quantitative results on continuity of the spectral factorization mapping

    The spectral factorization mapping $F \to F^+$ puts a positive definite integrable matrix function $F$ with an integrable logarithm of the determinant in correspondence with an outer analytic matrix function $F^+$ such that $F = F^+(F^+)^*$ almost everywhere. The main question addressed here is to what extent $\|F^+ - G^+\|_{H_2}$ is controlled by $\|F - G\|_{L_1}$ and $\|\log\det F - \log\det G\|_{L_1}$.

    Comment: 22 pages
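    For intuition, the scalar case of this factorization can be computed numerically by the classical cepstral (Kolmogorov) construction: take the Fourier coefficients of log F, keep the analytic half (splitting the zero-frequency term), and exponentiate. The sketch below is a minimal NumPy illustration of that scalar case for a smooth density; it is not the matrix setting or the continuity bounds studied in the paper.

```python
import numpy as np

def scalar_spectral_factor(F):
    # Scalar spectral factorization via the cepstral (Kolmogorov) method.
    # Input: samples F[j] = F(exp(i*theta_j)) > 0 on a uniform grid of the
    # unit circle. Output: samples of the outer factor F+ with |F+|^2 = F.
    n = len(F)
    c = np.fft.fft(np.log(F)) / n          # Fourier coefficients of log F
    c_plus = np.zeros(n, dtype=complex)    # analytic (one-sided) part of log F
    c_plus[0] = c[0] / 2                   # split the mean between F+ and (F+)*
    c_plus[1:n // 2] = c[1:n // 2]         # keep positive frequencies
    c_plus[n // 2] = c[n // 2] / 2         # split the Nyquist bin as well
    log_F_plus = np.fft.ifft(c_plus) * n   # back to samples of log F+
    return np.exp(log_F_plus)

# Quick check on a smooth positive density: |F+|^2 recovers F on the grid.
theta = 2 * np.pi * np.arange(512) / 512
F = 2.0 + np.cos(theta)
F_plus = scalar_spectral_factor(F)
print(np.max(np.abs(np.abs(F_plus) ** 2 - F)))  # ~ machine precision
```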