
    Log-Hilbert-Schmidt metric between positive definite operators on Hilbert spaces

    Abstract: This paper introduces a novel mathematical and computational framework, namely the Log-Hilbert-Schmidt metric between positive definite operators on a Hilbert space. This generalizes the Log-Euclidean metric on the Riemannian manifold of positive definite matrices to the infinite-dimensional setting. The general framework is applied in particular to compute distances between covariance operators on a Reproducing Kernel Hilbert Space (RKHS), for which we obtain explicit formulas via the corresponding Gram matrices. Empirically, we apply our formulation to the task of multi-category image classification, where each image is represented by an infinite-dimensional RKHS covariance operator. On several challenging datasets, our method significantly outperforms approaches based on covariance matrices computed directly on the original input features, including those using the Log-Euclidean metric and the Stein and Jeffreys divergences, achieving new state-of-the-art results.
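To make the finite-dimensional starting point concrete: the Log-Euclidean metric that this paper generalizes measures the distance between symmetric positive definite (SPD) matrices as the Frobenius norm of the difference of their matrix logarithms. A minimal sketch, assuming NumPy (the helper names here are illustrative, not from the paper):

```python
import numpy as np

def spd_logm(A):
    # Matrix logarithm of an SPD matrix via eigendecomposition:
    # A = V diag(w) V^T  =>  log A = V diag(log w) V^T.
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.T

def log_euclidean_distance(A, B):
    # d_LE(A, B) = || log A - log B ||_F
    return np.linalg.norm(spd_logm(A) - spd_logm(B), ord="fro")

# Example: two random SPD matrices
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))
Y = rng.standard_normal((4, 4))
A = X @ X.T + 4 * np.eye(4)
B = Y @ Y.T + 4 * np.eye(4)
d = log_euclidean_distance(A, B)
```

The infinite-dimensional Log-Hilbert-Schmidt version replaces the Frobenius norm with the Hilbert-Schmidt norm and, for RKHS covariance operators, is evaluated through Gram matrices rather than explicit operators.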

    Non positively curved metric in the space of positive definite infinite matrices

    We introduce a Riemannian metric with non-positive curvature on the (infinite-dimensional) manifold Σ∞ of positive invertible operators of a Hilbert space H which are scalar perturbations of Hilbert-Schmidt operators. The (minimal) geodesics and the geodesic distance are computed. It is shown that this metric, which is complete, generalizes the well-known non-positive metric for positive definite complex matrices. Moreover, these spaces of finite matrices are naturally embedded in Σ∞.
    Fil: Andruchow, Esteban. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Saavedra 15. Instituto Argentino de Matemática Alberto Calderón; Argentina. Universidad Nacional de General Sarmiento. Instituto de Ciencias; Argentina.
    Fil: Varela, Alejandro. Universidad Nacional de General Sarmiento. Instituto de Ciencias; Argentina. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Saavedra 15. Instituto Argentino de Matemática Alberto Calderón; Argentina.
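In the finite-dimensional case the non-positively curved metric mentioned above is the affine-invariant (trace) metric, whose geodesics and geodesic distance have closed forms. A sketch of those formulas for SPD matrices, assuming NumPy (names are ours; the paper works with operators, not matrices):

```python
import numpy as np

def spd_power(A, t):
    # A^t for an SPD matrix A, via eigendecomposition.
    w, V = np.linalg.eigh(A)
    return (V * w**t) @ V.T

def geodesic_distance(A, B):
    # d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F  (affine-invariant metric)
    Ainv_half = spd_power(A, -0.5)
    w = np.linalg.eigvalsh(Ainv_half @ B @ Ainv_half)
    return np.linalg.norm(np.log(w))

def geodesic(A, B, t):
    # Minimal geodesic gamma(t) = A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2},
    # with gamma(0) = A and gamma(1) = B.
    A_half = spd_power(A, 0.5)
    Ainv_half = spd_power(A, -0.5)
    return A_half @ spd_power(Ainv_half @ B @ Ainv_half, t) @ A_half

# Example: two random SPD matrices and their geodesic midpoint
rng = np.random.default_rng(1)
X = rng.standard_normal((3, 3))
Y = rng.standard_normal((3, 3))
A = X @ X.T + 3 * np.eye(3)
B = Y @ Y.T + 3 * np.eye(3)
mid = geodesic(A, B, 0.5)
```

The midpoint `geodesic(A, B, 0.5)` is the matrix geometric mean of A and B, and lies at equal distance from both endpoints.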

    Learning Sets with Separating Kernels

    We consider the problem of learning a set from random samples. We show how relevant geometric and topological properties of a set can be studied analytically using concepts from the theory of reproducing kernel Hilbert spaces. A new kind of reproducing kernel, which we call a separating kernel, plays a crucial role in our study and is analyzed in detail. We prove a new analytic characterization of the support of a distribution that naturally leads to a family of provably consistent regularized learning algorithms, and we discuss the stability of these methods with respect to random sampling. Numerical experiments show that the approach is competitive with, and often better than, other state-of-the-art techniques. Comment: final version

    The least squares mean of positive Hilbert-Schmidt operators

    We show that the least squares mean on the Riemannian manifold σ of positive operators in the extended Hilbert-Schmidt algebra of linear operators on a Hilbert space, equipped with the canonical trace metric, is the unique solution of the corresponding Karcher equation. This allows us to conclude that the least squares mean is the restriction of the Karcher mean on the open cone of all bounded positive definite operators, and hence inherits the basic properties of that mean. Conversely, the Karcher mean on the positive definite operators is shown to be the unique monotonically strongly continuous extension of the least squares mean on σ.
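In the finite-dimensional case, the Karcher equation referenced above reads Σᵢ log(X^{-1/2} Aᵢ X^{-1/2}) = 0, and the least squares (Karcher) mean can be approximated by a damped fixed-point iteration. A minimal sketch for SPD matrices, assuming NumPy; the iteration scheme and step size are standard choices, not taken from this paper:

```python
import numpy as np

def sym_fun(S, f):
    # Apply a scalar function f to the eigenvalues of a symmetric matrix S.
    w, V = np.linalg.eigh(S)
    return (V * f(w)) @ V.T

def karcher_mean(mats, steps=200, step_size=0.5):
    # Damped fixed-point iteration for the least squares (Karcher) mean:
    # the mean X solves  sum_i log(X^{-1/2} A_i X^{-1/2}) = 0.
    X = sum(mats) / len(mats)  # start from the arithmetic mean
    for _ in range(steps):
        X_half = sym_fun(X, np.sqrt)
        X_inv_half = sym_fun(X, lambda w: 1.0 / np.sqrt(w))
        G = sum(sym_fun(X_inv_half @ A @ X_inv_half, np.log)
                for A in mats) / len(mats)
        X = X_half @ sym_fun(G, lambda w: np.exp(step_size * w)) @ X_half
    return X

# Example: Karcher mean of three random SPD matrices
rng = np.random.default_rng(0)
mats = []
for _ in range(3):
    Z = rng.standard_normal((3, 3))
    mats.append(Z @ Z.T + 3 * np.eye(3))
M = karcher_mean(mats)
```

For two matrices the Karcher mean reduces to the matrix geometric mean A^{1/2}(A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}, which provides a convenient correctness check.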