7 research outputs found

    Polyharmonic approximation on the sphere

    Full text link
    The purpose of this article is to provide new error estimates for a popular type of SBF approximation on the sphere: approximating by linear combinations of Green's functions of polyharmonic differential operators. We show that the $L_p$ approximation order for this kind of approximation is $\sigma$ for functions having $L_p$ smoothness $\sigma$ (for $\sigma$ up to the order of the underlying differential operator, just as in univariate spline theory). This is an improvement over previous error estimates, which penalized the approximation order when measuring error in $L_p$, $p > 2$, and held only in a restrictive setting when measuring error in $L_p$, $p < 2$. Comment: 16 pages; revised version; to appear in Constr. Approx.
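
    The kind of estimate described can be rendered schematically as follows. The notation here (mesh norm $h$, approximant $S_X f$, operator order $2m$, Sobolev norm on the right-hand side) is an assumed standard form, not quoted from the paper:

```latex
% Schematic form of the claimed error estimate (notation assumed):
% f has L_p smoothness \sigma, X \subset \mathbb{S}^d is a set of centers
% with mesh norm h, and S_X f is the SBF approximant built from Green's
% functions of a polyharmonic differential operator of order 2m.
\[
  \| f - S_X f \|_{L_p(\mathbb{S}^d)}
    \;\le\; C\, h^{\sigma}\, \| f \|_{W_p^{\sigma}(\mathbb{S}^d)},
  \qquad 0 < \sigma \le 2m, \quad 1 \le p \le \infty .
\]
```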

    A unifying framework for vector-valued manifold regularization and multi-view learning

    No full text
    This paper presents a general vector-valued reproducing kernel Hilbert space (RKHS) formulation for the problem of learning an unknown functional dependency between a structured input space and a structured output space in the semi-supervised learning setting. Our formulation includes as special cases Vector-valued Manifold Regularization and Multi-view Learning, and thus provides a unifying framework linking these two important learning approaches. In the case of the least squares loss function, we provide a closed-form solution with an efficient implementation. Numerical experiments on challenging multi-class categorization problems show that our multi-view learning formulation achieves results that are comparable with the state of the art and significantly better than single-view learning.
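
    To make the "closed-form solution" concrete, here is a minimal sketch of the scalar-valued special case (Laplacian-regularized least squares): the representer theorem reduces training to one linear solve. The function name, the exact scaling of the regularization constants, and the padding convention are illustrative assumptions, not the paper's vector-valued formulation:

```python
import numpy as np

def manifold_regularized_fit(K, y, L, lam_a, lam_i):
    """Closed-form expansion coefficients for scalar manifold
    regularization (sketch). Solves the representer-theorem system

        (J K + lam_a * n * I + lam_i * n * L K) c = y_padded

    where K is the (n x n) Gram matrix over labeled + unlabeled points,
    L is the graph Laplacian, and only the first l points are labeled.
    """
    n = K.shape[0]          # labeled + unlabeled points
    l = len(y)              # labeled points
    J = np.zeros((n, n))    # selects the labeled block of K
    J[:l, :l] = np.eye(l)
    y_padded = np.concatenate([y, np.zeros(n - l)])
    A = J @ K + lam_a * n * np.eye(n) + lam_i * n * (L @ K)
    return np.linalg.solve(A, y_padded)
```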

    Log-Hilbert-Schmidt metric between positive definite operators on Hilbert spaces

    No full text
    This paper introduces a novel mathematical and computational framework, namely the Log-Hilbert-Schmidt metric between positive definite operators on a Hilbert space. This is a generalization of the Log-Euclidean metric on the Riemannian manifold of positive definite matrices to the infinite-dimensional setting. The general framework is applied in particular to compute distances between covariance operators on a reproducing kernel Hilbert space (RKHS), for which we obtain explicit formulas via the corresponding Gram matrices. Empirically, we apply our formulation to the task of multi-category image classification, where each image is represented by an infinite-dimensional RKHS covariance operator. On several challenging datasets, our method significantly outperforms approaches based on covariance matrices computed directly on the original input features, including those using the Log-Euclidean metric and the Stein and Jeffreys divergences, achieving new state-of-the-art results.
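
    The finite-dimensional ancestor that the paper generalizes is the Log-Euclidean distance between symmetric positive definite (SPD) matrices. A minimal sketch of that matrix case (not the paper's infinite-dimensional operator formulas):

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_distance(A, B):
    """Log-Euclidean distance between SPD matrices:
    d(A, B) = || log(A) - log(B) ||_F.
    """
    # logm can return a negligible imaginary part for numerically
    # SPD inputs; discard it before taking the Frobenius norm.
    return np.linalg.norm(np.real(logm(A)) - np.real(logm(B)), "fro")
```

    The appeal of this metric is that, unlike the affine-invariant Riemannian distance, it reduces to a Euclidean norm after a one-time matrix logarithm per input; the paper's contribution is making the analogous computation well defined and Gram-matrix-computable for RKHS covariance operators.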

    Kernel-based classification for brain connectivity graphs on the Riemannian manifold of positive definite matrices

    No full text
    An important task in connectomics studies is the classification of connectivity graphs obtained from healthy and pathological subjects. In this paper, we propose a mathematical framework based on Riemannian geometry and kernel methods that can be applied to connectivity matrices for the classification task. We tested our approach on different real datasets of functional and structural connectivity, evaluating different metrics to describe the similarity between graphs. The empirical results clearly show the superior performance of our approach compared with baseline methods, demonstrating the advantages of our manifold framework and its potential for other applications.
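
    A minimal sketch of the generic pipeline such a framework suggests: map each SPD connectivity matrix to the tangent space at the identity via the matrix logarithm (the Log-Euclidean approach), then run a standard kernel classifier on the resulting Euclidean features. The metric choice and every name below are illustrative assumptions, not the paper's exact protocol:

```python
import numpy as np
from sklearn.svm import SVC

def spd_log(A):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.T

def log_features(mats):
    """Map SPD connectivity matrices to Euclidean feature vectors
    (tangent space at the identity under the Log-Euclidean metric)."""
    return np.stack([spd_log(A).ravel() for A in mats])

# Illustrative usage: train_mats / test_mats are lists of SPD
# connectivity matrices; labels mark healthy vs. pathological subjects.
# clf = SVC(kernel="rbf").fit(log_features(train_mats), labels)
# preds = clf.predict(log_features(test_mats))
```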