6 research outputs found

    Regression on fixed-rank positive semidefinite matrices: a Riemannian approach

    The paper addresses the problem of learning a regression model parameterized by a fixed-rank positive semidefinite matrix. The focus is on the nonlinear nature of the search space and on scalability to high-dimensional problems. The mathematical developments rely on the theory of gradient descent algorithms adapted to the Riemannian geometry that underlies the set of fixed-rank positive semidefinite matrices. In contrast with previous contributions in the literature, no restrictions are imposed on the range space of the learned matrix. The resulting algorithms maintain linear complexity in the problem size and enjoy important invariance properties. We apply the proposed algorithms to the problem of learning a distance function parameterized by a positive semidefinite matrix, and observe good performance on classical benchmarks.
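
    As a concrete illustration of the fixed-rank PSD setting, the following minimal sketch runs plain gradient descent on a factorization W = G G^T, which keeps every iterate positive semidefinite and of rank at most p by construction. It is a flat-parameterization stand-in for the paper's Riemannian algorithms, not the authors' method; the squared-loss objective, the function name, and all hyperparameters are assumptions for illustration.

        import numpy as np

        def fit_fixed_rank_psd_metric(X, Y, targets, rank, lr=0.01, iters=500, seed=0):
            # Sketch (assumed objective): learn W = G @ G.T with rank(W) <= rank
            # so that squared distances (x - y)^T W (x - y) match the scalar targets.
            # Optimizing the factor G keeps W PSD and fixed-rank at every step.
            rng = np.random.default_rng(seed)
            n = X.shape[1]
            G = rng.standard_normal((n, rank)) / np.sqrt(n)
            for _ in range(iters):
                D = X - Y                     # difference vectors, shape (m, n)
                Z = D @ G                     # projected differences, shape (m, rank)
                pred = np.sum(Z * Z, axis=1)  # d_W(x, y) = ||G^T (x - y)||^2
                resid = pred - targets
                # Euclidean gradient of the mean squared loss w.r.t. G
                grad = 4.0 * D.T @ (resid[:, None] * Z) / len(resid)
                G -= lr * grad
            return G @ G.T                    # the learned PSD matrix W

    Each iteration costs O(m n p) for m training pairs, i.e. linear in the dimension n, consistent with the scalability claim. The paper's Riemannian machinery additionally accounts for the non-uniqueness of the factor (W is invariant under G -> G Q for orthogonal Q), which this plain gradient sketch ignores.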

    PETRELS: Parallel Subspace Estimation and Tracking by Recursive Least Squares from Partial Observations

    Many real-world data sets, such as images, videos, and internet traffic data, exhibit low-dimensional structure embedded in a high-dimensional ambient space. Reducing storage requirements and computational complexity is essential when the data dimension is high, so we consider the problem of reconstructing a data stream from a small subset of its entries, where the data are assumed to lie in a low-dimensional linear subspace, possibly corrupted by noise. We further consider tracking changes of the underlying subspace, with applications such as video denoising, network monitoring, and anomaly detection. The problem can be viewed as a sequential low-rank matrix completion problem in which the subspace is learned in an online fashion. The proposed algorithm, dubbed Parallel Estimation and Tracking by REcursive Least Squares (PETRELS), first identifies the underlying low-dimensional subspace via a recursive procedure that updates each row of the subspace matrix in parallel, discounting previous observations, and then reconstructs the missing entries via least-squares estimation if required. Numerical examples are provided for direction-of-arrival estimation and matrix completion, comparing PETRELS with state-of-the-art batch algorithms. Comment: submitted to IEEE Trans. Signal Processing; part of the results was reported at ICASSP 2012 and won the best student paper award.
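
    To make the row-wise recursion concrete, here is a minimal sketch of one PETRELS-style update, reconstructed from the abstract rather than taken from the authors' code: coefficients are fit on the observed entries only, and each observed row of the subspace matrix is then updated by discounted recursive least squares. The function name, the discount factor lam, and the initialization convention are assumptions.

        import numpy as np

        def petrels_step(U, R_inv, x, mask, lam=0.98):
            # One streaming update. U: current n x r subspace estimate;
            # R_inv: n stacked r x r inverse correlation matrices, one per row;
            # x: new data vector; mask: boolean flags for its observed entries.
            obs = np.flatnonzero(mask)
            U_obs = U[obs]
            # Coefficients from the observed entries only (least squares)
            a, *_ = np.linalg.lstsq(U_obs, x[obs], rcond=None)
            for m in obs:
                # Per-row RLS with exponential discounting of past data
                Rm = R_inv[m] / lam
                k = Rm @ a / (1.0 + a @ Rm @ a)       # RLS gain vector
                err = x[m] - U[m] @ a                 # innovation on entry m
                U[m] = U[m] + err * k                 # rows update independently
                R_inv[m] = Rm - np.outer(k, a @ Rm)   # Sherman-Morrison downdate
            x_hat = U @ a        # reconstruct the full vector, incl. missing entries
            return U, R_inv, a, x_hat

    A typical initialization (also an assumption) is a random orthonormal U and R_inv[m] = delta * np.eye(r) for a large delta. Because each row is touched independently, the inner loop parallelizes directly, which is the "parallel" in PETRELS.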

    Robust low-dimensional structure learning for big data and its applications

    Ph.D. thesis (Doctor of Philosophy)

    Online Tracking of Linear Subspaces
