
    Integrative Multi-View Reduced-Rank Regression: Bridging Group-Sparse and Low-Rank Models

    Multi-view data have been routinely collected in various fields of science and engineering. A general problem is to study the predictive association between multivariate responses and multi-view predictor sets, all of which can be of high dimensionality. It is likely that only a few views are relevant to prediction, and the predictors within each relevant view contribute to the prediction collectively rather than sparsely. We cast this new problem under the familiar multivariate regression framework and propose an integrative reduced-rank regression (iRRR), where each view has its own low-rank coefficient matrix. As such, latent features are extracted from each view in a supervised fashion. For model estimation, we develop a convex composite nuclear norm penalization approach, which admits an efficient algorithm via the alternating direction method of multipliers. Extensions to non-Gaussian and incomplete data are discussed. Theoretically, we derive non-asymptotic oracle bounds of iRRR under a restricted eigenvalue condition. Our results recover oracle bounds of several special cases of iRRR, including Lasso, group Lasso and nuclear norm penalized regression. Therefore, iRRR seamlessly bridges group-sparse and low-rank methods and can achieve substantially faster convergence rates under realistic settings of multi-view learning. Simulation studies and an application in the Longitudinal Studies of Aging further showcase the efficacy of the proposed methods.
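The composite nuclear norm penalty described in the abstract can be illustrated with a small sketch. The paper's estimation uses ADMM; the simpler proximal-gradient loop below targets the same objective, minimizing the squared-error loss plus a nuclear norm penalty on each view's coefficient block. All function names, defaults, and the step-size choice here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def irrr_prox_grad(Y, X_views, lam=0.1, step=None, n_iter=300):
    """Sketch: min_B  0.5 * ||Y - sum_k X_k B_k||_F^2 + lam * sum_k ||B_k||_*.

    Each view k gets its own coefficient matrix B_k, shrunk toward low rank
    by per-view singular value thresholding.
    """
    X_all = np.hstack(X_views)
    if step is None:
        # 1 / Lipschitz constant of the full gradient (spectral norm squared).
        step = 1.0 / (np.linalg.norm(X_all, 2) ** 2)
    Bs = [np.zeros((Xk.shape[1], Y.shape[1])) for Xk in X_views]
    for _ in range(n_iter):
        resid = Y - sum(Xk @ Bk for Xk, Bk in zip(X_views, Bs))
        # Full gradient step on every block, then per-view SVT proximal step.
        Bs = [svt(Bk + step * Xk.T @ resid, step * lam)
              for Xk, Bk in zip(X_views, Bs)]
    return Bs
```

When the penalty weight is large enough, irrelevant views are shrunk entirely to zero, which is the view-selection behavior the abstract describes.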

    Decomposition into Low-rank plus Additive Matrices for Background/Foreground Separation: A Review for a Comparative Evaluation with a Large-Scale Dataset

    Recent research on problem formulations based on decomposition into low-rank plus sparse matrices provides a suitable framework for separating moving objects from the background. The most representative formulation is Robust Principal Component Analysis (RPCA) solved via Principal Component Pursuit (PCP), which decomposes a data matrix into a low-rank matrix and a sparse matrix. However, similar robust implicit or explicit decompositions arise in the following problem formulations: Robust Non-negative Matrix Factorization (RNMF), Robust Matrix Completion (RMC), Robust Subspace Recovery (RSR), Robust Subspace Tracking (RST) and Robust Low-Rank Minimization (RLRM). The main goal of these formulations is to obtain, explicitly or implicitly, a decomposition into a low-rank matrix plus additive matrices. In this context, this work aims to initiate a rigorous and comprehensive review of the similar problem formulations in robust subspace learning and tracking based on decomposition into low-rank plus additive matrices, for testing and ranking existing algorithms for background/foreground separation. We first provide a preliminary review of recent developments across the different problem formulations, which allows us to define a unified view that we call Decomposition into Low-rank plus Additive Matrices (DLAM). We then carefully examine each method in each robust subspace learning/tracking framework, with its decomposition, loss function, optimization problem and solvers. Furthermore, we investigate whether incremental algorithms and real-time implementations can be achieved for background/foreground separation. Finally, experimental results on a large-scale dataset called Background Models Challenge (BMC 2012) show the comparative performance of 32 different robust subspace learning/tracking methods.
    Comment: 121 pages, 5 figures, submitted to Computer Science Review.
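The RPCA-via-PCP decomposition mentioned above splits a data matrix M into a low-rank background L and a sparse foreground S by solving min ||L||_* + λ||S||_1 subject to L + S = M. A minimal augmented-Lagrangian (ADMM-style) sketch is below; the λ and μ defaults follow common heuristics from the RPCA literature, and names like `rpca_pcp` are illustrative, not any surveyed method's reference code:

```python
import numpy as np

def soft(M, tau):
    # Elementwise soft thresholding: proximal operator of tau * L1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def svt(M, tau):
    # Singular value thresholding: proximal operator of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rpca_pcp(M, lam=None, mu=None, n_iter=300):
    """Sketch of PCP: min ||L||_* + lam * ||S||_1  s.t.  L + S = M."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))      # common default in the literature
    if mu is None:
        mu = m * n / (4.0 * np.abs(M).sum())  # common step heuristic
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                     # dual variable for L + S = M
    for _ in range(n_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)
        S = soft(M - L + Y / mu, lam / mu)
        Y = Y + mu * (M - L - S)
    return L, S
```

For video, each column of M would be a vectorized frame: L captures the static background, S the moving foreground.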

    Integrative Multivariate Learning via Composite Low-Rank Decompositions

    We develop novel composite low-rank methods to achieve integrative learning in multivariate linear regression, where both the multivariate responses and predictors can be of high dimensionality and in different data forms. We first consider a regression with multi-view feature sets, where only a few views are relevant to prediction and the predictors within each relevant view contribute to the prediction collectively rather than sparsely. To tackle this problem, we propose an integrative reduced-rank regression (iRRR), where each view has its own low-rank coefficient matrix, to conduct view selection and within-view latent feature extraction in a supervised fashion. In addition, to assess the significance of each view in the iRRR model, we propose a scaled approach for model estimation and develop a hypothesis testing procedure through de-biasing. Next, to facilitate integrative multi-view learning with grouped sub-compositional predictors, we incorporate the view-specific low-rank structure into a newly proposed multivariate log-contrast model to enable sub-composition selection and latent principal compositional factor extraction. Finally, we propose a nested reduced-rank regression (NRRR) approach to relate multivariate functional responses and predictors. The nested low-rank structure is imposed on the functional regression surfaces to simultaneously identify latent principal functional responses/predictors and control the complexity and smoothness of the association between them. Efficient computational algorithms are developed for these methods, and their theoretical properties are investigated. We apply the proposed methods to multiple applications, including the longitudinal study of aging, the preterm infant study and electricity demand prediction.
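The building block shared by these composite methods is classical reduced-rank regression: fit ordinary least squares, then project the coefficient matrix onto the top singular directions of the fitted values. A minimal sketch of that classical estimator (not the paper's iRRR or NRRR procedures, whose structures are more elaborate) is:

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Classical rank-constrained least squares: min ||Y - X B||_F s.t. rank(B) <= r."""
    # Step 1: unconstrained OLS solution.
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    # Step 2: top-r right singular directions of the fitted values X @ B_ols.
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    V = Vt[:rank].T
    # Step 3: project OLS coefficients onto those response-space directions.
    return B_ols @ V @ V.T   # coefficient matrix of rank at most r
```

The low-rank factorization B = (B V) V' is what yields the supervised latent factors: columns of V define a few composite responses through which all predictors act.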