
Confidence Region of Singular Subspaces for Low-rank Matrix Regression

By Dong Xia

Abstract

Low-rank matrix regression refers to recovering a low-rank matrix from specially designed measurements and the corresponding noisy outcomes. Over the last decade, numerous statistical methodologies have been developed for efficiently recovering unknown low-rank matrices. In some applications, however, the unknown singular subspace is scientifically more important than the low-rank matrix itself. In this article, we revisit the low-rank matrix regression model and introduce a two-step procedure for constructing confidence regions of the singular subspace. The procedure de-biases a typical low-rank estimator and then computes the empirical singular vectors. We investigate the distribution of the joint projection distance between the empirical singular subspace and the unknown true singular subspace. In particular, we prove the asymptotic normality of the joint projection distance, with data-dependent centering and normalization, when $r^{3/2}(m_1+m_2)^{3/2}=o(n/\log n)$, where $m_1, m_2$ denote the matrix row and column sizes, $r$ is the rank, and $n$ is the number of independent random measurements. Consequently, we propose data-dependent confidence regions of the true singular subspace that attain any pre-determined confidence level asymptotically. In addition, non-asymptotic convergence rates are established. Numerical results are presented to demonstrate the merits of our methods.

Comment: typos are corrected and motivating examples are added
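The two-step construction summarized in the abstract (de-bias a low-rank estimate, then take its leading singular vectors and measure the joint projection distance) can be sketched in a few lines of numpy. The sketch below is illustrative only, not the paper's exact procedure: the de-biasing correction shown assumes isotropic (e.g., Gaussian) measurement matrices in the trace-regression model $y_i = \langle X_i, M \rangle + \xi_i$, and the function names are hypothetical.

```python
import numpy as np

def debias(M_hat, X, y):
    """One-step de-biasing of a low-rank estimate M_hat under the trace-regression
    model y_i = <X_i, M> + noise, assuming isotropic measurement matrices X_i."""
    n = len(y)
    residuals = y - np.einsum('nij,ij->n', X, M_hat)        # y_i - <X_i, M_hat>
    correction = np.einsum('n,nij->ij', residuals, X) / n   # (1/n) * sum_i residual_i * X_i
    return M_hat + correction

def singular_subspaces(M_unbiased, r):
    """Rank-r empirical singular subspaces of the de-biased estimate."""
    U, _, Vt = np.linalg.svd(M_unbiased)
    return U[:, :r], Vt[:r, :].T

def joint_projection_distance(U_hat, V_hat, U, V):
    """Squared joint projection distance between estimated and true singular subspaces:
    ||U_hat U_hat' - U U'||_F^2 + ||V_hat V_hat' - V V'||_F^2."""
    d_left = np.linalg.norm(U_hat @ U_hat.T - U @ U.T, 'fro') ** 2
    d_right = np.linalg.norm(V_hat @ V_hat.T - V @ V.T, 'fro') ** 2
    return d_left + d_right
```

A confidence region would then be formed by comparing this distance to a data-dependent centering and normalization, as established in the paper's asymptotic normality result.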

Topics: Mathematics - Statistics Theory, Computer Science - Information Theory, Statistics - Methodology, Statistics - Machine Learning
Year: 2019
OAI identifier: oai:arXiv.org:1805.09871
