Structured random measurements in signal processing
Compressed sensing and its extensions have recently triggered interest in
randomized signal acquisition. A key finding is that random measurements
provide sparse signal reconstruction guarantees for efficient and stable
algorithms with a minimal number of samples. While this was first shown for
(unstructured) Gaussian random measurement matrices, applications require
certain structure of the measurements leading to structured random measurement
matrices. Near optimal recovery guarantees for such structured measurements
have been developed over the past years in a variety of contexts. This article
surveys the theory in three scenarios: compressed sensing (sparse recovery),
low rank matrix recovery, and phaseless estimation. The random measurement
matrices to be considered include random partial Fourier matrices, partial
random circulant matrices (subsampled convolutions), matrix completion, and
phase estimation from magnitudes of Fourier type measurements. The article
concludes with a brief discussion of the mathematical techniques for the
analysis of such structured random measurements.

Comment: 22 pages, 2 figures
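To make the survey's setting concrete, here is a minimal sketch of sparse recovery from a random partial Fourier matrix (subsampled rows of the DFT), using iterative hard thresholding as the reconstruction algorithm. This is an illustrative toy, not an algorithm from the article; all dimensions, the step size, and the iteration count are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 256, 128, 4          # signal length, measurements, sparsity (illustrative)

# Sparse signal with s nonzero entries.
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)

# Random partial Fourier matrix: m random rows of the n x n DFT,
# scaled so that the columns of A have unit norm.
rows = rng.choice(n, size=m, replace=False)
F = np.fft.fft(np.eye(n)) / np.sqrt(m)
A = F[rows]
y = A @ x                      # noiseless measurements

def iht(A, y, s, iters=300):
    """Iterative hard thresholding: gradient step, then keep s largest entries."""
    x_hat = np.zeros(A.shape[1], dtype=complex)
    for _ in range(iters):
        x_hat = x_hat + A.conj().T @ (y - A @ x_hat)
        keep = np.argsort(np.abs(x_hat))[-s:]
        pruned = np.zeros_like(x_hat)
        pruned[keep] = x_hat[keep]
        x_hat = pruned
    return x_hat.real           # the true signal is real

x_hat = iht(A, y, s)
rel_err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
print(rel_err)                  # relative recovery error
```

With m = n/2 measurements and sparsity 4, exact recovery is typical; the survey's point is that structured (Fourier) randomness suffices, not only unstructured Gaussian matrices.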
On Stein's Identity and Near-Optimal Estimation in High-dimensional Index Models
We consider estimating the parametric components of semi-parametric multiple
index models in a high-dimensional and non-Gaussian setting. Such models form a
rich class of non-linear models with applications to signal processing, machine
learning and statistics. Our estimators leverage the score function based first
and second-order Stein's identities and do not require the covariates to
satisfy Gaussian or elliptical symmetry assumptions common in the literature.
Moreover, to handle score functions and responses that are heavy-tailed, our
estimators are constructed via carefully thresholding their empirical
counterparts. We show that our estimator achieves near-optimal statistical rate
of convergence in several settings. We supplement our theoretical results via
simulation experiments that confirm the theory.
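The first-order Stein identity mentioned above can be illustrated in a few lines: for standard Gaussian covariates the score function is S(x) = x, so averaging y_i x_i recovers the index direction up to scale, even under an unknown non-linear link. This is a simplified demo, not the paper's estimator; the cubic link, the threshold rule, and all sizes are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, s = 20000, 50, 3        # samples, dimension, sparsity (illustrative)

# Sparse unit-norm index vector.
beta = np.zeros(p)
beta[:s] = 1.0
beta /= np.linalg.norm(beta)

X = rng.standard_normal((n, p))                        # Gaussian covariates
y = (X @ beta) ** 3 + 0.1 * rng.standard_normal(n)     # unknown link f(u) = u^3

# First-order Stein identity: E[y S(x)] = E[f'(<beta, x>)] * beta with S(x) = x,
# so the empirical average of y_i x_i points along beta.
g = (y[:, None] * X).mean(axis=0)

# Hard-threshold small coordinates to exploit sparsity, then normalize.
tau = 0.2 * np.max(np.abs(g))   # heuristic threshold for this demo
g[np.abs(g) < tau] = 0.0
beta_hat = g / np.linalg.norm(g)

print(abs(beta_hat @ beta))     # alignment with the true direction
```

The paper's actual estimators additionally truncate heavy-tailed scores and responses; the sketch assumes light tails so plain averaging suffices.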
Optimal Rates of Convergence for Noisy Sparse Phase Retrieval via Thresholded Wirtinger Flow
This paper considers the noisy sparse phase retrieval problem: recovering a
sparse signal $x \in \mathbb{R}^p$ from noisy quadratic measurements $y_j = (a_j^\top x)^2 + \epsilon_j$, $j = 1, \ldots, m$, with independent sub-exponential
noise $\epsilon_j$. The goals are to understand the effect of the sparsity $k$ of $x$
on the estimation precision and to construct a computationally feasible
estimator to achieve the optimal rates. Inspired by the Wirtinger Flow [12]
proposed for noiseless and non-sparse phase retrieval, a novel thresholded
gradient descent algorithm is proposed and it is shown to adaptively achieve
the minimax optimal rates of convergence over a wide range of sparsity levels
when the $a_j$'s are independent standard Gaussian random vectors, provided
that the sample size is sufficiently large compared to the sparsity of $x$.

Comment: 28 pages, 4 figures
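The thresholded gradient scheme described above can be sketched as follows: a simplified spectral initialization on an estimated support, followed by gradient descent on the quadratic loss with hard thresholding after each step. This is a hedged toy version, not the paper's exact algorithm (their initialization and thresholding rules are more careful); the noiseless setting, step size, and all dimensions are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(3)
p, m, s = 100, 2000, 4        # dimension, sample size, sparsity (illustrative)

# Sparse signal with well-separated nonzero magnitudes.
x = np.zeros(p)
supp = rng.choice(p, size=s, replace=False)
x[supp] = rng.choice([-1.0, 1.0], size=s) * rng.uniform(0.8, 1.5, size=s)

A = rng.standard_normal((m, p))       # standard Gaussian measurement vectors
y = (A @ x) ** 2                      # noiseless quadratic measurements

# --- Simplified spectral initialization on an estimated support ---
# E[y_j a_{jk}^2] = ||x||^2 + 2 x_k^2, so large coordinate scores flag the support.
scores = (y[:, None] * A**2).mean(axis=0)
S = np.argsort(scores)[-s:]
M = (A[:, S].T * y) @ A[:, S] / m     # second-moment matrix restricted to S
w, V = np.linalg.eigh(M)
z = np.zeros(p)
z[S] = np.sqrt(np.mean(y)) * V[:, -1]  # top eigenvector, scaled by ||x|| estimate

# --- Thresholded gradient descent on the quadratic loss ---
mu = 0.2 / np.linalg.norm(z) ** 2
for _ in range(300):
    r = A @ z
    z = z - (mu / m) * A.T @ ((r**2 - y) * r)   # Wirtinger-flow-style gradient step
    keep = np.argsort(np.abs(z))[-s:]           # hard-threshold: keep s largest
    zt = np.zeros(p)
    zt[keep] = z[keep]
    z = zt

# Phase retrieval only identifies x up to a global sign.
err = min(np.linalg.norm(z - x), np.linalg.norm(z + x)) / np.linalg.norm(x)
print(err)
```

The per-iteration hard threshold is what lets the rate adapt to the sparsity level, mirroring (in simplified form) the adaptivity claim in the abstract.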