Simple Error Bounds for Regularized Noisy Linear Inverse Problems
Consider estimating a structured signal $x_0$ from linear,
underdetermined and noisy measurements $y = Ax_0 + z$, via solving a variant of the
lasso algorithm: $\hat{x} = \arg\min_x \{\|y - Ax\|_2 + \lambda f(x)\}$. Here, $f$ is a
convex function aiming to promote the structure of $x_0$, say the
$\ell_1$-norm to promote sparsity or the nuclear norm to promote low-rankness. We
assume that the entries of $A$ are independent and normally
distributed and make no assumptions on the noise vector $z$, other
than it being independent of $A$. Under this generic setup, we derive
a general, non-asymptotic and rather tight upper bound on the $\ell_2$-norm of
the estimation error $\|\hat{x} - x_0\|_2$. Our bound is
geometric in nature and obeys a simple formula; the roles of $\lambda$, $f$ and
$x_0$ are all captured by a single summary parameter
$D(\lambda\,\partial f(x_0))$, termed the Gaussian squared
distance to the scaled subdifferential. We connect our result to the literature
and verify its validity through simulations. Comment: 6 pages, 2 figures
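The abstract's estimator penalizes the non-squared residual $\|y - Ax\|_2$; the sketch below is not from the paper, but illustrates the closely related squared-loss lasso $\min_x \tfrac12\|y - Ax\|_2^2 + \lambda\|x\|_1$ solved by proximal gradient descent (ISTA), with a Gaussian $A$ and noise $z$ drawn independently of $A$ as in the setup above. All names and parameter values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1: elementwise soft thresholding."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista_lasso(A, y, lam, n_iter=500):
    """Minimize 0.5*||y - A x||_2^2 + lam*||x||_1 via proximal gradient (ISTA)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)            # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy instance matching the abstract's model: sparse x0, i.i.d. Gaussian A,
# noise z independent of A.
rng = np.random.default_rng(0)
m, n, k = 80, 200, 5
A = rng.standard_normal((m, n))
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
z = 0.05 * rng.standard_normal(m)
y = A @ x0 + z
x_hat = ista_lasso(A, y, lam=0.1)
err = np.linalg.norm(x_hat - x0)           # the l2 error the paper bounds
```

The step size $1/\|A\|_2^2$ guarantees monotone descent of the objective; the hyperparameter `lam=0.1` is an arbitrary choice for the toy instance.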
Simple Bounds for Noisy Linear Inverse Problems with Exact Side Information
This paper considers the linear inverse problem where we wish to estimate a
structured signal $x_0$ from its corrupted observations. When the problem is
ill-posed, it is natural to make use of a convex function $f(\cdot)$ that
exploits the structure of the signal. For example, the $\ell_1$ norm can be used
for sparse signals. To carry out the estimation, we consider two well-known
convex programs: 1) Second order cone program (SOCP), and 2) Lasso. Assuming
Gaussian measurements, we show that, if precise information about the value
$f(x_0)$ or the $\ell_2$-norm of the noise is available, one can do a
particularly good job at estimation. In particular, the reconstruction error
becomes proportional to the "sparsity" of the signal rather than the ambient
dimension of the noise vector. We connect our results to existing works and
provide a discussion on the relation of our results to the standard
least-squares problem. Our error bounds are non-asymptotic and sharp; they
apply to arbitrary convex functions and do not assume any distribution on the
noise. Comment: 13 pages
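One natural way to use exact side information $f(x_0) = \|x_0\|_1$ is the constrained program $\min_x \|y - Ax\|_2$ subject to $\|x\|_1 \le \|x_0\|_1$. The sketch below is a minimal illustration of this idea (not the paper's SOCP formulation): projected gradient descent with Euclidean projection onto the $\ell_1$-ball, computed by the standard sort-based scheme. Names and parameter values are illustrative assumptions.

```python
import numpy as np

def project_l1_ball(v, r):
    """Euclidean projection of v onto the l1-ball of radius r (sort-based scheme)."""
    if np.abs(v).sum() <= r:
        return v
    u = np.sort(np.abs(v))[::-1]           # sorted magnitudes, descending
    css = np.cumsum(u)
    idx = np.arange(1, u.size + 1)
    rho = np.nonzero(u * idx > css - r)[0][-1]
    theta = (css[rho] - r) / (rho + 1.0)   # shrinkage level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def constrained_estimate(A, y, r, n_iter=500):
    """Projected gradient for: min ||y - A x||_2^2  s.t.  ||x||_1 <= r."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = project_l1_ball(x - step * (A.T @ (A @ x - y)), r)
    return x

rng = np.random.default_rng(1)
m, n, k = 80, 200, 5
A = rng.standard_normal((m, n))
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x0 + 0.05 * rng.standard_normal(m)
x_hat = constrained_estimate(A, y, r=np.abs(x0).sum())  # side info: f(x0) known exactly
```

With the constraint radius set to the true $\|x_0\|_1$, the feasible set is as small as possible while still containing $x_0$, which is the mechanism behind the improved error bounds the abstract describes.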
Isotropically Random Orthogonal Matrices: Performance of LASSO and Minimum Conic Singular Values
Recently, the precise performance of the Generalized LASSO algorithm for
recovering structured signals from compressed noisy measurements, obtained via
i.i.d. Gaussian matrices, has been characterized. The analysis is based on a
framework introduced by Stojnic and heavily relies on the use of Gordon's
Gaussian min-max theorem (GMT), a comparison principle on Gaussian processes.
As a result, corresponding characterizations for other ensembles of measurement
matrices have not been developed. In this work, we analyze the corresponding
performance of the ensemble of isotropically random orthogonal (i.r.o.)
measurements. We consider the constrained version of the Generalized LASSO and
derive a sharp characterization of its normalized squared error in the
large-system limit. When compared to its Gaussian counterpart, our result
analytically confirms the superiority in performance of the i.r.o. ensemble.
singular values of i.r.o. matrices. This bound is larger than the corresponding
bound on Gaussian matrices. To prove our results we express i.r.o. matrices in
terms of Gaussians and show that, with some modifications, the GMT framework is
still applicable.
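An i.r.o. measurement matrix can be sampled as the first $m$ rows of a Haar-distributed orthogonal matrix. A minimal sketch (my construction, not code from the paper): QR-factorize an i.i.d. Gaussian matrix and fix the column signs so that the orthogonal factor is Haar-distributed.

```python
import numpy as np

def iro_matrix(m, n, rng):
    """Sample an m x n isotropically random orthogonal (i.r.o.) matrix:
    the first m rows of a Haar-distributed n x n orthogonal matrix."""
    G = rng.standard_normal((n, n))        # i.i.d. Gaussian matrix
    Q, R = np.linalg.qr(G)
    Q = Q * np.sign(np.diag(R))            # sign correction makes Q Haar-distributed
    return Q[:m, :]                        # m orthonormal rows

rng = np.random.default_rng(2)
A = iro_matrix(50, 120, rng)               # 50 orthonormal measurement rows in R^120
```

Without the sign correction, plain QR output is not uniform on the orthogonal group, since the factorization convention biases the signs of the diagonal of $R$; this construction also mirrors the proof strategy above of expressing i.r.o. matrices in terms of Gaussians.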