LASSO reloaded: a variational analysis perspective with applications to compressed sensing
This paper provides a variational analysis of the unconstrained formulation
of the LASSO problem, ubiquitous in statistical learning, signal processing,
and inverse problems. In particular, we establish smoothness results for the
optimal value as well as Lipschitz properties of the optimal solution as
functions of the right-hand side (or measurement vector) and the regularization
parameter. Moreover, we show how to apply the proposed variational analysis to
study the sensitivity of the optimal solution to the tuning parameter in the
context of compressed sensing with subgaussian measurements. Our theoretical
findings are validated by numerical experiments.
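As a rough illustration of the tuning-parameter sensitivity studied here (not the paper's own method), the unconstrained LASSO, min_x (1/2)||Ax - b||_2^2 + lam*||x||_1, can be solved with the classical ISTA iteration; the matrix, step budget, and parameter values below are illustrative assumptions. Solving at two nearby values of lam shows the solution moving continuously with the parameter, consistent with the Lipschitz properties established in the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, b, lam, steps=500):
    """Solve min_x 0.5*||Ax - b||_2^2 + lam*||x||_1 by ISTA.

    Step size 1/L with L = ||A||_2^2, the Lipschitz constant of the
    gradient of the smooth part; `steps` is a fixed illustrative budget.
    """
    t = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x = soft_threshold(x - t * (A.T @ (A @ x - b)), t * lam)
    return x

# Toy data (assumed for illustration): nearby tuning parameters
# yield nearby solutions.
A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
x1 = lasso_ista(A, b, lam=0.5)
x2 = lasso_ista(A, b, lam=0.6)
print(np.linalg.norm(x1 - x2))
```

A familiar sanity check: once lam exceeds ||A^T b||_inf, the LASSO solution collapses to zero, so the interesting sensitivity regime lies below that threshold.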
Square Root LASSO: well-posedness, Lipschitz stability and the tuning trade-off
This paper studies well-posedness and parameter sensitivity of the Square
Root LASSO (SR-LASSO), an optimization model for recovering sparse solutions to
linear inverse problems in finite dimension. An advantage of the SR-LASSO
(e.g., over the standard LASSO) is that the optimal tuning of the
regularization parameter is robust with respect to measurement noise. This
paper provides three point-based regularity conditions at a solution of the
SR-LASSO: the weak, intermediate, and strong assumptions. It is shown that the
weak assumption implies uniqueness of the solution in question. The
intermediate assumption yields a directionally differentiable and locally
Lipschitz solution map (with explicit Lipschitz bounds), whereas the strong
assumption gives continuous differentiability of said map around the point in
question. Our analysis leads to new theoretical insights on the comparison
between SR-LASSO and LASSO from the viewpoint of tuning parameter sensitivity:
noise-robust optimal parameter choice for SR-LASSO comes at the "price" of
elevated tuning parameter sensitivity. Numerical results support and showcase
the theoretical findings.
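The SR-LASSO replaces the squared loss with its square root, min_x ||Ax - b||_2 + lam*||x||_1, which is what makes the optimal choice of lam independent of the noise level. A minimal proximal-gradient sketch (not the paper's method) is below; it assumes the residual stays bounded away from zero, where the loss is differentiable, and the fixed step size and data are illustrative choices.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sr_lasso(A, b, lam, step=0.1, steps=300, tol=1e-10):
    """Proximal-gradient sketch for min_x ||Ax - b||_2 + lam*||x||_1.

    The loss ||Ax - b||_2 is differentiable wherever the residual is
    nonzero, with gradient A.T @ r / ||r||; we stop if the residual
    (nearly) vanishes. The fixed step is illustrative, not tuned.
    """
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        r = A @ x - b
        nr = np.linalg.norm(r)
        if nr < tol:
            break
        x = soft_threshold(x - step * (A.T @ r) / nr, step * lam)
    return x

# Toy data (assumed for illustration).
A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
x = sr_lasso(A, b, lam=0.5)
print(x)
```

Note that lam here is weighed against the normalized correlation A^T r / ||r|| rather than A^T r itself, which is one informal way to see why a good lam does not need to scale with the noise level.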