Precise high-dimensional error analysis of regularized M-estimators
A general approach for estimating an unknown
signal x_0 ∈ R^n from noisy, linear measurements
y = Ax_0 + z ∈ R^m is via solving a so-called regularized
M-estimator: x̂ := arg min_x L(y − Ax) + λ f(x). Here, L is a convex loss function, f is a convex (typically nonsmooth)
regularizer, and λ > 0 is a regularization parameter.
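As a minimal illustrative sketch of the estimator above, the following solves the special case L(u) = ½||u||²_2 (squared loss) with f = ||·||_1 (i.e., the LASSO) by proximal gradient descent; the function names, step-size rule, and iteration count are illustrative choices, not the paper's method:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def regularized_m_estimate(A, y, lam, step=None, n_iter=500):
    """Proximal-gradient (ISTA) solver for the special case
    L(u) = 0.5*||u||_2^2 and f = ||.||_1. Hypothetical helper for
    illustration only."""
    m, n = A.shape
    if step is None:
        # Reciprocal of the Lipschitz constant of the smooth part's gradient.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)              # gradient of 0.5*||y - Ax||^2
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy instance in the proportional regime with m/n = delta = 0.5:
rng = np.random.default_rng(0)
n, m = 200, 100
A = rng.normal(size=(m, n)) / np.sqrt(m)      # iid Gaussian design
x0 = np.zeros(n); x0[:10] = 1.0               # sparse unknown signal
z = 0.1 * rng.normal(size=m)                  # noise
y = A @ x0 + z
x_hat = regularized_m_estimate(A, y, lam=0.05)
err = np.sum((x_hat - x0) ** 2)               # squared error ||x_hat - x0||_2^2
```

The squared error `err` computed at the end is exactly the quantity whose high-dimensional limit the paper characterizes.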
We analyze the squared error performance ||x̂ − x_0||^2_2
of such estimators in the high-dimensional proportional
regime where m, n → ∞ and m/n → δ. We let the design
matrix A have entries iid Gaussian, and impose minimal
and rather mild regularity conditions on the loss function,
on the regularizer, and on the distributions of the noise
and of the unknown signal. Under such a generic setting,
we show that the squared error converges in probability
to a nontrivial limit that is computed by solving four
nonlinear equations in four scalar unknowns. We identify
a new summary parameter, termed the expected
Moreau envelope, which determines how the choice of
the loss function and of the regularizer affects the error
performance. The result opens the way for answering
optimality questions regarding the choice of the loss
function, the regularizer, the penalty parameter, etc.
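For reference, the (standard) Moreau envelope of a convex function f at a point x with parameter τ > 0 is

```latex
M_f(x;\tau) \;:=\; \min_{v}\left\{\, f(v) \;+\; \frac{1}{2\tau}\,\|x - v\|_2^2 \,\right\},
```

and the expected Moreau envelope referred to above is, under a natural reading of the abstract (the exact normalization used in the paper is not stated here), the expectation of this quantity over the relevant random arguments.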