We study a sample complexity vs. conditioning tradeoff in modern signal
recovery problems where convex optimization problems are built from sampled
observations. We begin by introducing a set of condition numbers related to
sharpness in ℓ_p or Schatten-p norms (p ∈ [1,2]) based on nonsmooth
reformulations of a class of convex optimization problems, including sparse
recovery, low-rank matrix sensing, covariance estimation, and (abstract) phase
retrieval. For each of these recovery tasks, we show that the condition numbers
become dimension-independent constants once the sample size exceeds a
constant multiple of the recovery threshold. Structurally, this result ensures
that the inaccuracy in the recovered signal due to both observation noise and
optimization error is well-controlled. Algorithmically, it ensures
that a new first-order method for minimizing sharp convex functions
in a given ℓ_p or Schatten-p norm, when applied to the nonsmooth
formulations, achieves nearly dimension-independent linear convergence.