On Suboptimality of Least Squares with Application to Estimation of Convex Bodies
We develop a technique for establishing lower bounds on the sample complexity
of Least Squares (or, Empirical Risk Minimization) for large classes of
functions. As an application, we settle an open problem regarding optimality of
Least Squares in estimating a convex set from noisy support function
measurements in dimension $d \geq 6$. Specifically, we establish that Least
Squares is minimax sub-optimal, and achieves a rate of
$\tilde{\Theta}_d(n^{-2/(d-1)})$ whereas the minimax rate is
$\Theta_d(n^{-4/(d+3)})$.

Comment: To appear in Conference on Learning Theory (COLT) 2020
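
The Least Squares estimator in this setting minimizes the squared error between the observed support function values and the support function of a candidate convex body. Below is a minimal computational sketch of the standard finite-dimensional reduction of this problem (fit one "touching point" per measured direction, with pairwise halfspace constraints), assuming the cvxpy library is available; the function name support_lse and the unit-ball simulation are illustrative assumptions, not taken from the paper.

    import numpy as np
    import cvxpy as cp

    def support_lse(U, y):
        # U: (n, d) array of unit directions u_i; y: (n,) noisy support values.
        # Fit points z_1..z_n so that z_i attains the support in direction u_i:
        # h(u_i) = <u_i, z_i>, with <u_i, z_j> <= <u_i, z_i> for all j, making
        # the fitted values the support function of conv{z_1, ..., z_n}.
        n, d = U.shape
        Z = cp.Variable((n, d))
        h = cp.sum(cp.multiply(U, Z), axis=1)  # h_i = <u_i, z_i>
        constraints = [U[i] @ Z.T <= h[i] for i in range(n)]
        cp.Problem(cp.Minimize(cp.sum_squares(y - h)), constraints).solve()
        return Z.value  # points spanning the estimated convex body

    # Illustrative use: directions on the sphere, true body = unit ball (h = 1).
    rng = np.random.default_rng(0)
    n, d = 50, 3
    U = rng.standard_normal((n, d))
    U /= np.linalg.norm(U, axis=1, keepdims=True)
    y = 1.0 + 0.1 * rng.standard_normal(n)
    Z_hat = support_lse(U, y)

Note that the paper's result concerns the statistical risk of this estimator in high dimensions, not the computation itself, which is a tractable quadratic program.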
Convex Regression in Multidimensions: Suboptimality of Least Squares Estimators
The least squares estimator (LSE) is shown to be suboptimal in squared error
loss in the usual nonparametric regression model with Gaussian errors for
$d \geq 5$ for each of the following families of functions: (i) convex functions
supported on a polytope (in fixed design), (ii) bounded convex functions
supported on a polytope (in random design), and (iii) convex Lipschitz
functions supported on any convex domain (in random design). For each of these
families, the risk of the LSE is proved to be of the order $n^{-2/d}$ (up to
logarithmic factors) while the minimax risk is $n^{-4/(d+4)}$, for $d \geq 5$.
In addition, the first rate of convergence results (worst case and adaptive)
for the full convex LSE are established for polytopal domains for all $d$.
Some new metric entropy results for convex functions are also proved which
are of independent interest.
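
For reference, the convex LSE studied here has a standard finite-dimensional characterization as a quadratic program over fitted values and subgradients, with supporting-hyperplane inequalities enforcing convexity. A minimal sketch follows, again assuming cvxpy; the name convex_lse and the simulated example are illustrative.

    import numpy as np
    import cvxpy as cp

    def convex_lse(X, y):
        # X: (n, d) design points; y: (n,) responses. The convex LSE at the
        # data points solves a QP over fitted values g_i and subgradients
        # xi_i, with g_j >= g_i + <xi_i, x_j - x_i> for all pairs (i, j).
        n, d = X.shape
        g = cp.Variable(n)        # fitted values g_i = f_hat(x_i)
        Xi = cp.Variable((n, d))  # subgradients of f_hat at each x_i
        constraints = [g[j] >= g[i] + Xi[i] @ (X[j] - X[i])
                       for i in range(n) for j in range(n) if j != i]
        cp.Problem(cp.Minimize(cp.sum_squares(y - g)), constraints).solve()
        return g.value, Xi.value

    # Illustrative use: noisy samples of the convex function f(x) = ||x||^2.
    rng = np.random.default_rng(1)
    n, d = 60, 2
    X = rng.uniform(-1, 1, size=(n, d))
    y = np.sum(X**2, axis=1) + 0.1 * rng.standard_normal(n)
    g_hat, Xi_hat = convex_lse(X, y)

The n(n-1) pairwise constraints make this a large but solvable QP, and the fitted values extend to a piecewise-affine convex function as the maximum of the fitted supporting hyperplanes; the suboptimality results above concern the statistical risk of this estimator, not its computability.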