2 research outputs found

    On Suboptimality of Least Squares with Application to Estimation of Convex Bodies

    We develop a technique for establishing lower bounds on the sample complexity of Least Squares (or, Empirical Risk Minimization) for large classes of functions. As an application, we settle an open problem regarding the optimality of Least Squares in estimating a convex set from noisy support function measurements in dimension $d \geq 6$. Specifically, we establish that Least Squares is minimax suboptimal, achieving a rate of $\tilde{\Theta}_d(n^{-2/(d-1)})$, whereas the minimax rate is $\Theta_d(n^{-4/(d+3)})$.
    Comment: To appear in Conference on Learning Theory (COLT) 202
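    A quick check of these exponents (arithmetic on the stated rates, not part of the abstract): the gap between the minimax and LSE exponents is
    $$\frac{4}{d+3} - \frac{2}{d-1} = \frac{2d-10}{(d+3)(d-1)},$$
    which is strictly positive exactly when $d > 5$. For example, at $d = 6$ the LSE rate is $n^{-2/5} = n^{-0.4}$ while the minimax rate is $n^{-4/9} \approx n^{-0.444}$, so Least Squares converges strictly slower, consistent with the claimed suboptimality for $d \geq 6$.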

    Convex Regression in Multidimensions: Suboptimality of Least Squares Estimators

    The least squares estimator (LSE) is shown to be suboptimal in squared error loss in the usual nonparametric regression model with Gaussian errors for $d \geq 5$ for each of the following families of functions: (i) convex functions supported on a polytope (in fixed design), (ii) bounded convex functions supported on a polytope (in random design), and (iii) convex Lipschitz functions supported on any convex domain (in random design). For each of these families, the risk of the LSE is proved to be of the order $n^{-2/d}$ (up to logarithmic factors) while the minimax risk is $n^{-4/(d+4)}$, for $d \geq 5$. In addition, the first rate of convergence results (worst case and adaptive) for the full convex LSE are established for polytopal domains for all $d \geq 1$. Some new metric entropy results for convex functions are also proved which are of independent interest.
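    As above, a quick check of these exponents (arithmetic on the stated rates, not part of the abstract): the gap between the minimax and LSE exponents is
    $$\frac{4}{d+4} - \frac{2}{d} = \frac{2d-8}{d(d+4)},$$
    which is strictly positive exactly when $d > 4$. For instance, at $d = 5$ the LSE risk is of order $n^{-2/5} = n^{-0.4}$ (up to logarithmic factors) while the minimax risk is $n^{-4/9} \approx n^{-0.444}$, matching the claimed suboptimality for $d \geq 5$.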