Estimation And Inference For Convex Functions And Computational Efficiency In High Dimensional Statistics

Abstract

Optimization and statistics are intrinsically intertwined. Optimization has been the end of some statistical problems, such as estimation and inference for the minimizer and the minimum of convex functions, and the means for others, such as computational concerns in high-dimensional statistics. In this dissertation, we consider both types of optimization-related problems.

Estimation and inference for the minimizer and the minimum of convex functions are longstanding problems with wide applications in economics and health care. Existing approaches, however, are insufficient due to their asymptotic nature and/or their inability to characterize function-specific difficulty. We investigate these problems under non-asymptotic frameworks that characterize function-specific difficulty and propose adaptive, computationally efficient optimal methods. The first two parts of the dissertation address these problems, briefly summarized as follows.

• The first part focuses on univariate convex functions. We develop computationally efficient, adaptive optimal procedures under a local minimax framework and discover a novel Uncertainty Principle that provides a fundamental limit on how well the minimizer and the minimum can be estimated simultaneously for any convex regression function.

• The second part focuses on multivariate additive convex functions. Under function-specific benchmarks, we propose computationally efficient methods and establish their optimality.

Computational efficiency is another optimization-related problem of increasing importance in statistics, especially in the AI age, where data are large in scale and the demands on computational time are high.
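To make the estimation problem concrete, the following is a minimal toy sketch, not the dissertation's procedure: given noisy evaluations of a convex regression function, a naive plug-in estimator fits a quadratic by least squares and reads off its vertex as estimates of the minimizer and the minimum. The function, noise level, and grid here are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical convex regression function with minimizer x* = 0.3, minimum f* = 1.0.
def f(x):
    return 1.0 + 4.0 * (x - 0.3) ** 2

# Noisy evaluations on a grid -- the kind of data the estimation problem assumes.
x = np.linspace(0.0, 1.0, 201)
y = f(x) + 0.05 * rng.standard_normal(x.size)

# Naive plug-in estimate: least-squares quadratic fit, then read off the vertex.
a, b, c = np.polyfit(x, y, 2)
x_hat = -b / (2 * a)                   # estimated minimizer
f_hat = np.polyval([a, b, c], x_hat)   # estimated minimum
```

A global quadratic fit is only sensible because the toy function is itself quadratic; for general convex functions the dissertation's point is precisely that function-specific difficulty governs how accurately the two quantities can be estimated, and simultaneously so, per the Uncertainty Principle.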
To balance running time against statistical accuracy, the third part of the dissertation proposes a framework that provides optimization methods with theoretical guarantees, together with an analysis of the interplay between running time and statistical accuracy, for a class of high-dimensional problems. Our framework consists of three components: a statistical-optimization interplay analysis, which characterizes optimization-induced statistical error in a more essential way; an optimization template algorithm; and an optimization convergence analysis. We showcase the power of our framework through three example problems: we obtain novel results for the first two, and the third shows that our framework adapts to the degenerate case.
