
    Convergence analysis of generalized iteratively reweighted least squares algorithms on convex function spaces

    The computation of robust regression estimates often relies on the minimization of a convex functional over a convex set. In this paper we discuss a general technique, closely related to majorization-minimization algorithms, for iteratively computing the minimizers of a large class of convex functionals. Our approach is based on a quadratic approximation of the functional to be minimized and includes the iteratively reweighted least squares algorithm as a special case. We prove convergence on convex function spaces for general coercive and convex functionals F and derive geometric convergence in certain unconstrained settings. The algorithm is applied to TV-penalized quantile regression and compared with a step-size-corrected Newton-Raphson algorithm. Typically, the iteratively reweighted least squares algorithm performs significantly better in the first steps, whereas the Newton-type method outpaces it only after many iterations. Finally, in the setting of bivariate regression with unimodality constraints, we illustrate how this algorithm allows one to utilize highly efficient algorithms for special quadratic programs in more complex settings.
    Keywords: regression analysis, monotone regression, quantile regression, shape constraints, L1 regression, nonparametric regression, total variation semi-norm, reweighted least squares, Fermat's problem, convex approximation, quadratic approximation, pool adjacent violators algorithm
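    The quadratic-approximation idea this abstract describes can be sketched in its simplest special case, L1 (least absolute deviations) regression: each iteration solves a weighted least squares problem whose weights w_i = 1/|r_i| come from majorizing the absolute value by a quadratic. This is a minimal illustration, not the paper's general algorithm; the function name and the damping constant eps are illustrative.

```python
import numpy as np

def irls_l1(X, y, n_iter=50, eps=1e-8):
    """Minimize sum_i |y_i - x_i^T beta| by iteratively reweighted
    least squares (a sketch; eps guards against division by zero)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS start
    for _ in range(n_iter):
        r = y - X @ beta
        w = 1.0 / np.maximum(np.abs(r), eps)      # quadratic-majorization weights
        Xw = X * w[:, None]
        beta = np.linalg.solve(X.T @ Xw, X.T @ (w * y))
    return beta
```

    The paper's contribution is that the same scheme extends to a much larger class of coercive convex functionals on convex function spaces, with IRLS as the special case above.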

    On Convex Quadratic Approximation

    In this paper we prove the counterintuitive result that the quadratic least squares approximation of a multivariate convex function on a finite set of points is not necessarily convex, even though it is convex for a univariate convex function. This result has consequences for both statistics and optimization. We show that convexity can be enforced in the multivariate case by using semidefinite programming techniques.
    Keywords: convex function; least squares; quadratic interpolation; semidefinite programming
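    The setting can be made concrete as follows: fit q(x) = x^T A x + b^T x + c to sampled values by least squares, then check convexity of q via the eigenvalues of A. This sketch does not reproduce the paper's counterexample; it only shows the fit and the convexity test (the function names are illustrative).

```python
import numpy as np

def fit_quadratic_2d(pts, vals):
    """Least squares fit of q(x) = x^T A x + b^T x + c to points in R^2."""
    x1, x2 = pts[:, 0], pts[:, 1]
    # design matrix for the monomials [x1^2, x1*x2, x2^2, x1, x2, 1]
    D = np.column_stack([x1**2, x1 * x2, x2**2, x1, x2, np.ones_like(x1)])
    coef, *_ = np.linalg.lstsq(D, vals, rcond=None)
    A = np.array([[coef[0], coef[1] / 2], [coef[1] / 2, coef[2]]])
    return A, coef

def is_convex_quadratic(A, tol=1e-10):
    """q is convex iff its Hessian 2A is positive semidefinite."""
    return np.linalg.eigvalsh(A).min() >= -tol
```

    The paper's SDP remedy amounts to re-solving the least squares problem with the extra constraint A ⪰ 0, which this unconstrained fit does not impose.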

    Regularization with Approximated L^2 Maximum Entropy Method

    We tackle the inverse problem of reconstructing an unknown finite measure μ from a noisy observation of a generalized moment of μ, defined as the integral of a continuous and bounded operator Φ with respect to μ. When only a quadratic approximation Φ_m of the operator is known, we introduce the L^2 approximate maximum entropy solution as a minimizer of a convex functional subject to a sequence of convex constraints. Under several assumptions on the convex functional, the convergence of the approximate solution is established and rates of convergence are provided.
    Comment: 16 pages

    On Convex Approximation by Quadratic Splines

    In a recent paper by Hu it is proved that for any convex function f there is a C^1 convex quadratic spline s with n knots that approximates f at the rate of ω_3(f, n^{-1}). The knots of the spline are basically equally spaced. In this paper we give a simple construction of such a spline with equally spaced knots.
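    One elementary way to obtain a C^1 convex quadratic spline with equally spaced knots, when the derivative of f is available, is to integrate the piecewise-linear interpolant of f': a nondecreasing piecewise-linear slope yields a convex, continuously differentiable, piecewise-quadratic function. This is only a simple illustration of the object in question, not the paper's (or Hu's) construction, and it achieves a cruder approximation order than the ω_3(f, n^{-1}) rate cited above.

```python
import numpy as np

def convex_quad_spline(f, fprime, a, b, n):
    """C^1 quadratic spline on n equal subintervals of [a, b], built by
    integrating the linear interpolant of f' (convex whenever f' is
    nondecreasing, i.e. whenever f is convex)."""
    t = np.linspace(a, b, n + 1)
    h = (b - a) / n
    d = fprime(t)  # slopes at the knots
    # values at the knots: trapezoidal integration of the piecewise-linear slope
    c = f(a) + np.concatenate([[0.0], np.cumsum(h * (d[:-1] + d[1:]) / 2)])
    def s(x):
        x = np.asarray(x, dtype=float)
        i = np.clip(np.searchsorted(t, x, side="right") - 1, 0, n - 1)
        u = x - t[i]
        return c[i] + d[i] * u + (d[i + 1] - d[i]) * u**2 / (2 * h)
    return s
```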