
    Comparative study of RPSALG algorithm for convex semi-infinite programming

    The Remez penalty and smoothing algorithm (RPSALG) is a unified framework for penalty and smoothing methods for solving min-max convex semi-infinite programming problems, whose convergence was analyzed in a previous paper by three of the authors. In this paper we consider a partial implementation of RPSALG for solving ordinary convex semi-infinite programming problems. Each iteration of RPSALG involves two types of auxiliary optimization problems: the first consists of obtaining an approximate solution of a discretized convex problem, while the second requires solving a non-convex optimization problem whose objective function is the parametric constraint, with the parameter as the variable. In this paper we tackle the latter problem with a variant of the cutting angle method called ECAM, a global optimization procedure for solving Lipschitz programming problems. We implement different variants of RPSALG and compare them with the only publicly available SIP solver, NSIPS, on a battery of test problems. This research was partially supported by MINECO of Spain, Grants MTM2011-29064-C03-01/02.
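    The two auxiliary problems described above can be illustrated with a small sketch. The code below is not RPSALG itself: it solves an illustrative convex SIP (best uniform linear approximation of e^t on [0, 1]) by alternating a discretized LP with a search for the most violated index, using a plain grid search in place of the ECAM global solver; the penalty and smoothing machinery of RPSALG is not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative convex SIP (not one of the paper's test problems): best uniform
# (Chebyshev) approximation of e^t on [0, 1] by a line a + b*t,
#   minimize_{a, b, u}  u   subject to  |a + b*t - exp(t)| <= u  for all t in [0, 1].

def solve_discretized(grid):
    """First auxiliary problem: LP over a finite subset of the index set [0, 1]."""
    c = np.array([0.0, 0.0, 1.0])                               # minimize u
    A_ub, b_ub = [], []
    for t in grid:
        A_ub.append([1.0, t, -1.0]);   b_ub.append(np.exp(t))   #  a + b*t - u <= e^t
        A_ub.append([-1.0, -t, -1.0]); b_ub.append(-np.exp(t))  # -a - b*t - u <= -e^t
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] * 3)
    return res.x

def most_violated_index(x, n=10001):
    """Second auxiliary problem: maximize the constraint over t in [0, 1].
    RPSALG attacks this global problem with ECAM; a dense grid search stands
    in for that Lipschitz global solver in this sketch."""
    t = np.linspace(0.0, 1.0, n)
    viol = np.abs(x[0] + x[1] * t - np.exp(t)) - x[2]
    i = np.argmax(viol)
    return t[i], viol[i]

grid, tol = [0.0, 1.0], 1e-6
for it in range(50):
    x = solve_discretized(grid)
    t_star, viol = most_violated_index(x)
    if viol <= tol:
        break
    grid.append(t_star)                      # exchange step: add the violating index
print(f"iterations={it + 1}, a={x[0]:.6f}, b={x[1]:.6f}, max error={x[2]:.6f}")
```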

    Acceleration of the PDHGM on strongly convex subspaces

    We propose several variants of the primal-dual method due to Chambolle and Pock. Without requiring full strong convexity of the objective functions, our methods are accelerated on subspaces with strong convexity. This yields mixed rates: O(1/N^2) with respect to initialisation, and O(1/N) with respect to the dual sequence and the residual part of the primal sequence. We demonstrate the efficacy of the proposed methods on image processing problems lacking strong convexity, such as total generalised variation denoising and total variation deblurring.
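    As a point of reference for the method being accelerated, the sketch below implements the plain Chambolle-Pock / PDHGM iteration (constant step sizes, theta = 1) on a 1-D total-variation denoising problem with synthetic data; the subspace-acceleration rules proposed in the paper are not reproduced here.

```python
import numpy as np

# Baseline Chambolle-Pock / PDHGM iteration on 1-D total-variation denoising,
#   min_x  0.5*||x - f||^2 + lam*||D x||_1,
# written as the saddle point  min_x max_{|y|_inf <= lam}  <D x, y> + 0.5*||x - f||^2.

rng = np.random.default_rng(0)
n, lam = 200, 0.2
truth = np.repeat([0.0, 1.0, -0.5, 2.0], n // 4)          # piecewise-constant signal
f = truth + 0.1 * rng.standard_normal(n)                  # noisy data

D = lambda x: np.diff(x)                                              # forward differences
Dt = lambda y: np.concatenate(([-y[0]], -np.diff(y), [y[-1]]))        # adjoint of D

L = 2.0                                                   # ||D|| <= 2
tau = sigma = 0.99 / L                                    # tau * sigma * L**2 < 1
theta = 1.0

x = f.copy(); x_bar = x.copy(); y = np.zeros(n - 1)
for k in range(500):
    y = np.clip(y + sigma * D(x_bar), -lam, lam)          # dual step: project onto [-lam, lam]
    x_old = x
    x = (x - tau * Dt(y) + tau * f) / (1.0 + tau)         # primal step: prox of 0.5*||. - f||^2
    x_bar = x + theta * (x - x_old)                       # extrapolation
print("objective:", 0.5 * np.sum((x - f) ** 2) + lam * np.sum(np.abs(D(x))))
```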

    A distributionally robust index tracking model with the CVaR penalty: tractable reformulation

    We propose a distributionally robust index tracking model with the conditional value-at-risk (CVaR) penalty. The model combines the idea of distributionally robust optimization for data uncertainty with the CVaR penalty to avoid large tracking errors. The probability ambiguity is described through a confidence region based on the first-order and second-order moments of the random vector involved. We reformulate the model, which takes the form of a min-max-min optimization, into an equivalent nonsmooth minimization problem. We further give an approximate discretization scheme for the possibly continuous random vector in the nonsmooth minimization problem, whose objective function involves the maximum of numerous but finitely many nonsmooth functions. The convergence of the discretization scheme to the equivalent nonsmooth reformulation is shown under mild conditions. A smoothing projected gradient (SPG) method is employed to solve the discretization scheme. Any accumulation point is shown to be a global minimizer of the discretization scheme. Numerical results on the NASDAQ index dataset from January 2008 to July 2023 demonstrate the effectiveness of our proposed model and the efficiency of the SPG method, compared with several state-of-the-art models and the corresponding methods for solving them.
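    Two ingredients named in the abstract admit compact textbook forms. The sketch below evaluates a sample CVaR of the tracking shortfall via the Rockafellar-Uryasev representation and shows one common smoothing of the plus function of the kind a smoothing gradient method relies on; the ambiguity set, the exact model, and the paper's SPG updates are not reproduced, and the data are synthetic.

```python
import numpy as np

# Synthetic data and textbook formulas only, not the paper's model.

rng = np.random.default_rng(1)
T, n = 250, 5
R = 0.01 * rng.standard_normal((T, n))                               # scenario asset returns
r_index = R @ np.full(n, 1.0 / n) + 0.002 * rng.standard_normal(T)   # index returns
w = np.full(n, 1.0 / n)                                              # candidate tracking weights

def cvar(losses, alpha=0.95):
    """Sample CVaR via the Rockafellar-Uryasev representation
    CVaR_alpha(L) = min_eta  eta + E[(L - eta)_+] / (1 - alpha),
    evaluated at eta equal to the alpha-quantile of the sample losses."""
    eta = np.quantile(losses, alpha)
    return eta + np.mean(np.maximum(losses - eta, 0.0)) / (1.0 - alpha)

def smoothed_plus(t, mu=1e-3):
    """A common smoothing of the plus function (t)_+, differentiable for mu > 0:
    0.5 * (t + sqrt(t**2 + 4*mu**2)). A generic choice, not necessarily the
    smoothing function used in the paper."""
    return 0.5 * (t + np.sqrt(t**2 + 4.0 * mu**2))

tracking_shortfall = r_index - R @ w          # loss: index return minus portfolio return
print("CVaR_0.95 of tracking shortfall:", cvar(tracking_shortfall))
print("smoothed vs exact (t)_+ at t=0.01:", smoothed_plus(0.01), max(0.01, 0.0))
```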

    On Solving Large-Scale Finite Minimax Problems using Exponential Smoothing

    Journal of Optimization Theory and Applications, Vol. 148, No. 2, pp. 390-421
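    For context on the technique named in the title, the sketch below applies the standard exponential (log-sum-exp) smoothing of a finite max of affine functions, which over-estimates the true max by at most log(m)/p, and minimizes the smoothed function with a quasi-Newton method; it illustrates only the smoothing idea, not the paper's algorithm or its safeguards.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

# Exponential smoothing of a finite minimax problem,
#   min_x max_j f_j(x)  ~=  min_x (1/p) * log(sum_j exp(p * f_j(x))).
# The pieces f_j and the solver below are illustrative, not the paper's setup.

A = np.array([[ 1.0,  2.0],
              [-1.0,  1.0],
              [ 0.5, -3.0]])
b = np.array([1.0, 0.0, -0.5])
f = lambda x: A @ x + b                        # m = 3 affine pieces f_j(x)

p = 100.0                                      # smoothing parameter

def smoothed_max(x):
    return logsumexp(p * f(x)) / p             # numerically stable (1/p)*log-sum-exp

def smoothed_max_grad(x):
    w = np.exp(p * f(x) - logsumexp(p * f(x))) # softmax weights
    return A.T @ w                             # gradient = sum_j w_j * grad f_j

res = minimize(smoothed_max, x0=np.zeros(2), jac=smoothed_max_grad, method="BFGS")
print("smoothed minimax point:", res.x, "  max_j f_j:", f(res.x).max())
```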