
    H\"older Error Bounds and H\"older Calmness with Applications to Convex Semi-Infinite Optimization

    Using techniques of variational analysis, necessary and sufficient subdifferential conditions for Hölder error bounds are investigated, and some new estimates for the corresponding modulus are obtained. As an application, we consider the setting of convex semi-infinite optimization and give a characterization of the Hölder calmness of the argmin mapping in terms of the level set mapping (with respect to the objective function) and a special supremum function. We also estimate the Hölder calmness modulus of the argmin mapping in the framework of linear programming. Comment: 25 pages.
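
    For orientation, the display below recalls the two standard notions the abstract refers to, in generic notation (the solution set $S$, constant $c$, and exponent $q$ are illustrative and not taken from the paper itself):

    % Hölder error bound for an inequality system f(x) <= 0 with solution set S:
    % there exist c > 0 and q in (0, 1] such that
    \[
    d(x, S) \le c\,[f(x)]_+^{\,q} \quad \text{for all } x \text{ near } \bar{x}.
    \]
    % Hölder calmness of a set-valued mapping M at (\bar{p}, \bar{x}) \in gph M:
    \[
    d\bigl(x, M(\bar{p})\bigr) \le c\,\|p - \bar{p}\|^{q} \quad \text{for all } (p, x) \in \operatorname{gph} M \text{ near } (\bar{p}, \bar{x}).
    \]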

    Variational Analysis of Kurdyka-Łojasiewicz Property, Exponent and Modulus

    The Kurdyka-Łojasiewicz (KŁ) property, exponent and modulus have played a very important role in the study of global convergence and rate of convergence for optimization algorithms. In this paper, at a stationary point of a locally lower semicontinuous function, we obtain complete characterizations of the KŁ property and the KŁ modulus via the outer limiting subdifferential of an auxiliary function and a newly introduced subderivative function, respectively. In particular, for a class of prox-regular, twice epi-differentiable and subdifferentially continuous functions, we show that the KŁ property and the KŁ modulus can be described by their Moreau envelopes and a quadratic growth condition. We apply the obtained results to establish the KŁ property with exponent $\frac{1}{2}$ and to provide calculations of the modulus for a smooth function, the pointwise maximum of finitely many smooth functions, and regularized functions, respectively. These functions often appear in the modelling of structured optimization problems. Comment: 28 pages.
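
    For reference, the KŁ property at a stationary point $\bar{x}$ of $f$ is usually stated as follows (standard form; $\theta$ denotes the KŁ exponent, so the exponent-$\frac{1}{2}$ case above corresponds to $\theta = \frac{1}{2}$):

    % There exist c, \eta > 0 and a neighborhood U of \bar{x} such that,
    % for all x in U with f(\bar{x}) < f(x) < f(\bar{x}) + \eta,
    \[
    d\bigl(0, \partial f(x)\bigr) \ge c\,\bigl(f(x) - f(\bar{x})\bigr)^{\theta},
    \qquad \theta \in [0, 1).
    \]
    % \theta = 1/2 is the regime in which first-order methods typically
    % attain linear convergence rates.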

    A subgradient method based on gradient sampling for solving convex optimization problems

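    No abstract is shown for this entry. As context only, the sketch below illustrates the generic gradient-sampling idea for nonsmooth convex minimization (sample gradients near the current iterate and descend along the negative minimum-norm element of their convex hull); it is a textbook-style illustration, not the authors' method, and all names (min_norm_in_hull, gradient_sampling_step, the demo function f) are hypothetical.

    import numpy as np

    def min_norm_in_hull(G, iters=200):
        # Frank-Wolfe on the simplex for min_lam 0.5 * ||G.T @ lam||^2,
        # approximating the minimum-norm element of conv{rows of G}.
        m = G.shape[0]
        lam = np.full(m, 1.0 / m)
        for k in range(iters):
            grad = G @ (G.T @ lam)              # gradient G G^T lam
            i = np.argmin(grad)                 # best simplex vertex
            gamma = 2.0 / (k + 2.0)
            lam = (1.0 - gamma) * lam
            lam[i] += gamma
        return G.T @ lam                        # approximate min-norm vector

    def gradient_sampling_step(f, grad_f, x, radius=1e-2, m=None, rng=np.random):
        # Sample gradients at x and at m nearby points, then take a
        # backtracking step along the negative min-norm hull element.
        n = x.size
        m = m or 2 * n
        pts = x + radius * rng.uniform(-1.0, 1.0, size=(m, n))
        G = np.vstack([grad_f(x)] + [grad_f(p) for p in pts])
        d = -min_norm_in_hull(G)
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx - 1e-4 * t * np.dot(d, d) and t > 1e-12:
            t *= 0.5                            # backtracking line search
        return x + t * d

    # Demo on the nonsmooth convex function f(x) = |x_1| + 2|x_2|.
    f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
    grad_f = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])
    x = np.array([3.0, -2.0])
    for _ in range(60):
        x = gradient_sampling_step(f, grad_f, x)
    print(x, f(x))  # should be close to the minimizer at the origin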

    Relative Well-Posedness of Constrained Systems with Applications to Variational Inequalities

    The paper concerns foundations of sensitivity and stability analysis, being primarily addressed to constrained systems. We consider general models, which are described by multifunctions between Banach spaces, and concentrate on characterizing their well-posedness properties that revolve around Lipschitz stability and metric regularity relative to sets. The enhanced relative well-posedness concepts allow us, in contrast to their standard counterparts, to encompass various classes of constrained systems. Invoking tools of variational analysis and generalized differentiation, we introduce new robust notions of relative coderivatives. The novel machinery of variational analysis leads us to complete characterizations of the relative well-posedness properties, with further applications to the stability of affine variational inequalities. Most of the obtained results, valid in general infinite-dimensional settings, are also new in finite dimensions. Comment: 25 pages.
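
    As background, metric regularity of a multifunction $F\colon X \rightrightarrows Y$ around $(\bar{x}, \bar{y}) \in \operatorname{gph} F$ is the estimate below; one common way to relativize it to a reference set $\Omega$ is to restrict the test points $x$ to $\Omega$ (the precise relativization used in the paper may differ):

    % There exist \kappa > 0 and neighborhoods U of \bar{x}, V of \bar{y} with
    \[
    d\bigl(x, F^{-1}(y)\bigr) \le \kappa\, d\bigl(y, F(x)\bigr)
    \quad \text{for all } x \in U \cap \Omega,\ y \in V.
    \]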

    An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization

    This paper focuses on the minimization of the sum of a twice continuously differentiable function $f$ and a nonsmooth convex function. We propose an inexact regularized proximal Newton method with an approximation of the Hessian $\nabla^2 f(x)$ involving the $\varrho$-th power of the KKT residual. For $\varrho = 0$, we demonstrate the global convergence of the iterate sequence for the KL objective function and its $R$-linear convergence rate for the KL objective function of exponent $1/2$. For $\varrho \in (0,1)$, we establish the global convergence of the iterate sequence and its superlinear convergence rate of order $q(1+\varrho)$ under the assumption that cluster points satisfy a local Hölderian error bound of order $q \in (\max(\varrho, \frac{1}{1+\varrho}), 1]$ on the strong stationary point set; and when cluster points satisfy a local error bound of order $q > 1+\varrho$ on the common stationary point set, we also obtain the global convergence of the iterate sequence and its superlinear convergence rate of order $\frac{(q-\varrho)^2}{q}$ if $q > \frac{2\varrho + 1 + \sqrt{4\varrho + 1}}{2}$. A dual semismooth Newton augmented Lagrangian method is developed for seeking an inexact minimizer of the subproblem. Numerical comparisons with two state-of-the-art methods on $\ell_1$-regularized Student's $t$-regression, group penalized Student's $t$-regression, and nonconvex image restoration confirm the efficiency of the proposed method.
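
    To make the regularization rule concrete, here is a minimal sketch of one such step for the special case where the nonsmooth term is $\lambda\|\cdot\|_1$, assuming the Hessian is regularized as $H_k = \nabla^2 f(x_k) + \mu_k I$ with $\mu_k$ proportional to the $\varrho$-th power of the KKT residual, and with the subproblem solved inexactly by plain proximal-gradient iterations. The paper's actual inner solver is a dual semismooth Newton augmented Lagrangian method, which is not reproduced here; all function names are hypothetical.

    import numpy as np

    def soft_threshold(v, tau):
        # Proximal map of tau * ||.||_1.
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def kkt_residual(x, grad_fx, lam):
        # ||x - prox_{lam||.||_1}(x - grad f(x))||, zero exactly at stationarity.
        return np.linalg.norm(x - soft_threshold(x - grad_fx, lam))

    def reg_prox_newton_step(x, grad_f, hess_f, lam, varrho=0.5, c=1.0, inner_iters=100):
        # One inexact regularized proximal Newton step for min f(x) + lam*||x||_1:
        # subproblem  min_y  <g, y - x> + 0.5 (y - x)^T H_k (y - x) + lam*||y||_1,
        # with H_k = hess f(x) + mu_k I and mu_k = c * (KKT residual)^varrho.
        g, H = grad_f(x), hess_f(x)
        mu = c * kkt_residual(x, g, lam) ** varrho
        Hk = H + mu * np.eye(x.size)
        L = np.linalg.norm(Hk, 2)            # inner step size 1/L
        y = x.copy()
        for _ in range(inner_iters):         # inexact inner solve (prox-gradient)
            grad_q = g + Hk @ (y - x)
            y = soft_threshold(y - grad_q / L, lam / L)
        return y

    # Demo: lasso-type problem f(x) = 0.5 * ||A x - b||^2 plus lam * ||x||_1.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    grad_f = lambda x: A.T @ (A @ x - b)
    hess_f = lambda x: A.T @ A
    x = np.zeros(5)
    for _ in range(20):
        x = reg_prox_newton_step(x, grad_f, hess_f, lam=0.1)
    print(x, kkt_residual(x, grad_f(x), 0.1))  # residual should be near zero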