195 research outputs found

    Second-order subdifferential calculus with applications to tilt stability in optimization

    The paper concerns the second-order generalized differentiation theory of variational analysis and new applications of this theory to some problems of constrained optimization in finite-dimensional spaces. The main attention is paid to the so-called (full and partial) second-order subdifferentials of extended-real-valued functions, which are dual-type constructions generated by coderivatives of first-order subdifferential mappings. We develop an extended second-order subdifferential calculus and analyze the basic second-order qualification condition ensuring the fulfillment of the principal second-order chain rule for strongly and fully amenable compositions. The calculus results obtained in this way, together with the computation of second-order subdifferentials for piecewise linear-quadratic functions and their major specifications, are then applied to the study of tilt stability of local minimizers for important classes of problems in constrained optimization, including, in particular, problems of nonlinear programming and certain classes of extended nonlinear programs described in composite terms.
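    For orientation, the central constructions named above admit a standard finite-dimensional formulation; the following is a sketch of the usual definitions, not the paper's own statements.

    \[
      \partial^{2}\varphi(\bar{x},\bar{v})(u) \;:=\; \bigl(D^{*}\partial\varphi\bigr)(\bar{x},\bar{v})(u),
      \qquad u\in\mathbb{R}^{n},\ \bar{v}\in\partial\varphi(\bar{x}),
    \]
    i.e., the second-order subdifferential is the coderivative of the first-order subdifferential mapping. A point \(\bar{x}\) is a tilt-stable local minimizer of \(\varphi\) if for some \(\gamma>0\) the mapping
    \[
      M_{\gamma}(v) \;:=\; \operatorname*{arg\,min}_{\|x-\bar{x}\|\le\gamma}\bigl\{\varphi(x)-\langle v,x\rangle\bigr\}
    \]
    is single-valued and Lipschitz continuous around \(v=0\) with \(M_{\gamma}(0)=\{\bar{x}\}\).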

    Variational Geometric Approach To Generalized Differential And Conjugate Calculi In Convex Analysis

    This paper develops a geometric approach to variational analysis for convex objects considered in locally convex topological spaces and also in Banach space settings. Besides deriving new results of convex calculus in this way, we present an overview of some known achievements with unified and simplified proofs based on the developed geometric variational schemes. Key words: convex and variational analysis, Fenchel conjugates, normals and subgradients, coderivatives, convex calculus, optimal value functions.
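    As a reference point, the basic convex objects listed in the key words can be written as follows; this is a sketch of the standard definitions rather than the paper's formulation.

    \[
      f^{*}(x^{*}) \;=\; \sup_{x\in X}\bigl\{\langle x^{*},x\rangle - f(x)\bigr\},
      \qquad
      N(\bar{x};\Omega) \;=\; \bigl\{x^{*} : \langle x^{*},x-\bar{x}\rangle \le 0 \ \text{for all } x\in\Omega\bigr\},
    \]
    \[
      \partial f(\bar{x}) \;=\; \bigl\{x^{*} : \langle x^{*},x-\bar{x}\rangle \le f(x)-f(\bar{x}) \ \text{for all } x\in X\bigr\},
    \]
    so that \(x^{*}\in\partial f(\bar{x})\) exactly when \((x^{*},-1)\in N\bigl((\bar{x},f(\bar{x}));\operatorname{epi} f\bigr)\); the geometric approach derives the calculus rules for conjugates, normals, and subgradients from intersection rules for normal cones.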

    Why second-order sufficient conditions are, in a way, easy -- or -- revisiting calculus for second subderivatives

    In this paper, we readdress the classical topic of second-order sufficient optimality conditions for optimization problems with nonsmooth structure. Based on the so-called second subderivative of the objective function and of the indicator function associated with the feasible set, one easily obtains second-order sufficient optimality conditions of abstract form. In order to exploit further structure of the problem, e.g., composite terms in the objective function or feasible sets given as (images of) pre-images of closed sets under smooth transformations, and thus to make these conditions fully explicit, we study calculus rules for the second subderivative under mild conditions. To be precise, we investigate a chain rule and a marginal function rule, which then also yield a pre-image rule and an image rule, respectively. As it turns out, the chain rule and the pre-image rule yield lower estimates, which are desirable in order to obtain sufficient optimality conditions for free. Similar estimates for the marginal function and the image rule are valid under a comparatively mild inner calmness* assumption. Our findings are illustrated by several examples, including problems from composite, disjunctive, and nonlinear second-order cone programming.
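    To make the abstract second-order condition concrete, recall the standard definition of the second subderivative and the resulting sufficient condition; this is a sketch in the usual Rockafellar-Wets notation, not a quotation from the paper.

    \[
      \mathrm{d}^{2}f(\bar{x}\,|\,\bar{v})(w)
      \;=\;
      \liminf_{t\downarrow 0,\ w'\to w}
      \frac{f(\bar{x}+tw')-f(\bar{x})-t\langle\bar{v},w'\rangle}{\tfrac{1}{2}t^{2}} .
    \]
    If \(0\in\partial f(\bar{x})\) and \(\mathrm{d}^{2}f(\bar{x}\,|\,0)(w)>0\) for all \(w\neq 0\), then \(\bar{x}\) is a strict local minimizer of \(f\) of second order; lower estimates for \(\mathrm{d}^{2}f\) of the kind provided by the chain and pre-image rules therefore transfer such sufficient conditions to structured problems.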

    Variational Analysis Of Composite Optimization

    The dissertation is devoted to the study of first- and second-order variational analysis of composite functions with applications to composite optimization. By considering a fairly general composite optimization problem, our analysis covers numerous classes of optimization problems, such as constrained optimization; in particular, nonlinear programming, second-order cone programming, and semidefinite programming (SDP). Besides constrained optimization problems, our framework covers many important composite optimization problems such as extended nonlinear programming and eigenvalue optimization problems. In the first-order analysis we develop exact first-order calculus via both the subderivative and the subdifferential. For the second-order part we develop calculus rules via the second-order subderivative (which was a long-standing open problem). Furthermore, we establish twice epi-differentiability of composite functions. We then apply our results to the composite optimization problem to obtain first- and second-order optimality conditions under the weakest constraint qualification, the metric subregularity constraint qualification. Finally, we apply our results to verify the superlinear convergence of SQP methods for constrained optimization.
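    A minimal sketch of the composite model and the first-order chain rule discussed above, stated under illustrative assumptions (smooth data, proper lower semicontinuous convex outer function); the precise hypotheses and qualification conditions are those of the dissertation.

    \[
      \min_{x\in\mathbb{R}^{n}} \ \varphi(x) \;=\; f(x) + g\bigl(F(x)\bigr),
      \qquad f,\,F \ \text{smooth},\quad g \ \text{proper, lsc, convex},
    \]
    \[
      \mathrm{d}\varphi(\bar{x})(w) \;=\; \nabla f(\bar{x})^{\top}w + \mathrm{d}g\bigl(F(\bar{x})\bigr)\bigl(\nabla F(\bar{x})w\bigr),
    \]
    where the exact chain rule for the subderivative (and its subdifferential and second-order counterparts) is obtained under a metric subregularity constraint qualification, roughly requiring metric subregularity of the constraint mapping \(x \mapsto F(x) - \operatorname{dom} g\) at \(\bar{x}\) for the origin.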