Second-order subdifferential calculus with applications to tilt stability in optimization
The paper concerns the second-order generalized differentiation theory of
variational analysis and new applications of this theory to some problems of
constrained optimization in finite-dimensional spaces. The main attention is
paid to the so-called (full and partial) second-order subdifferentials of
extended-real-valued functions, which are dual-type constructions generated by
coderivatives of first-order subdifferential mappings. We develop an extended
second-order subdifferential calculus and analyze the basic second-order
qualification condition ensuring the fulfillment of the principal second-order
chain rule for strongly and fully amenable compositions. The calculus results obtained in this way, together with explicit computations of the second-order subdifferentials of piecewise linear-quadratic functions and their major specifications, are then applied to the study of tilt stability of local minimizers for important classes of constrained optimization problems, including, in particular, problems of nonlinear programming and certain classes of extended nonlinear programs described in composite terms.
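As a loose illustration (not taken from the paper), tilt stability asks whether the local minimizer of the tilted problems f(x) - <v, x> depends Lipschitz-continuously on the tilt parameter v near v = 0. A minimal one-dimensional sketch with the nonsmooth convex function f(x) = x^2 + |x|, whose tilted minimizer happens to have a closed form via soft-thresholding (the function and constants here are illustrative choices, not the paper's examples):

```python
def tilted_minimizer(v):
    """argmin_x of x**2 + |x| - v*x, in closed form.

    The first-order condition 0 in 2x + d|x| - v resolves to
    soft-thresholding v at level 1, then dividing by 2.
    """
    shrunk = max(abs(v) - 1.0, 0.0) * (1.0 if v >= 0 else -1.0)
    return shrunk / 2.0

# The tilt-to-minimizer map is Lipschitz with modulus 1/2,
# the hallmark of a tilt-stable minimizer at x = 0:
tilts = [-2.0, -0.5, 0.0, 0.3, 1.0, 2.5]
for v1 in tilts:
    for v2 in tilts:
        gap = abs(tilted_minimizer(v1) - tilted_minimizer(v2))
        assert gap <= 0.5 * abs(v1 - v2) + 1e-12
```

Note that for small tilts (|v| <= 1) the minimizer does not move at all; the nonsmooth kink at the origin absorbs the perturbation.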
Stability and Error Analysis for Optimization and Generalized Equations
Stability and error analysis remain challenging for problems that lack
regularity properties near solutions, are subject to large perturbations, and
might be infinite dimensional. We consider nonconvex optimization and
generalized equations defined on metric spaces and develop bounds on solution
errors using the truncated Hausdorff distance applied to graphs and epigraphs
of the underlying set-valued mappings and functions. In the process, we extend
the calculus of such distances to cover compositions and other constructions
that arise in nonconvex problems. The results are applied to constrained problems with feasible sets that might have empty interiors, to the solution of KKT systems, and to optimality conditions for difference-of-convex and composite functions.
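The error bounds in this abstract are stated in terms of the truncated Hausdorff distance between graphs and epigraphs. As a hedged sketch of the underlying notion (for finite point sets only, with one common form of the truncation; the paper works with general sets in metric spaces):

```python
import numpy as np

def dist_to_set(p, S):
    # Euclidean distance from point p to the finite set S (rows of S)
    return float(np.min(np.linalg.norm(S - p, axis=1)))

def truncated_hausdorff(A, B, rho):
    # One common form of the rho-truncated Hausdorff distance:
    # each one-sided excess is taken only over points of the set
    # that lie inside the ball of radius rho around the origin.
    exA = [dist_to_set(a, B) for a in A if np.linalg.norm(a) <= rho]
    exB = [dist_to_set(b, A) for b in B if np.linalg.norm(b) <= rho]
    return max(max(exA, default=0.0), max(exB, default=0.0))

A = np.array([[0.0, 0.0], [1.0, 0.0]])
B = np.array([[0.0, 1.0]])
d_full = truncated_hausdorff(A, B, rho=10.0)   # sqrt(2): excess of (1,0) over B
d_trunc = truncated_hausdorff(A, B, rho=0.5)   # 1.0: only (0,0) survives truncation
```

Truncation matters because the ordinary Hausdorff distance between unbounded graphs or epigraphs is typically infinite; restricting to a ball yields finite, localized error estimates.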
Lagrange optimality system for a class of nonsmooth convex optimization
In this paper, we revisit the augmented Lagrangian method for a class of
nonsmooth convex optimization problems. We present the Lagrange optimality system of the
augmented Lagrangian associated with the problems, and establish its
connections with the standard optimality condition and the saddle point
condition of the augmented Lagrangian, which provides a powerful tool for
developing numerical algorithms. We apply a linear Newton method to the
Lagrange optimality system to obtain a novel algorithm applicable to a variety
of nonsmooth convex optimization problems arising in practical applications.
Under suitable conditions, we prove the nonsingularity of the Newton system and
the local convergence of the algorithm.
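As a minimal sketch of the augmented Lagrangian (method of multipliers) idea that this abstract builds on, here is the outer iteration on a toy nonsmooth convex problem, min |x| subject to x = 1. This is an illustrative instance chosen so the x-step has a closed form; it is not the paper's linear Newton scheme:

```python
def soft_threshold(v, t):
    # proximal operator of t * |.|
    if abs(v) <= t:
        return 0.0
    return v - t if v > 0 else v + t

def augmented_lagrangian(rho=1.0, iters=25):
    """Method of multipliers for: minimize |x| subject to x = 1.

    Completing the square in the augmented Lagrangian
    |x| + lam*(x - 1) + (rho/2)*(x - 1)**2 turns the x-step into
    the prox of |.|/rho evaluated at 1 - lam/rho.
    """
    lam = 0.0
    x = 0.0
    for _ in range(iters):
        x = soft_threshold(1.0 - lam / rho, 1.0 / rho)  # primal step
        lam += rho * (x - 1.0)                          # multiplier update
    return x, lam
```

On this instance the iteration reaches the solution x* = 1 with multiplier lam* = -1, which satisfies the stationarity condition 0 in d|x*| + lam* for the (ordinary) Lagrangian.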