Second-order subdifferential calculus with applications to tilt stability in optimization
The paper concerns the second-order generalized differentiation theory of
variational analysis and new applications of this theory to some problems of
constrained optimization in finite-dimensional spaces. The main attention is
paid to the so-called (full and partial) second-order subdifferentials of
extended-real-valued functions, which are dual-type constructions generated by
coderivatives of first-order subdifferential mappings. We develop an extended
second-order subdifferential calculus and analyze the basic second-order
qualification condition ensuring the fulfillment of the principal second-order
chain rule for strongly and fully amenable compositions. The calculus results
obtained in this way, together with computations of the second-order
subdifferentials of piecewise linear-quadratic functions and their major
specifications, are then applied to the study of tilt stability of local
minimizers for important classes of problems in constrained optimization that
include, in particular, problems of nonlinear programming and certain classes
of extended nonlinear programs described in composite terms.
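The tilt stability studied in this paper can be illustrated numerically in the smooth case, where it is known to be equivalent to positive definiteness of the Hessian at the minimizer. The sketch below is a standard illustration on a quadratic, not code from the paper; all names in it are ours:

```python
import numpy as np

# Hedged illustration (standard fact, not the paper's construction): for a
# C^2 function, tilt stability of a local minimizer is equivalent to positive
# definiteness of the Hessian.  For f(x) = 0.5 x^T A x, the tilted problem
#   min_x f(x) - <v, x>
# has the unique solution x(v) = A^{-1} v, which depends Lipschitz
# continuously on the tilt parameter v whenever A is positive definite.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])   # positive definite -> tilt-stable minimizer
assert np.all(np.linalg.eigvalsh(A) > 0)

def tilted_minimizer(v):
    """Argmin of 0.5 x^T A x - v^T x."""
    return np.linalg.solve(A, v)

v1, v2 = np.array([0.1, 0.0]), np.array([0.0, 0.1])
lip = (np.linalg.norm(tilted_minimizer(v1) - tilted_minimizer(v2))
       / np.linalg.norm(v1 - v2))
# The Lipschitz ratio of the solution map is bounded by 1 / lambda_min(A).
print(lip, 1.0 / np.linalg.eigvalsh(A).min())
```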
A Generalized Newton Method for Subgradient Systems
This paper proposes and develops a new Newton-type algorithm to solve
subdifferential inclusions defined by subgradients of extended-real-valued
prox-regular functions. The proposed algorithm is formulated in terms of the
second-order subdifferential of such functions that enjoys extensive calculus
rules and can be efficiently computed for broad classes of extended-real-valued
functions. Based on this and on metric regularity and subregularity properties
of subgradient mappings, we establish verifiable conditions ensuring
well-posedness of the proposed algorithm and its local superlinear convergence.
The obtained results are also new for the class of equations defined by
continuously differentiable functions with Lipschitzian derivatives
(C^{1,1} functions), which is the underlying case of our consideration. The
developed algorithm for prox-regular functions is formulated in terms of
proximal mappings and Moreau envelopes.
Besides numerous illustrative examples and comparisons with known algorithms
for C^{1,1} functions and generalized equations, the paper presents
applications of the proposed algorithm to the practically important class of
Lasso problems arising in statistics and machine learning.
Globally Convergent Coderivative-Based Generalized Newton Methods in Nonsmooth Optimization
This paper proposes and justifies two globally convergent Newton-type methods
to solve unconstrained and constrained problems of nonsmooth optimization by
using tools of variational analysis and generalized differentiation. Both
methods are coderivative-based and employ generalized Hessians (coderivatives
of subgradient mappings) associated with objective functions, which are either
of class C^{1,1}, or are represented in the form of convex
composite optimization, where one of the terms may be extended-real-valued. The
proposed globally convergent algorithms are of two types. The first one extends
the damped Newton method and requires positive-definiteness of the generalized
Hessians for its well-posedness and efficient performance, while the other
algorithm is of the regularized Newton type, being well-defined when the
generalized Hessians are merely positive-semidefinite. The obtained convergence
rates for both methods are at least linear, but become superlinear under the
semismooth property of subgradient mappings. Problems of convex composite
optimization are investigated with and without the strong convexity assumption
on smooth parts of objective functions by implementing the machinery of
forward-backward envelopes. Numerical experiments are conducted for Lasso
problems and for box-constrained quadratic programs, providing performance
comparisons of the new algorithms with some other first-order and second-order
methods that are highly recognized in nonsmooth optimization.
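In the smooth case, the damped Newton scheme this abstract describes reduces to the classical Newton step with a backtracking line search. A minimal sketch, assuming a positive definite Hessian, with illustrative names and line-search constants of our own:

```python
import numpy as np

# Minimal damped-Newton sketch (smooth case; names and constants are ours,
# not the paper's): solve H d = -g, then backtrack until an Armijo-type
# decrease holds.  Positive definiteness of H (the "generalized Hessian"
# in the smooth case) guarantees d is a descent direction.
def damped_newton(f, grad, hess, x, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)      # assumes positive definite H
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):  # Armijo backtracking
            t *= 0.5
        x = x + t * d
    return x

# Usage on a strongly convex quadratic: minimizer of 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
sol = damped_newton(lambda x: 0.5 * x @ A @ x - b @ x,
                    lambda x: A @ x - b,
                    lambda x: A,
                    np.zeros(2))
print(np.allclose(sol, np.linalg.solve(A, b)))   # True
```

On a quadratic the full step t = 1 is always accepted, so the method converges in one iteration; the damping only becomes active for genuinely nonlinear objectives.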
Local strong maximal monotonicity and full stability for parametric variational systems
The paper introduces and characterizes new notions of Lipschitzian and
Hölderian full stability of solutions to general parametric variational
systems described via partial subdifferential and normal cone mappings acting
in Hilbert spaces. These notions, which postulate certain quantitative
properties of single-valued localizations of solution maps, are closely related
to local
strong maximal monotonicity of associated set-valued mappings. Based on
advanced tools of variational analysis and generalized differentiation, we
derive verifiable characterizations of the local strong maximal monotonicity
and full stability notions under consideration via some positive-definiteness
conditions involving second-order constructions of variational analysis. The
general results obtained are specified for important classes of variational
inequalities and variational conditions in both finite and infinite dimensions.
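For differentiable mappings in finite dimensions, the strong monotonicity that this paper characterizes can be checked through the symmetric part of the Jacobian. The sketch below illustrates this standard criterion (not the paper's Hilbert-space characterization); the function name is ours:

```python
import numpy as np

# Standard finite-dimensional criterion (not the paper's result): a
# differentiable mapping F is strongly monotone with modulus kappa on a
# convex set when the symmetric part of its Jacobian satisfies
#   (J + J^T) / 2 >= kappa * I   there,
# since then  <F(x) - F(y), x - y> >= kappa * ||x - y||^2.
def strong_monotonicity_modulus(jac):
    sym = (jac + jac.T) / 2
    return np.linalg.eigvalsh(sym).min()

J = np.array([[2.0, -1.0],
              [1.0,  1.5]])       # Jacobian of the affine mapping F(x) = J x
kappa = strong_monotonicity_modulus(J)
print(kappa)                      # positive -> F is strongly monotone
```

Note that the skew part of J (here the off-diagonal ±1 entries) does not affect the modulus, which is why monotonicity is a strictly weaker requirement than symmetry plus positive definiteness.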