Second-order subdifferential calculus with applications to tilt stability in optimization
The paper concerns the second-order generalized differentiation theory of
variational analysis and new applications of this theory to some problems of
constrained optimization in finite-dimensional spaces. The main attention is
paid to the so-called (full and partial) second-order subdifferentials of
extended-real-valued functions, which are dual-type constructions generated by
coderivatives of first-order subdifferential mappings. We develop an extended
second-order subdifferential calculus and analyze the basic second-order
qualification condition ensuring the fulfillment of the principal second-order
chain rule for strongly and fully amenable compositions. The calculus results
obtained in this way, together with explicit computations of the second-order
subdifferentials of piecewise linear-quadratic functions and their major
specifications, are then applied to the study of tilt stability of local
minimizers for important classes of constrained optimization problems,
including, in particular, nonlinear programming and certain classes of
extended nonlinear programs described in composite terms.
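For reference, the central construction can be stated in one line; the display below is the standard coderivative definition (the notation \(D^{*}\) for the limiting coderivative is assumed here, not quoted from the abstract). For \(\bar{v} \in \partial f(\bar{x})\),
\[
\partial^{2} f(\bar{x},\bar{v})(u) \;=\; \bigl(D^{*}\partial f\bigr)(\bar{x},\bar{v})(u),
\qquad u \in \mathbb{R}^{n},
\]
so second-order information about \(f\) is encoded by applying a coderivative to the set-valued first-order subdifferential map \(x \mapsto \partial f(x)\).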
Tilt stability, uniform quadratic growth, and strong metric regularity of the subdifferential
We prove that uniform second order growth, tilt stability, and strong metric
regularity of the limiting subdifferential --- three notions that have appeared
in entirely different settings --- are all essentially equivalent for any
lower-semicontinuous, extended-real-valued function.
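For concreteness, tilt stability (in the sense of Poliquin and Rockafellar) can be recalled as follows; this is the standard formulation, assumed here rather than quoted from the paper. A local minimizer \(\bar{x}\) of \(f\) is tilt-stable if, for some \(\gamma > 0\), the localized argmin mapping of the tilted function,
\[
M_{\gamma}(v) \;=\; \operatorname*{argmin}_{\|x-\bar{x}\|\le\gamma} \bigl\{ f(x) - \langle v, x\rangle \bigr\},
\]
is single-valued and Lipschitz continuous on a neighborhood of \(v = 0\) with \(M_{\gamma}(0) = \bar{x}\). The paper's result is that this property, uniform quadratic growth, and strong metric regularity of \(\partial f\) are essentially equivalent.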
On partial smoothness, tilt stability and the VU-decomposition
Under the assumptions of prox-regularity and the existence of a tilt-stable local minimum, we show that a VU-like decomposition gives rise to a smooth manifold on which the function in question coincides locally with a smooth function.
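For readers unfamiliar with the notation, the VU-decomposition at a point \(\bar{x}\) is commonly built from the subdifferential; a standard formulation (assumed here, not taken from the paper) reads: picking any \(g\) in the relative interior of \(\partial f(\bar{x})\),
\[
\mathcal{V} \;=\; \operatorname{span}\bigl(\partial f(\bar{x}) - g\bigr),
\qquad
\mathcal{U} \;=\; \mathcal{V}^{\perp},
\]
so that, informally, \(f\) varies smoothly along \(\mathcal{U}\) and nonsmoothly along \(\mathcal{V}\), and the smooth manifold in the result above is tangent to \(\mathcal{U}\) at \(\bar{x}\).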
Model Consistency of Partly Smooth Regularizers
This paper studies least-squares regression penalized with partly smooth
convex regularizers. This class of functions is large and versatile, allowing
one to promote solutions conforming to some notion of low complexity.
Indeed, they force solutions of variational problems to belong to a
low-dimensional manifold (the so-called model) which is stable under small
perturbations of the function. This property is crucial to make the underlying
low-complexity model robust to small noise. We show that a generalized
"irrepresentable condition" implies stable model selection under small noise
perturbations in the observations and the design matrix, when the
regularization parameter is tuned proportionally to the noise level. This
condition is shown to be almost a necessary condition. We then show that this
condition implies model consistency of the regularized estimator. That is, with
a probability tending to one as the number of measurements increases, the
regularized estimator belongs to the correct low-dimensional model manifold.
This work unifies and generalizes several previous ones, where model
consistency is known to hold for sparse, group sparse, total variation and
low-rank regularizations.
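The template studied in the paper can be summarized in one display; the symbols below (\(\Phi\), \(J\), \(\lambda\), \(w\)) are generic names chosen for illustration:
\[
\hat{x} \;\in\; \operatorname*{argmin}_{x}\; \tfrac{1}{2}\,\| y - \Phi x \|_{2}^{2} \;+\; \lambda\, J(x),
\qquad y = \Phi x_{0} + w,
\]
with \(J\) a partly smooth convex regularizer (e.g. \(\ell_{1}\), group \(\ell_{1}\), total variation, or nuclear norm) and \(\lambda\) tuned proportionally to the noise level \(\|w\|\). Model consistency then means that \(\hat{x}\) lies on the same low-dimensional model manifold as \(x_{0}\), with probability tending to one.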
A Multi-step Inertial Forward-Backward Splitting Method for Non-convex Optimization
In this paper, we propose a multi-step inertial Forward-Backward splitting
algorithm for minimizing the sum of two functions, neither necessarily convex:
one is proper and lower semicontinuous, while the other is differentiable with
a Lipschitz continuous gradient. We first prove global convergence of the
scheme with the help of the Kurdyka-Łojasiewicz property. Then, when the
non-smooth part is also partly smooth relative to a smooth submanifold, we
establish finite identification of the latter and provide sharp local linear
convergence analysis. The proposed method is illustrated on a few problems
arising from statistics and machine learning.
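As a rough illustration of the scheme (not the paper's exact algorithm: the paper allows separate inertial coefficients for the gradient and proximal points, and its step-size rules are tied to the Lipschitz constant), here is a minimal Python sketch of a multi-step inertial forward-backward iteration applied to the lasso; the function names and coefficient values are assumptions made for illustration:

import numpy as np

def soft_threshold(z, t):
    # proximal operator of t * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def multi_step_inertial_fb(grad_f, prox_g, x0, step, a, n_iter=500):
    # a: inertial coefficients a_1..a_s weighting the last s iterate differences
    s = len(a)
    hist = [x0.copy()] * (s + 1)   # hist[i] holds x_{k-i}
    x = x0.copy()
    for _ in range(n_iter):
        # multi-step inertial extrapolation from past differences
        y = x + sum(a[i] * (hist[i] - hist[i + 1]) for i in range(s))
        # forward (gradient) step followed by backward (proximal) step
        x = prox_g(y - step * grad_f(y), step)
        hist = [x] + hist[:-1]
    return x

# Example: F(x) = 0.5*||Ax - b||^2 (smooth), G(x) = lam*||x||_1 (non-smooth)
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of grad F
x_hat = multi_step_inertial_fb(
    grad_f=lambda x: A.T @ (A @ x - b),
    prox_g=lambda z, t: soft_threshold(z, lam * t),
    x0=np.zeros(100), step=1.0 / L, a=[0.3, 0.1])

With a = [] this reduces to plain forward-backward splitting, and with a single coefficient to a heavy-ball-like inertial variant. Finite identification in the paper's sense would correspond here to the support of the iterates stabilizing after finitely many steps.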