Newton's Method for Solving Inclusions Using Set-Valued Approximations
Results on the stability of both local and global metric regularity under set-valued perturbations are presented. As an application, we study (super)linear convergence of a Newton-type iterative process for solving generalized equations. We investigate several iterative schemes, such as the inexact Newton method, the nonsmooth Newton method for semismooth functions, and the inexact proximal point algorithm. Moreover, we also cover a forward-backward splitting algorithm for finding a zero of the sum of two multivalued (not necessarily monotone) operators. Finally, a globalization of Newton's method is discussed.
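The forward-backward splitting mentioned above can be illustrated on the monotone special case 0 ∈ ∇g(x) + ∂h(x): a forward (gradient) step on the smooth part g followed by a backward (proximal) step on the nonsmooth part h. The sketch below is only illustrative of the splitting idea, not the paper's algorithm; the toy LASSO problem and all names are chosen for the example.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (the backward step)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def forward_backward(grad_g, prox_h, x0, step, n_iter=500):
    """Forward-backward splitting: x <- prox_h(x - step*grad_g(x), step),
    seeking a zero of the sum grad_g + dh."""
    x = x0
    for _ in range(n_iter):
        x = prox_h(x - step * grad_g(x), step)
    return x

# Toy LASSO: minimize 0.5*||A x - b||^2 + lam*||x||_1
A = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, 1.0])
lam = 0.1
grad_g = lambda x: A.T @ (A @ x - b)
prox_h = lambda v, t: soft_threshold(v, lam * t)
x = forward_backward(grad_g, prox_h, np.zeros(2), step=0.2)
```

With the step size below 1/L (here L = 4, the largest eigenvalue of AᵀA), the iteration converges to the unique minimizer, which is (0.9, 0.475) for this separable problem.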
Inexact Newton Methods for Solving Generalized Equations on Riemannian Manifolds
The local convergence of an inexact Newton method is studied for solving generalized equations on Riemannian manifolds, using the metric regularity property, which is explored as well. Under suitable conditions and without any additional geometric assumptions, local convergence results with linear and quadratic rates and a semi-local convergence result are obtained for the proposed method. Finally, the theory is applied to the problem of finding a singularity of the sum of two vector fields.
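The "inexact" idea behind this method is easiest to see in the flat (Euclidean) case: the Newton linear system J(x)s = -F(x) is solved only up to a relative residual η·||F(x)|| (the forcing-term condition), which yields linear convergence for constant η < 1 and faster rates as η → 0. The sketch below is a Euclidean illustration under these assumptions, not the manifold algorithm of the paper; the residual is fabricated explicitly so that the forcing condition holds with equality.

```python
import numpy as np

def inexact_newton(F, J, x0, eta=0.5, n_iter=60):
    """Inexact Newton: solve J(x) s = -F(x) only up to a linear residual
    of norm eta * ||F(x)|| (the classical forcing-term condition)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < 1e-14:
            break
        # Model the inexact solve: pick a residual r with ||r|| = eta*||F(x)||,
        # then s satisfies F(x) + J(x) s = r.
        r = eta * np.linalg.norm(Fx) * np.ones_like(x) / np.sqrt(x.size)
        s = np.linalg.solve(J(x), r - Fx)
        x = x + s
    return x

# Intersect the unit circle with the line x0 = x1.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2*x[0], 2*x[1]], [1.0, -1.0]])
x = inexact_newton(F, J, np.array([2.0, 0.5]))
```

With η = 0.5 the residual norm contracts roughly by half per step (linear rate); setting η = 0 recovers the exact Newton method and its quadratic rate.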
Morceaux Choisis en Optimisation Continue et sur les Systèmes non Lisses (Selected Topics in Continuous Optimization and Nonsmooth Systems)
This course starts with the presentation of the optimality conditions of an optimization problem described in a rather abstract manner, so that these can be useful for dealing with a large variety of problems. Next, the course describes and analyzes various advanced algorithms to solve optimization problems (nonsmooth methods, linearization methods, proximal and augmented Lagrangian methods, interior point methods) and shows how they can be used to solve a few classical optimization problems (linear optimization, convex quadratic optimization, semidefinite optimization (SDO), nonlinear optimization). Along the way, various tools from convex and nonsmooth analysis are presented. Everything is conceptualized in finite dimension. The goal of the lectures is therefore to consolidate basic knowledge in optimization, on both theoretical and algorithmic aspects.
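One of the algorithm families the course covers, the augmented Lagrangian method, can be sketched on an equality-constrained quadratic program, where each inner minimization is an exact linear solve. This is a minimal illustration under assumed problem data, not material from the course itself.

```python
import numpy as np

def augmented_lagrangian_qp(a, b, rho=10.0, n_outer=30):
    """Augmented Lagrangian method for  min 0.5*||x||^2  s.t.  a.x = b.
    The inner problem is quadratic, so it is minimized exactly by a linear
    solve; the multiplier mu is then updated with the constraint violation
    (first-order multiplier update)."""
    n = a.size
    mu = 0.0
    x = np.zeros(n)
    for _ in range(n_outer):
        # Minimize 0.5*||x||^2 + mu*(a.x - b) + (rho/2)*(a.x - b)^2 over x.
        M = np.eye(n) + rho * np.outer(a, a)
        x = np.linalg.solve(M, (rho * b - mu) * a)
        mu = mu + rho * (a @ x - b)  # first-order multiplier update
    return x, mu

x, mu = augmented_lagrangian_qp(np.array([1.0, 1.0]), 1.0)
```

For this instance the exact solution is x = (0.5, 0.5) with multiplier μ = -0.5, and the multiplier iteration contracts linearly with factor 1/(1 + ρ·||a||²).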
Inexact Josephy–Newton framework for generalized equations and its applications to local analysis of Newtonian methods for constrained optimization
Keywords: Newton method, Josephy–Newton method, generalized equation, variational problem, linearly constrained Lagrangian method, (stabilized) sequential quadratic programming
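The Josephy–Newton idea named in the keywords is to solve, at each step, the *linearized* generalized equation 0 ∈ f(xₖ) + f'(xₖ)(x - xₖ) + N_C(x) rather than a linear system. For a scalar problem over a box C = [lo, hi] with f'(xₖ) > 0, that subproblem is solved exactly by projecting the Newton point onto C. The following is a minimal one-dimensional sketch under those assumptions, not the framework of the paper; the example function and bounds are invented for illustration.

```python
import numpy as np

def josephy_newton_1d(f, df, lo, hi, x0, n_iter=30):
    """Josephy-Newton for the generalized equation 0 in f(x) + N_C(x),
    with C = [lo, hi]. Each iteration solves the linearized generalized
    equation 0 in f(xk) + f'(xk)(x - xk) + N_C(x); for a scalar problem
    with f'(xk) > 0 its solution is the projected Newton step below."""
    x = x0
    for _ in range(n_iter):
        x = min(max(x - f(x) / df(x), lo), hi)
    return x

# Example: f(x) = exp(x) - 2 on C = [0, 0.5]. The unconstrained zero
# ln(2) ~ 0.693 lies outside C, so the generalized equation is solved at
# the boundary x = 0.5, where -f(0.5) > 0 lies in the normal cone N_C(0.5).
x = josephy_newton_1d(lambda t: np.exp(t) - 2.0, lambda t: np.exp(t),
                      0.0, 0.5, x0=0.0)
```

The same projected structure is what sequential quadratic programming exploits: each SQP subproblem is the Josephy–Newton linearization of the KKT generalized equation.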