
    Introduction to Nonsmooth Analysis and Optimization

    This book aims to give an introduction to generalized derivative concepts useful in deriving necessary optimality conditions and numerical algorithms for infinite-dimensional nondifferentiable optimization problems that arise in inverse problems, imaging, and PDE-constrained optimization. It covers convex subdifferentials, Fenchel duality, monotone operators and resolvents, and Moreau--Yosida regularization, as well as Clarke and (briefly) limiting subdifferentials. Both first-order (proximal point and splitting) methods and second-order (semismooth Newton) methods are treated. In addition, differentiation of set-valued mappings is discussed and used to derive second-order optimality conditions for minimizers as well as their Lipschitz stability properties. The required background from functional analysis and the calculus of variations is also briefly summarized. Comment: arXiv admin note: substantial text overlap with arXiv:1708.0418
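    As a small illustration of the first-order machinery surveyed here, the sketch below implements the proximal map of the scaled 1-norm (soft-thresholding) and a forward-backward splitting loop for a lasso-type least-squares problem. This is a minimal sketch, not the book's code: the problem data, function names, and step-size choice are all illustrative assumptions.

```python
import numpy as np

def prox_l1(v, lam):
    # Proximal map of lam * ||.||_1, i.e. soft-thresholding:
    # argmin_x  lam*||x||_1 + 0.5*||x - v||^2.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    # Forward-backward splitting for 0.5*||Ax - b||^2 + lam*||x||_1;
    # converges for step <= 1 / ||A||_2^2 (Lipschitz constant of the
    # smooth part's gradient).
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                   # forward (gradient) step
        x = prox_l1(x - step * grad, step * lam)   # backward (prox) step
    return x

# Illustrative usage on random data with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, 0.0, -2.0, 0.0]) + 0.01 * rng.standard_normal(20)
x = proximal_gradient(A, b, lam=0.1, step=1.0 / np.linalg.norm(A, 2)**2)
print(x)  # approximately sparse: large entries at positions 0 and 3
```

    The same construction underlies the Moreau--Yosida regularization mentioned in the abstract: the gradient of the Moreau envelope of a convex function at v is (v - prox(v)) / gamma, so computing the prox is exactly what makes the regularized problem amenable to gradient methods.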

    Deflation for semismooth equations

    Variational inequalities can in general support distinct solutions. In this paper we study an algorithm for computing distinct solutions of a variational inequality, without varying the initial guess supplied to the solver. The central idea is the combination of a semismooth Newton method with a deflation operator that eliminates known solutions from consideration. Given one root of a semismooth residual, deflation constructs a new problem for which a semismooth Newton method will not converge to the known root, even from the same initial guess. This enables the discovery of other roots. We prove the effectiveness of the deflation technique under the same assumptions that guarantee locally superlinear convergence of a semismooth Newton method. We demonstrate its utility on various finite- and infinite-dimensional examples drawn from constrained optimization, game theory, economics, and solid mechanics. Comment: 24 pages, 3 figures
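    As a rough sketch of the idea on a toy problem (not the paper's implementation), the code below runs a scalar semismooth Newton iteration on a piecewise-smooth residual with two roots, deflates the first root with a shifted operator of the form M(x) = 1/|x - r|^p + sigma, and then recovers the second root from the same initial guess. The residual, parameter values, and helper names are all made up for illustration.

```python
import math

def F(x):
    # Toy semismooth residual: pointwise min of two smooth branches.
    # Its roots are x = -1 and x = 1.
    return min(x + 1.0, x**2 - 1.0)

def dF(x):
    # An element of the B-subdifferential: slope of the active branch.
    return 1.0 if x + 1.0 <= x**2 - 1.0 else 2.0 * x

def deflate(F, dF, roots, p=2.0, shift=1.0):
    # Wrap (F, dF) with the deflation operator
    # M(x) = prod_r (1/|x - r|^p + shift), which blows up near known
    # roots and so repels the Newton iteration from them.
    def G(x):
        m = 1.0
        for r in roots:
            m *= 1.0 / abs(x - r)**p + shift
        return m * F(x)
    def dG(x):
        m, dm = 1.0, 0.0
        for r in roots:
            mi = 1.0 / abs(x - r)**p + shift
            dmi = -p * math.copysign(1.0, x - r) / abs(x - r)**(p + 1.0)
            dm = dm * mi + m * dmi   # product rule across factors
            m *= mi
        return dm * F(x) + m * dF(x)
    return G, dG

def newton(G, dG, x, tol=1e-10, maxit=100):
    for _ in range(maxit):
        if abs(G(x)) < tol:
            return x
        x -= G(x) / dG(x)
    raise RuntimeError("no convergence")

x0 = 5.0
r1 = newton(F, dF, x0)        # first root found from x0 (here -1)
G, dG = deflate(F, dF, [r1])  # eliminate r1 from consideration
r2 = newton(G, dG, x0)        # distinct root (here 1), same initial guess
print(r1, r2)
```

    The shift term keeps the deflated residual from decaying to zero far away from the known root, so the deflated problem still has well-scaled values at infinity; this is the standard motivation for shifted deflation operators.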

    A Semismooth Newton Method for Tensor Eigenvalue Complementarity Problem

    In this paper, we consider the tensor eigenvalue complementarity problem, which is closely related to the optimality conditions for polynomial optimization as well as to a class of differential inclusions with nonconvex processes. By introducing an NCP-function, we reformulate the tensor eigenvalue complementarity problem as a system of nonlinear equations. We show that this function is strongly semismooth but not differentiable, in which case classical smoothing methods cannot be applied. Furthermore, we propose a damped semismooth Newton method for the tensor eigenvalue complementarity problem. A new procedure to evaluate an element of the generalized Jacobian is given, which turns out to be an element of the B-subdifferential under mild assumptions. As a result, the convergence of the damped semismooth Newton method is guaranteed by existing results. Numerical experiments also show that our method is efficient and promising.
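    To illustrate the general recipe on a simpler problem, the sketch below reformulates a plain linear complementarity problem (rather than the tensor eigenvalue problem) with the Fischer-Burmeister NCP-function and applies a semismooth Newton iteration using one element of the generalized Jacobian. The abstract does not say which NCP-function the paper uses; Fischer-Burmeister is one standard choice, and all names and data here are illustrative assumptions.

```python
import numpy as np

def fb(a, b):
    # Fischer-Burmeister NCP-function: fb(a, b) = 0 iff
    # a >= 0, b >= 0 and a*b = 0.
    return np.sqrt(a**2 + b**2) - a - b

def semismooth_newton_lcp(M, q, x, tol=1e-10, maxit=50):
    # Solve the LCP  x >= 0, Mx + q >= 0, x.(Mx + q) = 0  via the
    # reformulation Phi(x)_i = fb(x_i, (Mx + q)_i) = 0 and a semismooth
    # Newton iteration on an element of the generalized Jacobian.
    for _ in range(maxit):
        y = M @ x + q
        Phi = fb(x, y)
        if np.linalg.norm(Phi) < tol:
            return x
        r = np.sqrt(x**2 + y**2)
        smooth = r > 1e-14             # components away from the kink (0, 0)
        rs = np.where(smooth, r, 1.0)  # guard against division by zero
        # At kinks, select the generalized-Jacobian element generated by
        # the direction (1, 1); any valid element may be chosen.
        Da = np.where(smooth, x / rs - 1.0, 1.0 / np.sqrt(2.0) - 1.0)
        Db = np.where(smooth, y / rs - 1.0, 1.0 / np.sqrt(2.0) - 1.0)
        J = np.diag(Da) + Db[:, None] * M  # diag(Da) + diag(Db) @ M
        x = x - np.linalg.solve(J, Phi)
    raise RuntimeError("no convergence")

# Tiny example with a P-matrix; the unique solution is x = (0.5, 0).
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, 1.0])
x = semismooth_newton_lcp(M, q, np.zeros(2))
print(x, M @ x + q)  # x >= 0, Mx + q >= 0, componentwise complementary
```

    The element selected at the kink is exactly the kind of choice the abstract's "procedure to evaluate an element of the generalized Jacobian" must make; away from kinks the FB function is smooth and the Jacobian formula above is its ordinary derivative.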