    First-order primal-dual methods for nonsmooth nonconvex optimisation

    We provide an overview of primal-dual algorithms for nonsmooth and non-convex-concave saddle-point problems. The presentation is organised around a new analysis of such methods, which uses Bregman divergences to formulate simplified conditions for convergence.
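    As a small illustration of the kind of first-order primal-dual method surveyed above, the following sketch (not code from the paper; all names and step sizes are illustrative assumptions) runs the Chambolle-Pock iteration on the toy saddle-point reformulation of min_x 0.5*||x - z||^2 + lam*||x||_1 with K = I, whose exact solution is soft-thresholding of z.

```python
import numpy as np

def pdhg_lasso_prox(z, lam, tau=0.9, sigma=0.9, iters=2000):
    """Chambolle-Pock primal-dual iteration (illustrative sketch) for
    min_x 0.5*||x - z||^2 + lam*||x||_1, written as a saddle-point
    problem with K = identity. Requires tau*sigma*||K||^2 <= 1."""
    x = np.zeros_like(z)
    x_bar = x.copy()
    y = np.zeros_like(z)
    for _ in range(iters):
        # Dual step: prox of the conjugate of lam*||.||_1 is the
        # projection onto the l-infinity ball of radius lam.
        y = np.clip(y + sigma * x_bar, -lam, lam)
        # Primal step: closed-form prox of 0.5*||x - z||^2 with step tau.
        x_new = (x - tau * y + tau * z) / (1.0 + tau)
        # Extrapolation with theta = 1.
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

# The iterates approach soft-thresholding of z at level lam.
x = pdhg_lasso_prox(np.array([2.0, -0.5, 1.0]), 1.0)
```

Since the primal objective here is strongly convex, the iteration converges to the soft-thresholded vector; the general analysis in the paper covers far weaker (non-convex-concave) settings.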

    Generalized Dynamic Programming Principle and Sparse Mean-Field Control Problems

    In this paper we study optimal control problems in Wasserstein spaces, which are suitable to describe macroscopic dynamics of multi-particle systems. The dynamics is described by a parametrized continuity equation, in which the Eulerian velocity field is affine w.r.t. some variables. Our aim is to minimize a cost functional which includes a control norm, thus enforcing a \emph{control sparsity} constraint. More precisely, we consider a nonlocal restriction on the total amount of control that can be used depending on the overall state of the evolving mass. We treat in detail two main cases: an instantaneous constraint on the control applied to the evolving mass, and a cumulative constraint, which depends also on the amount of control used at previous times. For both constraints, we prove the existence of optimal trajectories for general cost functions and show that the value function is a viscosity solution of a suitable Hamilton-Jacobi-Bellman equation. Finally, we discuss an abstract Dynamic Programming Principle, providing further applications in the Appendix.
    Comment: This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0

    Introduction to Nonsmooth Analysis and Optimization

    This book aims to give an introduction to generalized derivative concepts useful in deriving necessary optimality conditions and numerical algorithms for infinite-dimensional nondifferentiable optimization problems that arise in inverse problems, imaging, and PDE-constrained optimization. It covers convex subdifferentials, Fenchel duality, monotone operators and resolvents, Moreau--Yosida regularization, as well as Clarke and (briefly) limiting subdifferentials. Both first-order (proximal point and splitting) methods and second-order (semismooth Newton) methods are treated. In addition, differentiation of set-valued mappings is discussed and used to derive second-order optimality conditions as well as Lipschitz stability properties of minimizers. The required background from functional analysis and the calculus of variations is also briefly summarized.
    Comment: arXiv admin note: substantial text overlap with arXiv:1708.0418
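    As a small illustration of the Moreau--Yosida regularization mentioned above (a sketch, not code from the book; function names are illustrative), the envelope of f(u) = |u| can be evaluated through its proximal map, which is soft-thresholding, and coincides with the smooth Huber function.

```python
def prox_abs(x, gamma):
    """Proximal map of f(u) = |u| with parameter gamma: soft-thresholding."""
    return max(abs(x) - gamma, 0.0) * (1.0 if x >= 0 else -1.0)

def moreau_envelope_abs(x, gamma):
    """Moreau-Yosida regularization of |.|, evaluated via its prox:
    e_gamma(x) = f(p) + (x - p)^2 / (2*gamma) with p = prox_{gamma f}(x)."""
    p = prox_abs(x, gamma)
    return abs(p) + (x - p) ** 2 / (2.0 * gamma)

def huber(x, gamma):
    """Closed-form Huber function, equal to the Moreau envelope of |.|:
    quadratic near zero, linear (with slope 1) further out."""
    return x * x / (2.0 * gamma) if abs(x) <= gamma else abs(x) - gamma / 2.0
```

The envelope is continuously differentiable even though |.| is not, which is exactly why this regularization is useful for building first-order methods for nonsmooth problems.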