1,198 research outputs found

    Sequential Convex Programming Methods for Solving Nonlinear Optimization Problems with DC constraints

    This paper investigates the relation between sequential convex programming (SCP) as, e.g., defined in [24], and DC (difference of two convex functions) programming. We first present an SCP algorithm for solving nonlinear optimization problems with DC constraints and prove its convergence. Then we combine the proposed algorithm with a relaxation technique to handle inconsistent linearizations. Numerical tests are performed to investigate the behaviour of the class of algorithms. Comment: 18 pages, 1 figure.
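    The basic SCP step for a DC constraint is easy to illustrate: in a constraint g(x) - h(x) <= 0 with g and h convex, the convex function h is replaced by its affine minorant at the current iterate, which yields a convex (and conservative) subproblem. The sketch below, in Python with cvxpy, shows one such scheme on a toy obstacle-avoidance problem; the problem data, the penalty weight `mu`, and the slack-based relaxation are illustrative assumptions, not the specific algorithm or relaxation technique analyzed in the paper.

```python
# Minimal SCP sketch for a DC constraint g(x) - h(x) <= 0 (g, h convex).
# Toy problem: reach `target` while staying outside a ball (hypothetical data).
import numpy as np
import cvxpy as cp

target = np.array([1.0, 0.1])   # desired point (inside the obstacle, so the constraint binds)
center = np.array([1.0, 0.0])   # obstacle center (illustrative)
radius = 0.8                    # obstacle radius
mu = 100.0                      # penalty on the slack that relaxes an inconsistent linearization

def h(x):                       # convex part that gets linearized: h(x) = ||x - center||^2
    return float(np.dot(x - center, x - center))

def grad_h(x):
    return 2.0 * (x - center)

x_k = np.array([2.0, 1.0])      # initial iterate, feasible for the true constraint
for it in range(20):
    x = cp.Variable(2)
    s = cp.Variable(nonneg=True)            # slack keeps the subproblem feasible
    # DC constraint r^2 - h(x) <= 0 with h replaced by its affine minorant at x_k;
    # since h is convex, this linearized constraint is a conservative restriction.
    lin_h = h(x_k) + grad_h(x_k) @ (x - x_k)
    cons = [radius**2 - lin_h <= s]
    obj = cp.Minimize(cp.sum_squares(x - target) + mu * s)
    cp.Problem(obj, cons).solve()
    x_new = x.value
    if np.linalg.norm(x_new - x_k) < 1e-6:
        break
    x_k = x_new

print("SCP iterate:", x_k)      # typically a point on the obstacle boundary near the target
```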

    Some Applications of Polynomial Optimization in Operations Research and Real-Time Decision Making

    We demonstrate applications of algebraic techniques that optimize and certify polynomial inequalities to problems of interest in the operations research and transportation engineering communities. Three problems are considered: (i) wireless coverage of targeted geographical regions with guaranteed signal quality and minimum transmission power, (ii) computing real-time certificates of collision avoidance for a simple model of an unmanned vehicle (UV) navigating through a cluttered environment, and (iii) designing a nonlinear hovering controller for a quadrotor UV, which has recently been used for load transportation. On our smaller-scale applications, we apply the sum of squares (SOS) relaxation and solve the underlying problems with semidefinite programming. On the larger-scale or real-time applications, we use our recently introduced "SDSOS Optimization" techniques, which result in second order cone programs. To the best of our knowledge, this is the first study of real-time applications of sum of squares techniques in optimization and control. No knowledge of dynamics and control is assumed of the reader.
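    The common engine behind applications (i)-(iii) is the SOS relaxation: a polynomial p is certified nonnegative by finding a positive semidefinite Gram matrix Q with p(x) = z(x)^T Q z(x) for a monomial basis z(x). The sketch below runs this check for a hypothetical univariate polynomial using cvxpy and a generic SDP solver; it is not the authors' SDSOS machinery, which restricts Q further so that only second order cone constraints remain.

```python
# Minimal SOS feasibility check: does p(x) = z(x)^T Q z(x) hold for some Q >> 0,
# with z(x) = [1, x, x^2]?  The polynomial below is an illustrative choice.
import numpy as np
import cvxpy as cp

# p(x) = 2 - 2x + 3x^2 - 2x^3 + x^4  (c[k] is the coefficient of x^k)
c = [2.0, -2.0, 3.0, -2.0, 1.0]

Q = cp.Variable((3, 3), symmetric=True)   # Gram matrix in the basis z = [1, x, x^2]
constraints = [
    Q >> 0,                               # Q positive semidefinite  =>  p is SOS
    Q[0, 0] == c[0],                      # match the constant term
    2 * Q[0, 1] == c[1],                  # match the x term
    2 * Q[0, 2] + Q[1, 1] == c[2],        # match the x^2 term
    2 * Q[1, 2] == c[3],                  # match the x^3 term
    Q[2, 2] == c[4],                      # match the x^4 term
]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)

print(prob.status)                        # 'optimal' => a PSD Gram matrix exists => p >= 0 everywhere
print(np.round(Q.value, 3))
```

    Here the check succeeds, since p(x) = (x^2 - x + 1)^2 + 1 is manifestly a sum of squares; for multivariate or parametric problems the same coefficient-matching construction leads to the larger semidefinite programs mentioned in the abstract.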

    Recent Advances in Computational Methods for the Power Flow Equations

    The power flow equations are at the core of most of the computations for designing and operating electric power systems. The power flow equations are a system of multivariate nonlinear equations which relate the power injections and voltages in a power system. A plethora of methods have been devised to solve these equations, from Newton-based methods to homotopy continuation and other optimization-based methods. While many of these methods often efficiently find a high-voltage, stable solution due to its large basin of attraction, most of them struggle to find low-voltage solutions, which play a significant role in certain stability-related computations. While we do not claim to have exhausted the existing literature on all related methods, this tutorial paper introduces some of the recent advances in methods for solving power flow equations to the wider power systems community, as well as drawing the attention of the computational mathematics and optimization communities to power systems problems. After briefly reviewing some of the traditional computational methods used to solve the power flow equations, we focus on three emerging methods: the numerical polynomial homotopy continuation method, Groebner basis techniques, and moment/sum-of-squares relaxations using semidefinite programming. In passing, we also emphasize the importance of an upper bound on the number of solutions of the power flow equations and review the current status of research in this direction. Comment: 13 pages, 2 figures. Submitted to the Tutorial Session at the IEEE 2016 American Control Conference.
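    As a point of reference for the methods surveyed, the sketch below runs the classical Newton iteration on the power flow equations of a hypothetical 2-bus network (bus 1 slack, bus 2 PQ); the line impedance, load values, and flat start are illustrative assumptions. Started this way, Newton's method typically converges into the large basin of the high-voltage solution, which is precisely why the surveyed techniques are of interest for locating the remaining, low-voltage solutions.

```python
# Newton iteration on the power flow mismatch equations for a hypothetical 2-bus system.
import numpy as np

y = 1.0 / complex(0.01, 0.1)                # series admittance of the single line (R=0.01, X=0.1 p.u.)
Y = np.array([[y, -y], [-y, y]])            # bus admittance matrix (no shunts)
G, B = Y.real, Y.imag

V1, th1 = 1.0, 0.0                          # slack bus voltage and angle
P2_spec, Q2_spec = -0.8, -0.4               # specified load at bus 2 (negative injections)

def mismatch(x):
    th2, V2 = x
    # injected active/reactive power at bus 2 from the network equations
    P2 = V2 * V1 * (G[1, 0] * np.cos(th2 - th1) + B[1, 0] * np.sin(th2 - th1)) + V2**2 * G[1, 1]
    Q2 = V2 * V1 * (G[1, 0] * np.sin(th2 - th1) - B[1, 0] * np.cos(th2 - th1)) - V2**2 * B[1, 1]
    return np.array([P2 - P2_spec, Q2 - Q2_spec])

x = np.array([0.0, 1.0])                    # flat start: angle 0 rad, voltage 1 p.u.
for _ in range(20):
    f = mismatch(x)
    if np.linalg.norm(f) < 1e-10:
        break
    # finite-difference Jacobian keeps the sketch short; production solvers use analytic terms
    J = np.zeros((2, 2))
    for j in range(2):
        e = np.zeros(2); e[j] = 1e-7
        J[:, j] = (mismatch(x + e) - f) / 1e-7
    x = x - np.linalg.solve(J, f)

print("theta2 (rad), V2 (p.u.):", x)        # the high-voltage solution near the flat start
```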

    Linearly Solvable Stochastic Control Lyapunov Functions

    This paper presents a new method for synthesizing stochastic control Lyapunov functions for a class of nonlinear stochastic control systems. The technique relies on a transformation of the classical nonlinear Hamilton-Jacobi-Bellman partial differential equation to a linear partial differential equation for a class of problems with a particular constraint on the stochastic forcing. This linear partial differential equation can then be relaxed to a linear differential inclusion, allowing relaxed solutions to be generated using sum of squares programming. The resulting relaxed solutions are in fact viscosity super/subsolutions, and by the maximum principle they are pointwise upper and lower bounds on the underlying value function, even for coarse polynomial approximations. Furthermore, the pointwise upper bound is shown to be a stochastic control Lyapunov function, yielding a method for generating nonlinear controllers whose cost is within a pointwise-bounded distance of the cost achieved by the optimal controller. These approximate solutions may be computed with non-increasing error via a hierarchy of semidefinite optimization problems. Finally, this paper develops a priori bounds on trajectory suboptimality when using these approximate value functions, and demonstrates that these methods, and bounds, can be applied to a more general class of nonlinear systems not obeying the constraint on stochastic forcing. Simulated examples illustrate the methodology. Comment: Published in the SIAM Journal on Control and Optimization.
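    For readers unfamiliar with the linearization step, the following is a sketch, in generic notation rather than necessarily the paper's, of the standard logarithmic transformation that turns the Hamilton-Jacobi-Bellman equation into a linear PDE under the stated constraint on the stochastic forcing.

```latex
% Dynamics dx = (f(x) + G(x)u)\,dt + B(x)\,d\omega with cost rate q(x) + \tfrac12 u^\top R u.
% After minimizing over u, i.e. u^\ast = -R^{-1} G^\top \nabla V, the stationary HJB equation reads
\[
0 = q + (\nabla V)^\top f
    - \tfrac12 (\nabla V)^\top G R^{-1} G^\top \nabla V
    + \tfrac12 \operatorname{tr}\!\big(\Sigma_t \nabla^2 V\big),
\qquad \Sigma_t = B \Sigma_\varepsilon B^\top .
\]
% Substituting V = -\lambda \log \Psi and imposing the constraint on the stochastic forcing,
% \lambda\, G R^{-1} G^\top = \Sigma_t, cancels the quadratic term and leaves a PDE linear in \Psi:
\[
\frac{q}{\lambda}\,\Psi
  = f^\top \nabla \Psi
    + \tfrac12 \operatorname{tr}\!\big(\Sigma_t \nabla^2 \Psi\big).
\]
% It is a linear equation of this form that can be relaxed to a linear differential inclusion
% and attacked with sum of squares programming, as described in the abstract.
```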