14 research outputs found

    Formal Synthesis of Lyapunov Neural Networks

    Full text link
    We propose an automatic and formally sound method for synthesising Lyapunov functions for the asymptotic stability of autonomous non-linear systems. Traditional methods are either analytical, requiring manual effort, or numerical, lacking formal soundness. Symbolic computational methods for Lyapunov functions, which sit in between, give formal guarantees but are typically semi-automatic because they rely on the user to provide appropriate function templates. We propose a method that finds Lyapunov functions fully automatically, using machine learning, while also providing formal guarantees via satisfiability modulo theories (SMT). We employ a counterexample-guided approach in which a numerical learner and a symbolic verifier interact to construct provably correct Lyapunov neural networks (LNNs). The learner trains a neural network that satisfies the Lyapunov criteria for asymptotic stability over a set of samples; the verifier proves via SMT solving that the criteria are satisfied over the whole domain, or else augments the sample set with counterexamples. Our method supports neural networks with polynomial activation functions and of varying depth and width, which display wide learning capabilities. We demonstrate our method on several non-trivial benchmarks and compare it favourably against a numerical optimisation-based approach, a symbolic template-based approach, and a cognate LNN-based approach. Our method synthesises Lyapunov functions faster and over wider spatial domains than the alternatives, while providing stronger or equal guarantees.
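
    Below is a minimal, illustrative sketch of the counterexample-guided (CEGIS) loop the abstract describes, not the authors' tool: the candidate is simplified to a one-hidden-layer network with square (polynomial) activations, V(x) = ||Ax||^2; the dynamics, domain, network width, and hyperparameters are assumptions; and Z3 stands in as the SMT verifier.

```python
# Sketch of a CEGIS loop for a Lyapunov neural-network candidate (assumed
# system, shapes, and tolerances; not the authors' implementation).
import numpy as np
import z3

def f_np(x):                                # example dynamics (assumed)
    return np.array([-x[0] + x[0] * x[1], -x[1]])

def learner_step(A, samples, lr=0.05, eps=1e-3):
    """One gradient step on a hinge loss enforcing V >= eps and Vdot <= -eps."""
    grad = np.zeros_like(A)
    for x in samples:
        fx = f_np(x)
        z = A @ x
        V, Vdot = z @ z, 2.0 * z @ (A @ fx)
        if V < eps:                         # push V(x) = ||Ax||^2 upwards
            grad -= 2.0 * np.outer(z, x)
        if Vdot > -eps:                     # push dV/dt along the flow downwards
            grad += 2.0 * (np.outer(A @ fx, x) + np.outer(z, fx))
    return A - lr * grad / max(len(samples), 1)

def verifier(A, radius=1.0, margin=1e-4):
    """SMT check: any x in the box, away from 0, violating the Lyapunov conditions?"""
    x1, x2 = z3.Reals("x1 x2")
    xs, fx = [x1, x2], [-x1 + x1 * x2, -x2]
    z = [sum(z3.RealVal(float(A[j, i])) * xs[i] for i in range(2))
         for j in range(A.shape[0])]
    V = sum(zj * zj for zj in z)
    Vdot = sum(2 * z[j] * sum(z3.RealVal(float(A[j, i])) * fx[i] for i in range(2))
               for j in range(A.shape[0]))
    s = z3.Solver()
    s.add(z3.And(*[z3.And(v >= -radius, v <= radius) for v in xs]),
          x1 * x1 + x2 * x2 >= 0.01,        # exclude a small ball around the origin
          z3.Or(V <= margin, Vdot >= -margin))
    if s.check() == z3.unsat:
        return None                         # conditions proved over the whole box
    m = s.model()
    return np.array([float(m.eval(v, model_completion=True).as_decimal(12).rstrip("?"))
                     for v in xs])

# CEGIS: train on samples, verify, add counterexamples until a proof is found.
# Iteration counts are illustrative; convergence is not guaranteed in general.
rng = np.random.default_rng(1)
A = rng.normal(size=(3, 2))
samples = list(rng.uniform(-1, 1, size=(50, 2)))
for it in range(2000):
    A = learner_step(A, samples)
    if it % 100 == 99:
        cex = verifier(A)
        if cex is None:
            print("verified candidate V(x) = ||Ax||^2 with A =\n", A)
            break
        samples.append(cex)
```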

    Lyapunov function search method for analysis of nonlinear systems stability using genetic algorithm

    Full text link
    This paper considers a wide class of smooth continuous dynamic nonlinear systems (control objects) with a measurable state vector. The problem is to find a special function (a Lyapunov function) which, in the framework of Lyapunov's second method, guarantees asymptotic stability for the above class of nonlinear systems. It is well known that the search for a Lyapunov function is the "cornerstone" of mathematical stability theory. Methods for selecting or finding a Lyapunov function to analyze the stability of closed-loop linear stationary systems, as well as of nonlinear objects with explicit linear dynamic and nonlinear static parts, have been well studied (see the works of Lurie, Yakubovich, Popov, and many others). However, universal approaches to the search for a Lyapunov function for a more general class of nonlinear systems have not yet been identified. There is a large variety of methods for finding a Lyapunov function for nonlinear systems, but they all operate within constraints imposed on the structure of the control object. In this paper we propose another approach, which gives specialists in automatic control theory a new tool for Lyapunov function search in the stability analysis of smooth continuous dynamic nonlinear systems with a measurable state vector. The essence of the proposed approach is to represent the candidate function as a sum of nonlinear terms: elements of the object's state vector raised to positive powers and multiplied by unknown coefficients. The unknown coefficients are then selected using a genetic algorithm, which should provide the function with all the conditions required of a Lyapunov function (in the framework of Lyapunov's second method). Comment: in Russian language
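
    A rough sketch of the described coefficient search follows; the dynamics, monomial dictionary, GA operators, and sample-based fitness are illustrative assumptions rather than the paper's exact formulation.

```python
# Sketch: a genetic algorithm selects coefficients of a monomial candidate
# V(x) = sum_k c_k * x1^p1 * x2^p2 so that V > 0 and dV/dt < 0 on sampled states.
import numpy as np

rng = np.random.default_rng(0)

def f(x):                                   # example dynamics dx/dt = f(x) (assumed)
    return np.array([x[1], -x[0] - x[1]])   # damped linear oscillator

POWERS = [(2, 0), (0, 2), (1, 1), (4, 0), (0, 4)]   # monomial dictionary (assumed)

def V(c, x):
    return sum(ck * x[0]**p1 * x[1]**p2 for ck, (p1, p2) in zip(c, POWERS))

def Vdot(c, x, h=1e-5):
    # Lie derivative of V along f, via central differences (numerical shortcut)
    grad = np.array([(V(c, x + h*e) - V(c, x - h*e)) / (2*h) for e in np.eye(2)])
    return grad @ f(x)

SAMPLES = rng.uniform(-2, 2, size=(200, 2))
SAMPLES = SAMPLES[np.linalg.norm(SAMPLES, axis=1) > 0.1]   # keep away from the origin

def fitness(c):                             # count Lyapunov-condition violations
    return sum((V(c, x) <= 0) + (Vdot(c, x) >= 0) for x in SAMPLES)

# plain GA: truncation selection, blend crossover, Gaussian mutation
pop = rng.uniform(0, 1, size=(60, len(POWERS)))
for gen in range(100):
    scores = np.array([fitness(c) for c in pop])
    if scores.min() == 0:                   # every sampled condition satisfied
        break
    parents = pop[np.argsort(scores)[:20]]
    kids = []
    while len(kids) < len(pop):
        a, b = parents[rng.integers(0, 20, 2)]
        child = 0.5 * (a + b) + rng.normal(0, 0.1, len(POWERS))
        kids.append(np.clip(child, 0, None))   # keep coefficients nonnegative (a design choice)
    pop = np.array(kids)

best = pop[np.argmin([fitness(c) for c in pop])]
print("candidate coefficients:", best)
```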

    Linear Relaxations of Polynomial Positivity for Polynomial Lyapunov Function Synthesis

    No full text
    We examine linear programming (LP) based relaxations for synthesizing polynomial Lyapunov functions to prove the stability of polynomial ordinary differential equations (ODEs). Our approach starts from a desired parametric polynomial form of the Lyapunov function. We then encode the positive definiteness of the function, and the negation of its derivative, over the domain of interest. We first compare two classes of relaxations for encoding polynomial positivity: relaxations by sum-of-squares (SOS) programmes against relaxations based on Handelman representations and Bernstein polynomials, which produce linear programmes. Next, we present a series of increasingly powerful LP relaxations based on expressing the given polynomial in its Bernstein form, as a linear combination of Bernstein polynomials. Subsequently, we show how these LP relaxations can be used to search for Lyapunov functions for polynomial ODEs by formulating LP instances. We compare our techniques against SOS-based approaches on a suite of automatically synthesized benchmarks.
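
    The sketch below illustrates the core of the Bernstein relaxation on a deliberately simple one-dimensional example: positivity of a polynomial on an interval is relaxed to positivity of its Bernstein coefficients, and because a parametric template's unknown coefficient enters those Bernstein coefficients linearly, the search becomes a linear programme. The system, template, interval (chosen away from the equilibrium so strict positivity can be required everywhere), and tolerances are assumptions, and scipy's linprog stands in for the LP solver.

```python
# Sketch of a Bernstein-based LP relaxation (assumed 1-D example, not the
# paper's tool): for dx/dt = -x^3 on [0.5, 1.5], find c in the template
# V(x) = c * x^2 whose Bernstein coefficients certify V >= eps and -dV/dt >= eps.
import math
import numpy as np
from numpy.polynomial import Polynomial
from scipy.optimize import linprog

def bernstein_coeffs(p, lo, hi):
    """Bernstein coefficients of polynomial p (power basis) on [lo, hi]."""
    q = p(Polynomial([lo, hi - lo]))        # map [lo, hi] onto [0, 1] by composition
    a, n = q.coef, len(q.coef) - 1
    return np.array([sum(math.comb(k, i) / math.comb(n, i) * a[i]
                         for i in range(k + 1)) for k in range(n + 1)])

lo, hi, eps = 0.5, 1.5, 1e-3
# V(x) = c * x^2 and -dV/dt = 2 c x^4 are linear in c, so every Bernstein
# coefficient is (known coefficient) * c.
bV   = bernstein_coeffs(Polynomial([0, 0, 1]), lo, hi)        # coefficients of x^2
bmVd = bernstein_coeffs(Polynomial([0, 0, 0, 0, 2]), lo, hi)  # coefficients of 2*x^4

# LP feasibility: coef * c >= eps for every Bernstein coefficient
# (nonnegative Bernstein coefficients imply nonnegativity on the box).
A_ub = -np.concatenate([bV, bmVd]).reshape(-1, 1)   # rewrite as -coef * c <= -eps
b_ub = -eps * np.ones(A_ub.shape[0])
res = linprog(c=[0.0], A_ub=A_ub, b_ub=b_ub, bounds=[(1e-6, None)])
print("feasible template coefficient c:", res.x if res.success else None)
```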