4 research outputs found

    Numerical Methods That Preserve a Lyapunov Function for Ordinary Differential Equations

    Get PDF
    The paper studies numerical methods that preserve a Lyapunov function of a dynamical system, i.e., numerical approximations whose energy decreases, just as in the original differential equation. With this aim, a discrete gradient method is implemented for the numerical integration of a system of ordinary differential equations. In principle, this procedure yields first-order methods, but the analysis paves the way for the design of higher-order methods. As a case in point, the proposed method is applied to the Duffing equation without external forcing, considering that, in this case, preserving the Lyapunov function is more important than the accuracy of particular trajectories. Results are validated by means of numerical experiments in which the discrete gradient method is compared to standard Runge–Kutta methods. As predicted by the theory, discrete gradient methods preserve the Lyapunov function, whereas conventional methods fail to do so: either spurious periodic solutions appear or the energy does not decrease. Moreover, even when conventional schemes do preserve the Lyapunov function, the discrete gradient method outperforms them in computational cost; thus, the proposed method is promising. This work has been partially supported by Project PID2020-116898RB-I00 from the Ministerio de Ciencia e Innovación of Spain and Project UMA20-FEDERJA-045 from the Programa Operativo FEDER de Andalucía. Partial funding for open access charge: Universidad de Málaga.
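    The construction described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact scheme: it uses a coordinate-increment (Itoh–Abe style) discrete gradient for the unforced, damped Duffing equation x'' + δx' + αx + βx³ = 0, whose Lyapunov function is the energy V(x, v) = v²/2 + αx²/2 + βx⁴/4. All parameter values and tolerances below are illustrative assumptions.

    ```python
    # Sketch of a first-order discrete gradient scheme for the unforced,
    # damped Duffing equation, written as dy/dt = S * grad V(y) with
    # y = (x, v), S = [[0, 1], [-1, -delta]], and Lyapunov function
    #   V(x, v) = v**2/2 + alpha*x**2/2 + beta*x**4/4.
    # Parameters are illustrative, not taken from the paper.
    alpha, beta, delta = 1.0, 1.0, 0.2

    def V(x, v):
        return 0.5 * v**2 + 0.5 * alpha * x**2 + 0.25 * beta * x**4

    def dgrad(x0, x1, v0, v1):
        """Discrete gradient of the separable V: each component is the
        secant slope (V_i(z1) - V_i(z0)) / (z1 - z0), falling back to
        the exact derivative when the increment vanishes."""
        if abs(x1 - x0) > 1e-12:
            gx = (0.5 * alpha * (x1**2 - x0**2)
                  + 0.25 * beta * (x1**4 - x0**4)) / (x1 - x0)
        else:
            gx = alpha * x0 + beta * x0**3
        if abs(v1 - v0) > 1e-12:
            gv = 0.5 * (v1**2 - v0**2) / (v1 - v0)
        else:
            gv = v0
        return gx, gv

    def step(x, v, h):
        """One implicit step (y1 - y0)/h = S * dgrad(y0, y1),
        solved by fixed-point iteration."""
        x1, v1 = x, v
        for _ in range(100):
            gx, gv = dgrad(x, x1, v, v1)
            x_new = x + h * gv
            v_new = v + h * (-gx - delta * gv)
            if abs(x_new - x1) + abs(v_new - v1) < 1e-14:
                break
            x1, v1 = x_new, v_new
        return x1, v1

    x, v, h = 1.5, 0.0, 0.05
    energies = [V(x, v)]
    for _ in range(400):
        x, v = step(x, v, h)
        energies.append(V(x, v))

    # The secant property of the discrete gradient gives, exactly,
    # V(y1) - V(y0) = -h * delta * gv**2 <= 0, so energy never increases.
    assert all(e1 <= e0 + 1e-12 for e0, e1 in zip(energies, energies[1:]))
    print("energy: %.4f -> %.4f" % (energies[0], energies[-1]))
    ```

    The key design point is the secant (discrete gradient) identity V(y1) − V(y0) = ∇̄V · (y1 − y0), which transfers the continuous dissipation estimate dV/dt = −δ(∂V/∂v)² ≤ 0 to the discrete trajectory regardless of the step size, something a generic Runge–Kutta step does not guarantee.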

    Continuous dynamical systems that realize discrete optimization on the hypercube

    Full text link
    We study the problem of finding a local minimum of a multilinear function E over the discrete set {0,1}^n. The search is achieved by a gradient-like system in [0,1]^n with cost function E. Under mild restrictions on the metric, the stable attractors of the gradient-like system are shown to produce solutions of the problem, even when they are not in the vicinity of the discrete set {0,1}^n. Moreover, the gradient-like system connects with interior point methods for linear programming and with the analog neural network studied by Vidyasagar (IEEE Trans. Automat. Control 40 (8) (1995) 1359), in the same context. © 2004 Elsevier B.V. All rights reserved.
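    A gradient-like system of this kind can be sketched as follows. This is a hedged illustration, not the paper's exact construction: it integrates dx/dt = −D(x) ∇E(x) on [0,1]^n with the diagonal metric factor D(x) = diag(x_i(1 − x_i)), which vanishes on the faces of the hypercube, so trajectories stay in the box and stable attractors tend to sit at vertices. The QUBO-style multilinear cost below is an invented example.

    ```python
    import numpy as np

    # Gradient-like flow on the unit hypercube [0,1]^n for a multilinear
    # cost E(x) = x^T Q x + b^T x (Q strictly upper triangular, so E is
    # affine in each coordinate). The metric factor x_i*(1 - x_i) kills
    # the normal component of the flow on the faces of the cube.
    # Problem data and step sizes are illustrative assumptions.
    rng = np.random.default_rng(0)
    n = 6
    Q = np.triu(rng.standard_normal((n, n)), 1)  # strictly upper triangular
    b = rng.standard_normal(n)

    def E(x):
        return x @ Q @ x + b @ x

    def gradE(x):
        return (Q + Q.T) @ x + b

    def flow(x, h=0.05, steps=4000):
        """Forward-Euler integration of dx/dt = -diag(x*(1-x)) * grad E(x)."""
        for _ in range(steps):
            x = x + h * (-(x * (1.0 - x)) * gradE(x))
            x = np.clip(x, 0.0, 1.0)  # guard against Euler overshoot
        return x

    x0 = np.full(n, 0.5) + 0.01 * rng.standard_normal(n)  # perturbed center
    x = flow(x0)
    vertex = np.round(x).astype(int)
    print("attractor:", x.round(3), "-> candidate vertex:", vertex)
    ```

    Along the exact flow, dE/dt = −Σ x_i(1 − x_i)(∂E/∂x_i)² ≤ 0, so E acts as a Lyapunov function for the search; the abstract's point is that the stable attractors certify local minimality of the rounded vertex even when the attractor itself lies in the interior.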

    Stima distribuita dello stato in reti di sensori (Distributed state estimation in sensor networks)

    Get PDF