
    The performance of approximating ordinary differential equations by neural nets

    The dynamics of many systems are described by ordinary differential equations (ODEs). Solving ODEs with standard methods (i.e. numerical integration) requires a large amount of computing time but only a small amount of storage memory. For some applications, e.g. short-term weather forecasting or real-time robot control, long computation times are prohibitive. Is there a method that uses less computing time (at the cost of other resources, e.g. memory), so that the computation of ODEs becomes faster? We discuss this question under the assumption that the alternative computation method is a neural network trained on the ODE dynamics, and we compare both methods at the same approximation error. This comparison is done with two different errors. First, we use the standard error that measures the difference between the approximation and the solution of the ODE, which is hard to characterize. In many cases, however, such as the physics engines used in computer games, the shape of the approximation curve is important rather than its exact values. Therefore, we introduce a subjective error based on the Total Least Square Error (TLSE), which gives more consistent results. For the final performance comparison, we calculate the optimal resource usage of the neural network and evaluate it depending on the resolution of the interpolation points and the inter-point distance. Our conclusion gives a method to evaluate where neural nets are advantageous over numerical ODE integration and where they are not. Index Terms—ODE, neural nets, Euler method, approximation complexity, storage optimization
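
    As a concrete illustration of the comparison described in this abstract, the sketch below integrates a test ODE with the explicit Euler method and evaluates both a standard pointwise error and a TLSE-style shape error against the analytic solution. The test equation, step size, and exact error definitions are illustrative assumptions, not taken from the paper.

    # Minimal sketch (not the paper's code): Euler integration of a test ODE
    # and two error measures against the analytic solution -- a pointwise RMSE
    # and a TLSE-style "shape" error (nearest-point distance to the reference
    # curve), mirroring the comparison the abstract describes.
    import numpy as np

    def euler(f, y0, t0, t1, h):
        """Fixed-step explicit Euler integration of dy/dt = f(t, y)."""
        ts = np.arange(t0, t1 + h, h)
        ys = np.empty_like(ts)
        ys[0] = y0
        for i in range(1, len(ts)):
            ys[i] = ys[i - 1] + h * f(ts[i - 1], ys[i - 1])
        return ts, ys

    # Test problem dy/dt = -y with analytic solution y(t) = exp(-t).
    f = lambda t, y: -y
    ts, ys = euler(f, y0=1.0, t0=0.0, t1=5.0, h=0.1)
    ref = np.exp(-ts)

    # Standard pointwise error.
    rmse = np.sqrt(np.mean((ys - ref) ** 2))

    # TLSE-style shape error: distance from each approximation point to the
    # nearest point of a densely sampled reference curve.
    t_dense = np.linspace(0.0, 5.0, 2001)
    curve = np.column_stack([t_dense, np.exp(-t_dense)])
    pts = np.column_stack([ts, ys])
    d = np.sqrt(((pts[:, None, :] - curve[None, :, :]) ** 2).sum(-1)).min(axis=1)
    tlse = np.sqrt(np.mean(d ** 2))

    print(f"RMSE = {rmse:.2e}, TLSE-style error = {tlse:.2e}")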

    Adaptive modeling of biochemical pathways

    In bioinformatics, biochemical pathways can be modeled by many differential equations. It is still an open problem how to fit the huge number of parameters of these equations to the available data. Here, an approach that systematically learns the parameters is necessary. In this paper, a network is constructed for the small but important example of inflammation modeling, and different learning algorithms are proposed. It turned out that, due to the nonlinear dynamics, evolutionary approaches are necessary to fit the parameters to the sparse given data. Proceedings of the 15th IEEE International Conference on Tools with Artificial Intelligence - ICTAI 2003
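
    The abstract does not detail the learning algorithm; the sketch below shows only the general idea of fitting ODE parameters to sparse data with a simple (mu + lambda) evolution strategy. The logistic growth equation, the observation times, and all hyperparameters are illustrative assumptions standing in for the paper's inflammation network.

    # Minimal sketch (illustrative, not the paper's model): fit ODE parameters
    # to sparse data with a simple (mu + lambda) evolution strategy.  A logistic
    # growth equation dy/dt = a*y*(1 - y/b) stands in for the pathway model.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(params, t_obs, y0=0.1, h=0.01):
        """Forward-Euler simulation, returning the state at the observation times."""
        a, b = params
        t, y, out = 0.0, y0, []
        for t_target in t_obs:
            while t < t_target:
                y += h * a * y * (1.0 - y / b)
                t += h
            out.append(y)
        return np.array(out)

    # Sparse, noisy observations generated from "true" parameters (a=1.2, b=3.0).
    t_obs = np.array([0.5, 1.5, 3.0, 5.0, 8.0])
    y_obs = simulate((1.2, 3.0), t_obs) + rng.normal(0.0, 0.02, size=t_obs.size)

    def fitness(params):
        return np.sum((simulate(params, t_obs) - y_obs) ** 2)

    # (mu + lambda) evolution strategy with Gaussian mutation.
    mu, lam, sigma = 5, 20, 0.3
    pop = rng.uniform(0.1, 5.0, size=(mu, 2))
    for gen in range(100):
        children = np.repeat(pop, lam // mu, axis=0) + rng.normal(0.0, sigma, (lam, 2))
        children = np.clip(children, 0.01, 10.0)       # keep parameters in a sane range
        union = np.vstack([pop, children])
        scores = np.array([fitness(p) for p in union])
        pop = union[np.argsort(scores)[:mu]]           # survivor selection
        sigma *= 0.97                                  # slowly shrink the mutation step

    print("best parameters:", pop[0], "sse:", fitness(pop[0]))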

    Using growing RBF-nets in rubber industry process control

    This paper describes the use of a Radial Basis Function (RBF) neural network to approximate process parameters for the extrusion of a rubber profile in tyre production. After introducing the rubber industry problem, the RBF network model and the RBF net learning algorithm are developed; the algorithm uses a growing number of RBF units to compensate for the approximation error until the desired error limit is reached. Its performance is shown for simple analytic examples. The paper then describes the modelling of the industrial problem. Simulations show good results, even when using only a few training samples. The paper concludes with a discussion of possible systematic error influences, improvements and potential generalisation benefits. Keywords: Adaptive process control; Parameter estimation; RBF-nets; Rubber extrusion
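
    A simplified version of the growing scheme described above might look as follows: a Gaussian unit is added at the training sample with the largest residual and the output weights are refit by least squares, until the error limit is met or a unit budget is exhausted. The Gaussian basis, the fixed width, and the test function are assumptions; the paper's actual learning algorithm may differ.

    # Minimal sketch (not the paper's exact algorithm): a growing RBF network.
    import numpy as np

    def rbf_design(x, centers, width):
        """Gaussian RBF design matrix with a constant bias column."""
        phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))
        return np.hstack([phi, np.ones((x.size, 1))])

    def fit_growing_rbf(x, y, width=0.3, err_limit=0.02, max_units=30):
        centers = np.array([x[np.argmax(np.abs(y - y.mean()))]])   # first unit
        while True:
            phi = rbf_design(x, centers, width)
            w, *_ = np.linalg.lstsq(phi, y, rcond=None)             # output weights
            resid = np.abs(phi @ w - y)
            if resid.max() < err_limit or centers.size >= max_units:
                return centers, w, width
            centers = np.append(centers, x[np.argmax(resid)])       # grow the net

    # Simple analytic test function, in the spirit of the paper's toy examples.
    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0.0, 1.0, 60))
    y = np.sin(2 * np.pi * x) + 0.005 * rng.normal(size=x.size)

    centers, w, width = fit_growing_rbf(x, y)
    pred = rbf_design(x, centers, width) @ w
    print(f"{centers.size} units, max abs error = {np.abs(pred - y).max():.3f}")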

    Exploration and Implementation of Neural Ordinary Differential Equations

    Neural ordinary differential equations (ODEs) have recently emerged as a novel approach to deep learning, leveraging the knowledge of two previously separate domains: neural networks and differential equations. In this paper, we first examine the background and lay the foundation for traditional artificial neural networks. We then present neural ODEs from a rigorous mathematical perspective, and explore their advantages and trade-offs compared to traditional neural networks.
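
    The core construction can be sketched in a few lines: a small MLP defines the dynamics dh/dt = f_theta(h, t), and the forward pass is numerical integration of that vector field. The fixed-step Euler solver and the random weights below are illustrative simplifications, not this paper's implementation; practical neural ODEs use adaptive solvers and adjoint-based gradients (e.g. torchdiffeq).

    # Minimal sketch of a neural ODE forward pass (assumed illustration).
    import numpy as np

    rng = np.random.default_rng(0)
    D, H = 2, 16                                            # state and hidden widths
    W1, b1 = rng.normal(0, 0.5, (H, D + 1)), np.zeros(H)    # +1 input for time t
    W2, b2 = rng.normal(0, 0.5, (D, H)), np.zeros(D)

    def f_theta(h, t):
        """MLP dynamics: one tanh hidden layer, with time appended to the state."""
        z = np.tanh(W1 @ np.append(h, t) + b1)
        return W2 @ z + b2

    def odeint_euler(h0, t0=0.0, t1=1.0, steps=100):
        """Integrate dh/dt = f_theta(h, t) from t0 to t1 with explicit Euler."""
        h, dt = h0.copy(), (t1 - t0) / steps
        for i in range(steps):
            h += dt * f_theta(h, t0 + i * dt)
        return h

    h0 = np.array([1.0, -1.0])
    print("h(1) =", odeint_euler(h0))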