12,286 research outputs found

    Separable Hamiltonian Neural Networks

    The modelling of dynamical systems from discrete observations is a challenge faced by modern scientific and engineering data systems. Hamiltonian systems are one such fundamental and ubiquitous class of dynamical systems. Hamiltonian neural networks are state-of-the-art models that regress, without supervision, the Hamiltonian of a dynamical system from discrete observations of its vector field, under the learning bias of Hamilton's equations. Yet Hamiltonian dynamics are often complicated, especially in higher dimensions where the state space of the Hamiltonian system is large relative to the number of samples. A recently discovered remedy that alleviates the complexity between state variables in the state space is to leverage the additive separability of the Hamiltonian system and embed that additive separability into the Hamiltonian neural network. Following the nomenclature of physics-informed machine learning, we propose three separable Hamiltonian neural networks. These models embed additive separability within Hamiltonian neural networks. The first model uses additive separability to quadratically scale the amount of data for training Hamiltonian neural networks. The second model embeds additive separability within the loss function of the Hamiltonian neural network. The third model embeds additive separability through the architecture of the Hamiltonian neural network, using conjoined multilayer perceptrons. We empirically compare the three models against state-of-the-art Hamiltonian neural networks, and demonstrate that the separable Hamiltonian neural networks, which alleviate complexity between the state variables, are more effective at regressing the Hamiltonian and its vector field.
    Comment: 11 pages
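    To make the third model concrete, here is a minimal sketch of an additively separable Hamiltonian neural network in PyTorch, assuming a Hamiltonian of the form H(q, p) = T(p) + V(q) realised by two conjoined multilayer perceptrons. The class name, layer sizes, and training note are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class SeparableHNN(nn.Module):
    """Sketch: H(q, p) = T(p) + V(q), each term its own small MLP."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.T = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))
        self.V = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def forward(self, q, p):
        # T never sees q and V never sees p, so the learned Hamiltonian
        # is additively separable by construction.
        return self.T(p) + self.V(q)

def vector_field(model, q, p):
    """Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq."""
    q, p = q.requires_grad_(True), p.requires_grad_(True)
    H = model(q, p).sum()
    dHdq, dHdp = torch.autograd.grad(H, (q, p), create_graph=True)
    return dHdp, -dHdq

# Training would minimise the mean squared error between this vector field
# and the observed one, as in a standard Hamiltonian neural network.
```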

    Applications of Machine Learning to Modelling and Analysing Dynamical Systems

    We explore the use of Physics Informed Neural Networks to analyse nonlinear Hamiltonian Dynamical Systems with a first integral of motion. In this work, we propose an architecture which combines existing Hamiltonian Neural Network structures into Adaptable Symplectic Recurrent Neural Networks which preserve Hamilton's equations as well as the symplectic structure of phase space while predicting dynamics for the entire parameter space. This architecture is found to significantly outperform previously proposed neural networks when predicting Hamiltonian dynamics especially in potentials which contain multiple parameters. We demonstrate its robustness using the nonlinear Henon-Heiles potential under chaotic, quasiperiodic and periodic conditions. The second problem we tackle is whether we can use the high dimensional nonlinear capabilities of neural networks to predict the dynamics of a Hamiltonian system given only partial information of the same. Hence we attempt to take advantage of Long Short Term Memory networks to implement Takens' embedding theorem and construct a delay embedding of the system followed by mapping the topologically invariant attractor to the true form. This architecture is then layered with Adaptable Symplectic nets to allow for predictions which preserve the structure of Hamilton's equations. We show that this method works efficiently for single parameter potentials and provides accurate predictions even over long periods of time.Comment: This is a dissertation submitted in partial fulfilment of the requirements for the degree of Bachelor of Science (Honours) Physics at St. Stephens College University of Delhi in 2023. The dissertation was guided by Dr. Abhinav Gupta, Associate Professor, Department of Physics, St. Stephens College Delh
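    As a rough illustration of the delay-embedding step described above (not the dissertation's code), a Takens-style embedding of a scalar observable can be built with a sliding window. The function name `delay_embed` and the parameters `dim` and `tau` are hypothetical; in practice they are chosen per system.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens-style time-delay embedding of a 1-D series x.

    Returns an (n, dim) array whose rows are
    (x[t], x[t + tau], ..., x[t + (dim - 1) * tau]).
    """
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i : i + n] for i in range(0, dim * tau, tau)], axis=1)

# Example: embed a sampled trajectory before feeding it to an LSTM.
x = np.sin(np.linspace(0, 20 * np.pi, 2000))
X = delay_embed(x, dim=3, tau=10)  # shape (1980, 3)
```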

    IST Austria Thesis

    In this thesis, I study composite quantum impurities with variational techniques, both inspired by machine learning and fully analytic. I supplement this with an exploration of other applications of machine learning, in particular artificial neural networks, in many-body physics. In Chapters 3 and 4, I study quasiparticle systems with a variational approach. I derive a Hamiltonian describing the angulon quasiparticle in the presence of a magnetic field and apply an analytic variational treatment to it. I then introduce a variational approach for non-additive systems, based on artificial neural networks, and demonstrate it on the polaron quasiparticle (Fröhlich Hamiltonian). In Chapter 5, I continue using artificial neural networks, albeit in a different setting: I apply them to detect phases from snapshots of two types of physical systems, namely Monte Carlo snapshots of multilayer classical spin models and molecular dynamics maps of colloidal systems. The main type of network I use here is the convolutional neural network, known for its applicability to image data.
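    As a hedged sketch of the phase-detection setup described for Chapter 5 (the architecture, layer sizes, and snapshot size L are assumptions, not the thesis's model), a small convolutional classifier over L x L spin snapshots might look like this:

```python
import torch.nn as nn

# Minimal CNN mapping an L x L snapshot (one channel) to phase logits.
def make_phase_classifier(num_phases, L=32):
    return nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Flatten(),
        # Two 2x poolings shrink L to L // 4 in each spatial dimension.
        nn.Linear(32 * (L // 4) * (L // 4), num_phases),
    )
```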

    A spherical Hopfield model

    We introduce a spherical Hopfield-type neural network in which both the neurons and the patterns are continuous variables. We study both the thermodynamics and the dynamics of this model. In order to have a retrieval phase, a quartic term is added to the Hamiltonian. The thermodynamics of the model is exactly solvable and the results are replica symmetric. A Langevin dynamics leads to a closed set of equations for the order parameters and the effective correlation and response functions typical for neural networks. The stationary limit corresponds to the thermodynamic results. Numerical calculations illustrate our findings.
    Comment: 9 pages LaTeX including 3 eps figures. Addition of an author in the HTML abstract, unintentionally forgotten; no changes to the manuscript
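    A schematic energy function for such a model, written in terms of the pattern overlaps, is sketched below. The normalisations and the strength u of the quartic term are assumptions for illustration; see the paper for the exact Hamiltonian and the spherical constraint.

```python
import numpy as np

def spherical_hopfield_energy(sigma, patterns, u=0.5):
    """Schematic: Hebbian pairwise term plus a quartic term in the overlaps.

    sigma    -- continuous neuron state on the sphere, sum(sigma**2) = N
    patterns -- (P, N) array of continuous patterns
    """
    N = sigma.size
    m = patterns @ sigma / N          # overlap m_mu with each stored pattern
    H2 = -0.5 * N * np.sum(m ** 2)    # Hebbian term rewritten via overlaps
    H4 = -u * N * np.sum(m ** 4)      # quartic term enabling a retrieval phase
    return H2 + H4
```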

    Thermostat-assisted continuously-tempered Hamiltonian Monte Carlo for Bayesian learning

    We propose a new sampling method, the thermostat-assisted continuously-tempered Hamiltonian Monte Carlo, for Bayesian learning on large datasets and multimodal distributions. It simulates the Nosé-Hoover dynamics of a continuously-tempered Hamiltonian system built on the distribution of interest. A significant advantage of this method is that it is not only able to efficiently draw representative i.i.d. samples when the distribution contains multiple isolated modes, but is also capable of adaptively neutralising the noise arising from mini-batches and maintaining accurate sampling. While the properties of this method have been studied using synthetic distributions, experiments on three real datasets also demonstrate its performance gains over several strong baselines with various types of neural networks plugged in.
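    For orientation, below is a plain (untempered, un-thermostatted) Hamiltonian Monte Carlo step with a leapfrog integrator; the paper's method augments dynamics of this kind with a continuous tempering variable and Nosé-Hoover thermostats, which this sketch deliberately does not implement. All names and step sizes are illustrative.

```python
import numpy as np

def leapfrog(q, p, grad_logp, step, n_steps):
    """Leapfrog integration of dq/dt = p, dp/dt = grad log p(q)."""
    p = p + 0.5 * step * grad_logp(q)
    for _ in range(n_steps - 1):
        q = q + step * p
        p = p + step * grad_logp(q)
    q = q + step * p
    p = p + 0.5 * step * grad_logp(q)
    return q, p

def hmc_step(q, logp, grad_logp, step=0.1, n_steps=20):
    """One Metropolis-corrected HMC proposal from 1-D state array q."""
    p0 = np.random.standard_normal(q.shape)
    q1, p1 = leapfrog(q, p0, grad_logp, step, n_steps)
    # Metropolis correction with H(q, p) = -log p(q) + 0.5 * |p|^2.
    log_accept = (logp(q1) - 0.5 * (p1 @ p1)) - (logp(q) - 0.5 * (p0 @ p0))
    return q1 if np.log(np.random.uniform()) < log_accept else q
```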

    General Neural Networks Dynamics are a Superposition of Gradient-like and Hamiltonian-like Systems

    This report presents a formalism that enables the dynamics of a broad class of neural networks to be understood. A number of previous works have analyzed the Lyapunov stability of neural network models; this type of analysis shows that the excursion of the solutions from a stable point is bounded. The purpose of this work is to present a model of the dynamics that also describes the phase-space behavior as well as the structural stability of the system. This is achieved by writing the general equations of the neural network dynamics as the sum of gradient-like and Hamiltonian-like systems. In this paper some important properties of both gradient-like and Hamiltonian-like systems are developed, and it is then demonstrated that a broad class of neural network models is expressible in this form.
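    In symbols (the notation here is assumed, not the report's), such a decomposition writes the state dynamics as a dissipative gradient flow plus a conservative Hamiltonian flow:

```latex
% Gradient-like plus Hamiltonian-like decomposition of the dynamics.
% x(t): network state; V: potential (Lyapunov-like function); H: conserved
% quantity; J: skew-symmetric matrix (J^T = -J), so the second term
% preserves H while the first dissipates V.
\dot{x} \;=\; \underbrace{-\nabla V(x)}_{\text{gradient-like}}
        \;+\; \underbrace{J\,\nabla H(x)}_{\text{Hamiltonian-like}}
```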