
    Balanced data assimilation for highly-oscillatory mechanical systems

    Data assimilation algorithms are used to estimate the states of a dynamical system using partial and noisy observations. The ensemble Kalman filter has become a popular data assimilation scheme due to its simplicity and robustness for a wide range of application areas. Nevertheless, the ensemble Kalman filter also has limitations due to its inherent Gaussian and linearity assumptions. These limitations can manifest themselves in dynamically inconsistent state estimates. In this paper we investigate this issue for highly oscillatory Hamiltonian systems whose dynamical behavior satisfies certain balance relations. We first demonstrate that the standard ensemble Kalman filter can lead to estimates which do not satisfy those balance relations, ultimately leading to filter divergence. We then propose two remedies for this phenomenon in terms of blended time-stepping schemes and ensemble-based penalty methods. The effects of these modifications to the standard ensemble Kalman filter are discussed and demonstrated numerically for two model scenarios. First, we consider balanced motion for highly oscillatory Hamiltonian systems and, second, we investigate thermally embedded highly oscillatory Hamiltonian systems. The first scenario is relevant for applications from meteorology, while the second is relevant for applications of data assimilation to molecular dynamics.
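
    For orientation, here is a minimal sketch of the standard stochastic ensemble Kalman filter analysis step that the paper starts from; it omits the blended time-stepping and penalty modifications the abstract proposes, and the observation operator H, noise covariance R and ensemble layout are assumptions.

```python
# Hedged sketch: a generic stochastic EnKF analysis step with perturbed
# observations. H (linear observation operator), R (observation noise
# covariance) and the ensemble layout are illustrative assumptions; the
# paper's balance-preserving modifications are not included here.
import numpy as np

def enkf_analysis(forecast, y_obs, H, R, rng):
    """forecast: (n_state, n_members) array; returns the analysis ensemble."""
    n_mem = forecast.shape[1]
    X = forecast - forecast.mean(axis=1, keepdims=True)   # state anomalies
    Y = H @ X                                              # observed anomalies
    P_yy = Y @ Y.T / (n_mem - 1) + R                       # innovation covariance
    P_xy = X @ Y.T / (n_mem - 1)                           # cross covariance
    K = P_xy @ np.linalg.inv(P_yy)                         # Kalman gain
    # perturbed observations keep the analysis ensemble spread consistent
    y_pert = y_obs[:, None] + rng.multivariate_normal(
        np.zeros(len(y_obs)), R, size=n_mem).T
    return forecast + K @ (y_pert - H @ forecast)
```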

    Optimization on manifolds: A symplectic approach

    There has been great interest in using tools from dynamical systems and numerical analysis of differential equations to understand and construct new optimization methods. In particular, recently a new paradigm has emerged that applies ideas from mechanics and geometric integration to obtain accelerated optimization methods on Euclidean spaces. This has important consequences given that accelerated methods are the workhorses behind many machine learning applications. In this paper, we build upon these advances and propose a framework for dissipative and constrained Hamiltonian systems that is suitable for solving optimization problems on arbitrary smooth manifolds. Importantly, this allows us to leverage the well-established theory of symplectic integration to derive "rate-matching" dissipative integrators. This brings a new perspective to optimization on manifolds whereby convergence guarantees follow by construction from classical arguments in symplectic geometry and backward error analysis. Moreover, we construct two dissipative generalizations of leapfrog that are straightforward to implement: one for Lie groups and homogeneous spaces, which relies on the tractable geodesic flow or a retraction thereof, and the other for constrained submanifolds, which is based on a dissipative generalization of the famous RATTLE integrator.
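
    As a toy illustration of the dissipative leapfrog idea, restricted to flat Euclidean space rather than the Lie-group or constrained settings of the paper, the following sketch splits each step into an exact friction flow followed by a standard leapfrog step; the friction rate gamma, step size h and quadratic test objective are assumed.

```python
# Hedged sketch: conformal (dissipative) leapfrog on flat Euclidean space,
# not the paper's Lie-group or constrained-submanifold integrators.
# gamma, h and the quadratic test objective are assumptions for the demo.
import numpy as np

def dissipative_leapfrog(grad_f, q, p, h=0.1, gamma=1.0, n_steps=200):
    a = np.exp(-gamma * h)              # exact flow of the linear friction part
    for _ in range(n_steps):
        p = a * p                       # dissipation (conformal) substep
        p = p - 0.5 * h * grad_f(q)     # half kick
        q = q + h * p                   # drift
        p = p - 0.5 * h * grad_f(q)     # half kick
    return q, p

# toy usage: minimise the quadratic f(q) = 0.5 * q @ A @ q
A = np.diag([1.0, 10.0])
q_min, _ = dissipative_leapfrog(lambda q: A @ q, np.array([2.0, -1.5]), np.zeros(2))
```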

    Multi-symplectic discretisation of wave map equations

    We present a new multi-symplectic formulation of constrained Hamiltonian partial differential equations, and we study the associated local conservation laws. A multi-symplectic discretisation based on this new formulation is exemplified by means of the Euler box scheme. When applied to the wave map equation, this numerical scheme is explicit, preserves the constraint and can be seen as a generalisation of the SHAKE algorithm for constrained mechanical systems. Furthermore, numerical experiments show excellent conservation properties of the numerical solutions.
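
    To illustrate the constraint-preserving mechanism in the simplest setting, the following is a hedged SHAKE-style sketch for a sphere-valued field with |u| = 1 on a periodic 1+1-dimensional grid; it is not the paper's Euler box scheme.

```python
# Hedged sketch: a SHAKE-style constrained leapfrog step for a sphere-valued
# wave map u(t, x) with |u| = 1 on a periodic 1+1-dimensional grid. This
# illustrates the constraint-preserving mechanism, not the paper's Euler box
# scheme itself; the grid layout and multiplier solve are assumptions.
import numpy as np

def shake_wave_map_step(u, u_prev, dt, dx):
    """u, u_prev: (N, 3) unit-vector fields at the current and previous step."""
    lap = (np.roll(u, -1, axis=0) - 2.0 * u + np.roll(u, 1, axis=0)) / dx**2
    u_pred = 2.0 * u - u_prev + dt**2 * lap          # unconstrained leapfrog step
    # per-node multiplier: choose lambda so that |u_pred + lambda * u| = 1
    b = np.sum(u_pred * u, axis=1)
    c = np.sum(u_pred * u_pred, axis=1) - 1.0
    lam = -b + np.sqrt(np.maximum(b * b - c, 0.0))   # root closest to zero
    return u_pred + lam[:, None] * u
```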

    Discrete mechanics and variational integrators

    This paper gives a review of integration algorithms for finite-dimensional mechanical systems that are based on discrete variational principles. The variational technique gives a unified treatment of many symplectic schemes, including those of higher order, as well as a natural treatment of the discrete Noether theorem. The approach also allows us to include forces, dissipation and constraints in a natural way. Amongst the many specific schemes treated as examples, the Verlet, SHAKE, RATTLE, Newmark, and the symplectic partitioned Runge–Kutta schemes are presented.
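
    As a minimal worked example of the discrete variational viewpoint, the following sketch iterates the discrete Euler–Lagrange equations for a rectangle-rule discrete Lagrangian, which reduce exactly to the Störmer–Verlet recursion; the mass, potential and pendulum test case are assumed.

```python
# Hedged sketch: the rectangle-rule discrete Lagrangian
#   L_d(q_k, q_{k+1}) = h * (0.5 * m * |(q_{k+1} - q_k) / h|**2 - V(q_k))
# has discrete Euler-Lagrange equations that reduce to Störmer-Verlet.
# The mass, potential and pendulum test case are illustrative assumptions.
import numpy as np

def verlet_from_del(grad_V, q0, q1, h, n_steps, m=1.0):
    traj = [q0, q1]
    for _ in range(n_steps):
        q_prev, q = traj[-2], traj[-1]
        # discrete Euler-Lagrange: D2 L_d(q_prev, q) + D1 L_d(q, q_next) = 0
        traj.append(2.0 * q - q_prev - (h**2 / m) * grad_V(q))
    return np.array(traj)

# usage: planar pendulum with V(q) = -cos(q), started near rest
path = verlet_from_del(np.sin, q0=1.0, q1=1.0, h=0.05, n_steps=400)
```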

    Optimization via conformal Hamiltonian systems on manifolds

    In this work we propose a method to perform optimization on manifolds. We assume that we are given an objective function f defined on a manifold and think of it as the potential energy of a mechanical system. By adding a momentum-dependent kinetic energy we define its Hamiltonian function, which allows us to write the corresponding Hamiltonian system. We make it conformal by introducing a dissipation term: the result is the continuous model of our scheme. We solve it via splitting methods (Lie-Trotter and leapfrog): we combine the RATTLE scheme, which approximates the conservative flow, with the exact dissipative flow. The result is a conformal symplectic method for constant stepsizes. We also propose an adaptive stepsize version of it. We test it on an example, the minimization of a function defined on a sphere, and compare it with the usual gradient descent method.
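
    As an illustration of the splitting described above, the following sketch performs one step on the unit sphere |q| = 1 for H(q, p) = p·p/2 + f(q): the dissipation flow applied exactly, composed with a RATTLE step; the friction rate, step size and linear test objective are assumed, and the adaptive-stepsize variant is not shown.

```python
# Hedged sketch: one Lie-Trotter step of a conformal symplectic method on the
# unit sphere |q| = 1 for H(q, p) = p @ p / 2 + f(q): exact dissipation flow
# composed with a RATTLE step. gamma, h and the linear test objective are
# assumptions; the adaptive-stepsize variant is not shown.
import numpy as np

def conformal_rattle_step(grad_f, q, p, h, gamma):
    p = np.exp(-gamma * h) * p                     # exact dissipation flow
    # RATTLE position update: a multiplier keeps q_new on the sphere
    r = q + h * p - 0.5 * h**2 * grad_f(q)
    rq = r @ q
    s = rq - np.sqrt(rq**2 - (r @ r - 1.0))        # root closest to zero
    q_new = r - s * q
    p_half = (q_new - q) / h
    # RATTLE momentum update: project onto the tangent space at q_new
    p_full = p_half - 0.5 * h * grad_f(q_new)
    return q_new, p_full - (q_new @ p_full) * q_new

# usage sketch: minimise f(q) = c @ q on the sphere (minimiser is -c / |c|)
c = np.array([1.0, 2.0, 2.0])
q, p = np.array([1.0, 0.0, 0.0]), np.zeros(3)
for _ in range(500):
    q, p = conformal_rattle_step(lambda x: c, q, p, h=0.05, gamma=1.0)
```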

    Mechanical Systems with Symmetry, Variational Principles, and Integration Algorithms

    This paper studies variational principles for mechanical systems with symmetry and their applications to integration algorithms. We recall some general features of how to reduce variational principles in the presence of a symmetry group, along with general features of integration algorithms for mechanical systems. Then we describe some integration algorithms based directly on variational principles using a discretization technique of Veselov. The general idea for these variational integrators is to directly discretize Hamilton’s principle rather than the equations of motion in a way that preserves the original system’s invariants, notably the symplectic form and, via a discrete version of Noether’s theorem, the momentum map. The resulting mechanical integrators are second-order accurate, implicit, symplectic-momentum algorithms. We apply these integrators to the rigid body and the double spherical pendulum to show that the techniques are competitive with existing integrators.
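
    As a small worked example of discretizing Hamilton's principle directly (under assumed choices, not the paper's rigid-body or double-spherical-pendulum applications), the following sketch uses the midpoint-rule discrete Lagrangian and solves the resulting implicit discrete Euler–Lagrange equation for the next configuration.

```python
# Hedged sketch: a variational integrator from the midpoint-rule discrete
# Lagrangian L_d(q0, q1) = h * (0.5 * m * |(q1 - q0) / h|**2 - V((q0 + q1) / 2)).
# The implicit discrete Euler-Lagrange equation is solved numerically; the
# mass, potential and step size are illustrative assumptions.
import numpy as np
from scipy.optimize import root

def del_step(grad_V, q_prev, q_curr, h, m=1.0):
    def residual(q_next):
        # D2 L_d(q_prev, q_curr) + D1 L_d(q_curr, q_next) = 0
        return (m * (q_curr - q_prev) / h
                - 0.5 * h * grad_V(0.5 * (q_prev + q_curr))
                - m * (q_next - q_curr) / h
                - 0.5 * h * grad_V(0.5 * (q_curr + q_next)))
    return root(residual, 2.0 * q_curr - q_prev).x   # free-flight initial guess

# usage: one step of a planar pendulum with V(q) = -cos(q)
q2 = del_step(np.sin, np.array([1.0]), np.array([0.999]), h=0.05)
```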

    Stochastic Variational Integrators

    This paper presents a continuous and discrete Lagrangian theory for stochastic Hamiltonian systems on manifolds. The main result is to derive stochastic governing equations for such systems from a critical point of a stochastic action. Using this result, the paper derives Langevin-type equations for constrained mechanical systems and implements a stochastic analog of Lagrangian reduction. These are easy consequences of the fact that the stochastic action is intrinsically defined. Stochastic variational integrators (SVIs) are developed using a discretized stochastic variational principle. The paper shows that the discrete flow of an SVI is a.s. symplectic and in the presence of symmetry a.s. momentum-map preserving. A first-order mean-square convergent SVI for mechanical systems on Lie groups is introduced. As an application of the theory, SVIs are exhibited for multiple randomly forced and torqued rigid bodies interacting via a potential.
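
    The following sketch is not the paper's construction, but it shows the kind of update such integrators produce for a randomly forced Hamiltonian system dq = p dt, dp = -grad V(q) dt + sigma dW: a leapfrog step in which both half kicks share the same Wiener increment; the noise amplitude and step size are assumed.

```python
# Hedged sketch, not the paper's construction: a stochastic leapfrog step for
# the randomly forced Hamiltonian system dq = p dt, dp = -grad V(q) dt + sigma dW.
# sigma, h and the handling of the Wiener increment are illustrative assumptions.
import numpy as np

def stochastic_leapfrog_step(grad_V, q, p, h, sigma, rng):
    dW = np.sqrt(h) * rng.standard_normal(np.shape(q))   # Wiener increment
    p = p - 0.5 * h * grad_V(q) + 0.5 * sigma * dW       # half kick + half noise
    q = q + h * p                                        # drift
    p = p - 0.5 * h * grad_V(q) + 0.5 * sigma * dW       # half kick + half noise
    return q, p
```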

    Nonholonomic Dynamics

    Nonholonomic systems are, roughly speaking, mechanical systems with constraints on their velocity that are not derivable from position constraints. They arise, for instance, in mechanical systems that have rolling contact (for example, the rolling of wheels without slipping) or certain kinds of sliding contact (such as the sliding of skates). They are a remarkable generalization of classical Lagrangian and Hamiltonian systems in which one allows position constraints only. There are some fascinating differences between nonholonomic systems and classical Hamiltonian or Lagrangian systems. Among other things: nonholonomic systems are nonvariational, arising from the Lagrange-d’Alembert principle and not from Hamilton’s principle; while energy is preserved for nonholonomic systems, momentum is not always preserved for systems with symmetry (i.e., there is nontrivial dynamics associated with the nonholonomic generalization of Noether’s theorem); nonholonomic systems are almost Poisson but not Poisson (i.e., there is a bracket that together with the energy on the phase space defines the motion, but the bracket generally does not satisfy the Jacobi identity); and finally, unlike the Hamiltonian setting, volume may not be preserved in the phase space, leading to interesting asymptotic stability in some cases, despite energy conservation. The purpose of this article is to engage the reader’s interest by highlighting some of these differences along with some current research in the area. There has been some confusion in the literature for quite some time over issues such as the variational character of nonholonomic systems, so it is appropriate that we begin with a brief review of the history of the subject.
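
    A concrete example of these effects is the Chaplygin sleigh, a knife edge whose centre of mass is offset from the contact point: its reduced Lagrange-d'Alembert equations conserve energy, yet the motion settles into straight-line sliding. The following sketch integrates those reduced equations under assumed parameters.

```python
# Hedged sketch: the Chaplygin sleigh (a knife edge with offset centre of
# mass), a standard nonholonomic example. Its reduced Lagrange-d'Alembert
# equations conserve the energy 0.5*m*v**2 + 0.5*(J + m*a**2)*w**2 while the
# motion still settles into straight-line sliding. Parameters and the RK4
# time stepping are assumptions made for the demonstration.
import numpy as np

def sleigh_rhs(state, m=1.0, J=1.0, a=0.5):
    v, w = state                       # forward speed and angular velocity
    return np.array([a * w**2, -m * a * v * w / (J + m * a**2)])

def rk4_step(f, y, h):
    k1 = f(y); k2 = f(y + 0.5 * h * k1)
    k3 = f(y + 0.5 * h * k2); k4 = f(y + h * k3)
    return y + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([0.5, 1.0])           # initial (v, w)
for _ in range(2000):
    state = rk4_step(sleigh_rhs, state, 0.01)
```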

    Asynchronous Variational Contact Mechanics

    An asynchronous, variational method for simulating elastica in complex contact and impact scenarios is developed. Asynchronous Variational Integrators (AVIs) are extended to handle contact forces by associating different time steps to forces instead of to spatial elements. By discretizing a barrier potential as an infinite sum of nested quadratic potentials, these extended AVIs are used to resolve contact while obeying momentum- and energy-conservation laws. A series of two- and three-dimensional examples illustrates the robustness and good energy behavior of the method.
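
    The following hedged sketch illustrates the nested-penalty idea on its own: a contact barrier approximated by a sum of progressively stiffer quadratic penalties that activate at smaller and smaller gaps; the thresholds, stiffnesses and layer count are assumptions, and the asynchronous per-layer time stepping is not shown.

```python
# Hedged sketch of the nested-penalty idea: a contact barrier approximated by
# a sum of progressively stiffer quadratic penalties, each activating at a
# smaller gap. Thresholds eta / 2**l, stiffness growth r and the layer count
# are assumptions, not the paper's parameters; the asynchronous per-layer
# time stepping is not shown.
def nested_penalty_force(gap, n_layers=8, eta=1e-2, k=1e3, r=4.0):
    """Repulsive force on a point whose gap to the obstacle is `gap`."""
    force = 0.0
    for l in range(n_layers):
        threshold = eta / 2.0**l                  # each layer activates at a smaller gap
        if gap >= threshold:
            break                                 # deeper layers are inactive too
        force += k * r**l * (threshold - gap)     # quadratic potential -> linear force
    return force
```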