On Matching, and Even Rectifying, Dynamical Systems through Koopman Operator Eigenfunctions
Matching dynamical systems, through different forms of conjugacies and
equivalences, has long been a fundamental concept, and a powerful tool, in the
study and classification of nonlinear dynamic behavior (e.g. through normal
forms). In this paper we argue that the Koopman operator and its spectrum are
particularly well suited for this endeavor, both in theory and, especially, in
light of recent data-driven algorithmic developments. We
believe, and document through illustrative examples, that this can nontrivially
extend the use and applicability of the Koopman spectral theoretical and
computational machinery beyond modeling and prediction, towards what can be
considered as a systematic discovery of "Cole-Hopf-type" transformations for
dynamics. (34 pages, 10 figures)
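As a minimal illustration of the rectification idea (a toy example constructed for this sketch, not one taken from the paper): for dy/dt = -y + y^2, the observable phi(y) = y/(1-y) is a Koopman eigenfunction with eigenvalue -1, since phi'(y) f(y) = -phi(y), and it conjugates the nonlinear flow to the linear flow dx/dt = -x, so that phi(y(t)) = exp(-t) phi(y(0)).

```python
import numpy as np

# Toy sketch: a Koopman eigenfunction acting as a conjugacy that rectifies
# a nonlinear flow into a linear one. System and initial condition are
# illustrative assumptions for this example only.

def f(y):
    # nonlinear vector field dy/dt = -y + y^2
    return -y + y**2

def phi(y):
    # eigenfunction with eigenvalue -1: phi'(y) * f(y) = -phi(y)
    return y / (1.0 - y)

# integrate the nonlinear ODE with RK4
y0, T, dt = 0.4, 2.0, 1e-3
y = y0
for _ in range(int(round(T / dt))):
    k1 = f(y); k2 = f(y + 0.5*dt*k1); k3 = f(y + 0.5*dt*k2); k4 = f(y + dt*k3)
    y += (dt/6.0) * (k1 + 2*k2 + 2*k3 + k4)

# phi evaluated along the nonlinear flow evolves linearly: exp(-t)*phi(y0)
print(phi(y), np.exp(-T) * phi(y0))
```

Along the nonlinear trajectory, phi evolves exactly like the rectified linear system, which is the sense in which the eigenfunction "matches" the two dynamics.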
Path-Integral Formula for Computing Koopman Eigenfunctions
The paper is about the computation of the principal spectrum of the Koopman
operator (i.e., eigenvalues and eigenfunctions). The principal eigenfunctions
of the Koopman operator are the ones with the corresponding eigenvalues equal
to the eigenvalues of the linearization of the nonlinear system at an
equilibrium point. The main contribution of this paper is to provide a novel
approach for computing the principal eigenfunctions using a path-integral
formula. Furthermore, we provide conditions, based on the stability properties
of the dynamical system and the eigenvalues of its linearization, under which
the principal eigenfunctions can be computed via the path-integral formula. We
also provide a deep neural network framework that utilizes our proposed
path-integral approach for eigenfunction computation in high-dimensional systems.
Finally, we present simulation results for the computation of principal
eigenfunctions and demonstrate their application in determining stable and
unstable manifolds and in constructing Lyapunov functions.
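A commonly used form of such a path-integral formula writes the principal eigenfunction for eigenvalue lambda, with left eigenvector w of the linearization A (so w^T A = lambda w^T), as phi(x) = w^T x + \int_0^inf e^{-lambda s} w^T (f(x(s)) - A x(s)) ds, where x(s) is the trajectory starting at x. The sketch below evaluates this on a toy planar system with a known closed-form eigenfunction; the system, step size, and horizon are illustrative assumptions, not the paper's examples.

```python
import numpy as np

def f(x):
    # toy system: dx1/dt = -x1 + x2^2, dx2/dt = -2 x2 (assumption)
    return np.array([-x[0] + x[1]**2, -2.0 * x[1]])

A = np.array([[-1.0, 0.0], [0.0, -2.0]])  # linearization at the origin
lam = -1.0                                 # principal eigenvalue
w = np.array([1.0, 0.0])                   # left eigenvector: w^T A = lam w^T

def principal_eigenfunction(x0, T=20.0, dt=1e-3):
    """Approximate phi_lam(x0) via the path-integral formula along an RK4
    trajectory; the integral converges here since the nonlinear residual
    decays faster than e^{lam t} grows back."""
    x = np.array(x0, dtype=float)
    phi = float(w @ x)
    for k in range(int(round(T / dt))):
        t = k * dt
        # accumulate e^{-lam t} w^T (f(x) - A x) dt  (left Riemann sum)
        phi += np.exp(-lam * t) * float(w @ (f(x) - A @ x)) * dt
        # RK4 step for the trajectory itself
        k1 = f(x); k2 = f(x + 0.5*dt*k1); k3 = f(x + 0.5*dt*k2); k4 = f(x + dt*k3)
        x = x + (dt/6.0) * (k1 + 2*k2 + 2*k3 + k4)
    return phi

x0 = [0.5, 0.8]
# for this toy system the eigenfunction is known: phi(x) = x1 + x2^2 / 3
print(principal_eigenfunction(x0), x0[0] + x0[1]**2 / 3)
```

The numerical value agrees with the closed form x1 + x2^2/3, which one can verify directly from the eigenvalue PDE grad(phi) . f = lam phi.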
Learning Koopman eigenfunctions for prediction and control: the transient case
This work presents a data-driven framework for learning eigenfunctions of the
Koopman operator geared toward prediction and control. The method relies on the
richness of the spectrum of the Koopman operator in the transient,
off-attractor regime to construct a large number of eigenfunctions such that
the state (or any other observable quantity of interest) is in the span of
these eigenfunctions and hence predictable in a linear fashion. Once a
predictor for the uncontrolled part of the system is obtained in this way, the
incorporation of control is done through a multi-step prediction error
minimization, carried out by a simple linear least-squares regression. The
predictor so obtained is in the form of a linear controlled dynamical system
and can be readily applied within the Koopman model predictive control
framework of [11] to control nonlinear dynamical systems using linear model
predictive control tools. The method is entirely data-driven and based purely
on convex optimization, with no reliance on neural networks or other non-convex
machine learning tools. The novel eigenfunction construction method is also
analyzed theoretically, proving rigorously that the family of eigenfunctions
obtained is rich enough to span the space of all continuous functions. In
addition, the method is extended to construct generalized eigenfunctions that
also give rise to Koopman-invariant subspaces and hence can be used for linear
prediction. Detailed numerical examples demonstrate the approach, both for
prediction and feedback control.
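The control-fitting step can be caricatured as follows: once lifted (eigenfunction) features z with a known diagonal eigenvalue matrix Lambda are available, the input matrix B enters the predictor z+ = Lambda z + B u linearly, so it can be recovered by ordinary least squares. The sketch below is a simplified one-step stand-in for the paper's multi-step prediction-error minimization, run on synthetic data; the dimensions and eigenvalues are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# assumed eigenvalues of the uncontrolled lifted dynamics (diagonal Lambda)
Lam = np.diag([0.9, 0.8, 0.5])
B_true = rng.normal(size=(3, 1))   # synthetic ground-truth input matrix

# synthetic lifted data generated by z_{k+1} = Lam z_k + B u_k
N = 200
Z = rng.normal(size=(3, N))        # lifted states z_k
U = rng.normal(size=(1, N))        # inputs u_k
Zp = Lam @ Z + B_true @ U          # successors z_{k+1}

# convex least-squares fit for B: minimize ||Zp - Lam Z - B U||_F
B_fit = (Zp - Lam @ Z) @ U.T @ np.linalg.inv(U @ U.T)
print(np.allclose(B_fit, B_true))
```

Because the residual Zp - Lam Z is linear in B, this is a plain linear regression with a closed-form solution, consistent with the abstract's emphasis on convex optimization.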
An Emergent Space for Distributed Data with Hidden Internal Order through Manifold Learning
Manifold-learning techniques are routinely used in mining complex
spatiotemporal data to extract useful, parsimonious data
representations/parametrizations; these are, in turn, useful in nonlinear model
identification tasks. We focus here on the case of time series data that can
ultimately be modelled as a spatially distributed system (e.g. a partial
differential equation, PDE), but where we do not know the space in which this
PDE should be formulated. Hence, even the spatial coordinates for the
distributed system themselves need to be identified, i.e. to emerge from the
data-mining process. We will first validate this emergent space reconstruction for
time series sampled without space labels in known PDEs; this brings up the
issue of observability of physical space from temporal observation data, and
the transition from spatially resolved to lumped (order-parameter-based)
representations by tuning the scale of the data mining kernels. We will then
present actual emergent space discovery illustrations. Our illustrative
examples include chimera states (states of coexisting coherent and incoherent
dynamics), and chaotic as well as quasiperiodic spatiotemporal dynamics,
arising in partial differential equations and/or in heterogeneous networks. We
also discuss how data-driven spatial coordinates can be extracted in ways
invariant to the nature of the measuring instrument. Such gauge-invariant data
mining can go beyond the fusion of heterogeneous observations of the same
system, to the possible matching of apparently different systems.
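A minimal sketch of the emergent-space idea, using a bare-bones diffusion-map embedding: time series are measured at spatial points whose labels have been shuffled away, and the leading nontrivial diffusion coordinate recovers an ordering that tracks the hidden spatial coordinate. The travelling-wave signal and the kernel scale are illustrative assumptions, not the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 30
x_hidden = rng.permutation(np.linspace(0.0, 0.4, n))  # hidden space labels
t = np.linspace(0.0, 10.0, 400)
# each "sensor" sees a travelling wave, phase-shifted by its hidden location
U = np.sin(2 * np.pi * (0.3 * t[None, :] + x_hidden[:, None]))

# pairwise squared distances between whole time series
D2 = ((U[:, None, :] - U[None, :, :])**2).sum(-1)
eps = np.median(D2)                 # kernel scale (heuristic assumption)
W = np.exp(-D2 / eps)

# symmetrically normalized diffusion operator and its spectrum
d = W.sum(1)
S = W / np.sqrt(d[:, None] * d[None, :])
vals, vecs = np.linalg.eigh(S)      # eigenvalues in ascending order
psi1 = vecs[:, -2] / np.sqrt(d)     # first nontrivial diffusion coordinate

corr = np.corrcoef(psi1, x_hidden)[0, 1]
print(abs(corr))                    # close to 1: emergent coordinate ~ space
```

The sign of an eigenvector is arbitrary, so only |corr| is meaningful; a value near 1 says the data-driven coordinate parametrizes the same one-dimensional "space" the sensors secretly lived on.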