
    Dynamic mode decomposition in vector-valued reproducing kernel Hilbert spaces for extracting dynamical structure among observables

    Understanding nonlinear dynamical systems (NLDSs) is challenging in a variety of engineering and scientific fields. Dynamic mode decomposition (DMD), a numerical algorithm for the spectral analysis of Koopman operators, has been attracting attention as a way of obtaining global modal descriptions of NLDSs without requiring explicit prior knowledge. However, since existing DMD algorithms are in principle formulated based on the concatenation of scalar observables, they are not directly applicable to data with dependent structures among observables, which take, for example, the form of a sequence of graphs. In this paper, we formulate Koopman spectral analysis for NLDSs with structures among observables and propose an estimation algorithm for this problem. The method can extract and visualize the underlying low-dimensional global dynamics of NLDSs with structures among observables from data, which can be useful in understanding the behavior of such systems. To this end, we first formulate the problem of estimating spectra of the Koopman operator defined in vector-valued reproducing kernel Hilbert spaces, and then develop an estimation procedure for this problem by reformulating tensor-based DMD. As a special case of our method, we propose Graph DMD, a numerical algorithm for Koopman spectral analysis of graph dynamical systems that operates on a sequence of adjacency matrices. We investigate the empirical performance of our method using synthetic and real-world data.
    Comment: 34 pages with 4 figures, Published in Neural Networks, 201
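
    For orientation, the sketch below shows plain (exact) DMD on snapshot matrices, the scalar-observable baseline this paper generalizes; it is not the paper's vector-valued RKHS or tensor-based Graph DMD formulation, and all names and parameters are illustrative.

```python
# A minimal exact-DMD sketch (standard-DMD baseline only, hypothetical names).
import numpy as np

def dmd(X, Y, rank):
    """Eigenvalues/modes of the best-fit linear map A with Y ~ A X,
    where columns of X, Y are snapshot pairs x_t, x_{t+1}."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)  # reduced operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W             # exact DMD modes
    return eigvals, modes

# Toy check: a linear system with known eigenvalues 0.9 and 0.5.
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 200))
Y = np.diag([0.9, 0.5]) @ X
eigvals, _ = dmd(X, Y, rank=2)
print(np.sort(eigvals.real))  # approximately [0.5, 0.9]
```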

    Eigendecompositions of Transfer Operators in Reproducing Kernel Hilbert Spaces

    Transfer operators such as the Perron–Frobenius or Koopman operator play an important role in the global analysis of complex dynamical systems. The eigenfunctions of these operators can be used to detect metastable sets, to project the dynamics onto the dominant slow processes, or to separate superimposed signals. We extend transfer operator theory to reproducing kernel Hilbert spaces and show that these operators are related to Hilbert space representations of conditional distributions, known as conditional mean embeddings in the machine learning community. Moreover, numerical methods to compute empirical estimates of these embeddings are akin to data-driven methods for the approximation of transfer operators such as extended dynamic mode decomposition and its variants. One main benefit of the presented kernel-based approaches is that these methods can be applied to any domain where a similarity measure given by a kernel is available. We illustrate the results with the aid of guiding examples and highlight potential applications in molecular dynamics as well as video and text data analysis.
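
    A minimal sketch of the kind of Gram-matrix eigenproblem such kernel transfer-operator estimates lead to, assuming a Gaussian kernel and simple Tikhonov regularization; the paper's exact estimators may differ in details such as centering and regularization.

```python
# A hedged kernel transfer-operator sketch: Koopman eigenvalues from Gram matrices.
import numpy as np

def gaussian_gram(A, B, sigma):
    """Gram matrix k(a_i, b_j) for a Gaussian kernel with bandwidth sigma."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def kernel_koopman_eigvals(X, Y, sigma=1.0, eps=1e-3):
    """Approximate Koopman eigenvalues from snapshot pairs (x_i, y_i),
    via a regularized Gram-matrix eigenproblem (kernel-EDMD style)."""
    n = X.shape[0]
    G0 = gaussian_gram(X, X, sigma)                     # k(x_i, x_j)
    G1 = gaussian_gram(X, Y, sigma)                     # k(x_i, y_j)
    K = np.linalg.solve(G0 + n * eps * np.eye(n), G1)   # regularized estimate
    return np.linalg.eigvals(K)

# Toy check: a contraction; the leading eigenvalue should sit near 1
# (the constant eigenfunction every Koopman operator admits).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(300, 2))
lams = kernel_koopman_eigvals(X, 0.8 * X)
print(np.sort(lams.real)[::-1][:3])
```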

    Kernel methods for detecting coherent structures in dynamical data

    We illustrate relationships between classical kernel-based dimensionality reduction techniques and eigendecompositions of empirical estimates of reproducing kernel Hilbert space (RKHS) operators associated with dynamical systems. In particular, we show that kernel canonical correlation analysis (CCA) can be interpreted in terms of kernel transfer operators and that it can be obtained by optimizing the variational approach for Markov processes (VAMP) score. As a result, we show that coherent sets of particle trajectories can be computed by kernel CCA. We demonstrate the efficiency of this approach with several examples, namely the well-known Bickley jet, ocean drifter data, and a molecular dynamics problem with a time-dependent potential. Finally, we propose a straightforward generalization of dynamic mode decomposition (DMD) called coherent mode decomposition (CMD). Our results provide a generic machine learning approach to the computation of coherent sets with an objective score that can be used for cross-validation and the comparison of different methods.
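
    A minimal regularized kernel CCA sketch over two kernel views (e.g. particle positions at an initial and a final time); the paper's full pipeline (VAMP scoring, coherent-set extraction from the eigenvectors, CMD) is not reproduced here, and all parameter choices are illustrative.

```python
# A hedged kernel CCA sketch; hypothetical helper names throughout.
import numpy as np

def gaussian_gram(A, B, sigma):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def center(G):
    """Center a Gram matrix in feature space."""
    n = G.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ G @ H

def kernel_cca(Gx, Gy, eps=1e-3):
    """Leading canonical correlations from regularized kernel CCA."""
    n = Gx.shape[0]
    Gx, Gy = center(Gx), center(Gy)
    Ax = np.linalg.solve(Gx + n * eps * np.eye(n), Gy)
    Ay = np.linalg.solve(Gy + n * eps * np.eye(n), Gx)
    rho2 = np.linalg.eigvals(Ax @ Ay).real     # squared canonical correlations
    return np.sqrt(np.clip(np.sort(rho2)[::-1], 0.0, 1.0))

# Toy usage: nearly coherent particles give near-1 leading correlations.
rng = np.random.default_rng(3)
P0 = rng.standard_normal((200, 2))               # positions at time 0
P1 = P0 + 0.1 * rng.standard_normal((200, 2))    # positions at time T
print(kernel_cca(gaussian_gram(P0, P0, 1.0), gaussian_gram(P1, P1, 1.0))[:5])
```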

    Online Estimation of the Koopman Operator Using Fourier Features

    Transfer operators offer linear representations and global, physically meaningful features of nonlinear dynamical systems. Discovering transfer operators, such as the Koopman operator, requires carefully crafted dictionaries of observables acting on states of the dynamical system. This is ad hoc and requires the full dataset for evaluation. In this paper, we offer an optimization scheme that allows joint learning of the observables and the Koopman operator with online data. Our results show we are able to reconstruct the evolution and represent the global features of complex dynamical systems.
    Comment: Accepted to the 5th L4DC Conference. Proceedings of The 5th Annual Learning for Dynamics and Control Conference, PMLR 211:1271-1283, 2023. 13 pages, 6 figures
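
    A minimal sketch with fixed random Fourier features and a recursive least-squares update, which captures the online flavor but not the paper's joint optimization of features and operator; all names and parameters are illustrative.

```python
# A hedged online sketch: fixed random Fourier features + recursive least squares.
import numpy as np

rng = np.random.default_rng(2)
d, D = 2, 100                        # state dimension, number of features
W = rng.standard_normal((D, d))      # random frequencies (Gaussian-kernel RFF)
b = rng.uniform(0.0, 2.0 * np.pi, D)

def phi(x):
    """Random Fourier feature map approximating a Gaussian kernel."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

K = np.zeros((D, D))                 # running Koopman-matrix estimate
P = 1e3 * np.eye(D)                  # RLS inverse-covariance matrix

def rls_update(x, y):
    """One recursive least-squares step toward K phi(x) ~ phi(y)."""
    global K, P
    px, py = phi(x), phi(y)
    g = P @ px / (1.0 + px @ P @ px)     # gain vector
    K += np.outer(py - K @ px, g)
    P -= np.outer(g, px @ P)

# Stream snapshot pairs from a toy stable linear system.
A = np.array([[0.9, 0.1], [0.0, 0.8]])
x = rng.standard_normal(d)
for _ in range(2000):
    y = A @ x
    rls_update(x, y)
    x = y if np.linalg.norm(y) > 1e-6 else rng.standard_normal(d)
print(np.linalg.norm(K @ phi(x) - phi(A @ x)))  # small residual if the fit is good
```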

    Koopman Kernel Regression

    Many machine learning approaches for decision making, such as reinforcement learning, rely on simulators or predictive models to forecast the time-evolution of quantities of interest, e.g., the state of an agent or the reward of a policy. Forecasts of such complex phenomena are commonly described by highly nonlinear dynamical systems, making their use in optimization-based decision-making challenging. Koopman operator theory offers a beneficial paradigm for addressing this problem by characterizing forecasts via linear time-invariant (LTI) ODEs, turning multi-step forecasts into sparse matrix multiplication. Though a variety of learning approaches exist, they usually lack crucial learning-theoretic guarantees, so the behavior of the obtained models with increasing data and dimensionality is unclear. We address these issues by deriving a universal Koopman-invariant reproducing kernel Hilbert space (RKHS) that solely spans transformations into LTI dynamical systems. The resulting Koopman Kernel Regression (KKR) framework enables the use of statistical learning tools from function approximation for novel convergence results and generalization error bounds under weaker assumptions than existing work. Our experiments demonstrate superior forecasting performance compared to Koopman operator and sequential data predictors in RKHS.
    Comment: Accepted to the thirty-seventh Conference on Neural Information Processing Systems (NeurIPS 2023)
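
    As a point of contrast, a minimal kernel ridge one-step predictor iterated for multi-step forecasts; unlike KKR, this baseline uses a generic Gaussian RKHS rather than one constrained to span LTI systems, and all parameter choices are illustrative.

```python
# A hedged baseline: kernel ridge one-step regression, iterated to forecast.
import numpy as np

def gaussian_gram(A, B, sigma):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

class KernelOneStep:
    def fit(self, X, Y, sigma=1.0, eps=1e-4):
        """Fit x_{t+1} ~ f(x_t) by kernel ridge regression on snapshot pairs."""
        self.X, self.sigma = X, sigma
        n = len(X)
        G = gaussian_gram(X, X, sigma)
        self.alpha = np.linalg.solve(G + n * eps * np.eye(n), Y)
        return self

    def forecast(self, x0, steps):
        """Iterate the learned one-step map for a multi-step forecast."""
        traj, x = [x0], x0
        for _ in range(steps):
            x = (gaussian_gram(x[None, :], self.X, self.sigma) @ self.alpha).ravel()
            traj.append(x)
        return np.array(traj)

# Toy usage on a slowly rotating, contracting linear system.
rng = np.random.default_rng(4)
A = np.array([[0.95, 0.05], [-0.05, 0.95]])
X = rng.standard_normal((400, 2))
model = KernelOneStep().fit(X, X @ A.T)
print(model.forecast(np.array([1.0, 0.0]), steps=5))
```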