
    Diffusion Maps Kalman Filter for a Class of Systems with Gradient Flows

    In this paper, we propose a non-parametric method for state estimation of high-dimensional nonlinear stochastic dynamical systems, which evolve according to gradient flows with isotropic diffusion. We combine diffusion maps, a manifold learning technique, with a linear Kalman filter and with concepts from Koopman operator theory. More concretely, using diffusion maps, we construct data-driven virtual state coordinates, which linearize the system model. Based on these coordinates, we devise a data-driven framework for state estimation using the Kalman filter. We demonstrate the strengths of our method with respect to both parametric and non-parametric algorithms in three tracking problems. In particular, applying the approach to actual recordings of hippocampal neural activity in rodents directly yields a representation of the position of the animals. We show that the proposed method outperforms competing non-parametric algorithms in the examined stochastic problem formulations. Additionally, we obtain results comparable to classical parametric algorithms, which, in contrast to our method, are equipped with model knowledge. Comment: 15 pages, 12 figures, submitted to IEEE TS
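    The abstract outlines a two-stage recipe: learn data-driven coordinates with diffusion maps, then run a linear Kalman filter in those coordinates. The sketch below illustrates that recipe in its most generic form; it is not the authors' implementation, and the kernel bandwidth eps, the number of coordinates, and the filter matrices A, C, Q, R are placeholder assumptions.

```python
import numpy as np

def diffusion_map_coords(X, eps=1.0, n_coords=2):
    """Data-driven coordinates via diffusion maps.

    X: array of shape (n_samples, dim). A Gaussian kernel is row-normalized
    into a Markov matrix whose leading non-trivial eigenvectors serve as the
    "virtual" state coordinates the abstract refers to.
    """
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / eps)
    P = K / K.sum(axis=1, keepdims=True)             # row-stochastic transition matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)[1:n_coords + 1]   # drop the trivial constant eigenvector
    return (vecs[:, order] * vals[order]).real       # eigenvalue-weighted embedding

def kalman_filter(zs, A, C, Q, R, x0, P0):
    """Textbook linear Kalman filter applied to observations zs of shape (T, m)."""
    x, P, estimates = x0, P0, []
    I = np.eye(len(x0))
    for z in zs:
        x, P = A @ x, A @ P @ A.T + Q                # predict
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)               # Kalman gain
        x, P = x + K @ (z - C @ x), (I - K @ C) @ P  # update
        estimates.append(x.copy())
    return np.array(estimates)
```

    In the paper's setting the linear model in the virtual coordinates is itself estimated from data; here the filter matrices are simply supplied by the caller.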

    Robust Independent Component Analysis via Minimum Divergence Estimation

    Independent component analysis (ICA) has been shown to be useful in many applications. However, most ICA methods are sensitive to data contamination and outliers. In this article we introduce a general minimum U-divergence framework for ICA, which covers some standard ICA methods as special cases. Within the U-family we further focus on the gamma-divergence due to its desirable property of super robustness, which yields the proposed method, gamma-ICA. Statistical properties and technical conditions for the consistency of gamma-ICA are rigorously studied. In the limiting case, this analysis leads to a necessary and sufficient condition for the consistency of MLE-ICA, which is weaker than the condition known in the literature. Since the parameter of interest in ICA is an orthogonal matrix, a geometric algorithm based on gradient flows on the special orthogonal group is introduced to implement gamma-ICA. Furthermore, a data-driven selection of the gamma value, which is critical to the performance of gamma-ICA, is developed. The performance, and especially the robustness, of gamma-ICA in comparison with standard ICA methods is demonstrated through experimental studies using simulated data and image data. Comment: 7 figures
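    Since the abstract does not spell out the gamma-divergence contrast, the sketch below only illustrates the geometric ingredient it mentions: gradient descent on the special orthogonal group via a skew-symmetric Lie-algebra direction and a matrix-exponential retraction. The kurtosis-style contrast, step size, and function name are stand-in assumptions, not the paper's gamma-ICA objective.

```python
import numpy as np
from scipy.linalg import expm

def ica_gradient_flow_on_SOn(X, n_iters=200, step=0.01):
    """Gradient-flow ICA update on the special orthogonal group SO(n).

    X: whitened data of shape (n_components, n_samples).
    The contrast below is a simple kurtosis-style placeholder, NOT the
    gamma-divergence objective of the paper; only the manifold update
    (skew-symmetric direction + matrix-exponential retraction) illustrates
    the geometric algorithm the abstract mentions.
    """
    n, T = X.shape
    W = np.eye(n)                           # start at the identity in SO(n)
    for _ in range(n_iters):
        Y = W @ X
        G = (4.0 / T) * (Y ** 3) @ X.T      # Euclidean gradient of sum_i E[y_i^4]
        A = G @ W.T - W @ G.T               # skew-symmetric Lie-algebra direction
        W = expm(-step * A) @ W             # retraction keeps W exactly orthogonal
    return W
```

    On whitened mixtures, W @ X then recovers the sources up to sign and permutation when the placeholder contrast happens to suit them; in gamma-ICA the gradient G would instead come from the gamma-divergence objective.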

    Deep Learning in a Generalized HJM-type Framework Through Arbitrage-Free Regularization

    We introduce a regularization approach to arbitrage-free factor-model selection. The considered model selection problem seeks to learn the closest arbitrage-free HJM-type model to any prespecified factor model. An asymptotic solution to this a priori computationally intractable problem is represented as the limit of a 1-parameter family of optimizers of computationally tractable model-selection tasks. Each of these simplified tasks seeks to learn the model most similar to the prescribed factor model, subject to a penalty detecting when the reference measure is a local martingale measure for the entire underlying financial market. A simple expression for the penalty terms is obtained in the bond market within the affine term-structure setting, and it is used to formulate a deep-learning approach to arbitrage-free affine term-structure modelling. Numerical implementations are also performed to evaluate the performance in the bond market. Comment: 23 Pages + Reference
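    As a rough, single-factor illustration of the penalty idea only (not the paper's affine-term-structure expression or its deep-learning formulation), the sketch below scores a discretized forward-rate model by its distance to a reference factor model plus a penalty on violations of the HJM drift condition alpha(t, T) = sigma(t, T) * \int_t^T sigma(t, u) du. All names, array shapes, and the crude left-endpoint integration are assumptions.

```python
import numpy as np

def hjm_drift_penalty(alpha, sigma, dT):
    """Squared residual of the HJM no-arbitrage drift condition on a grid.

    alpha, sigma: arrays of shape (n_times, n_maturities) holding the
    forward-rate drift and single-factor volatility surfaces.  Under the
    risk-neutral measure the HJM condition requires
        alpha(t, T) = sigma(t, T) * int_t^T sigma(t, u) du,
    so the mean squared residual measures how far a candidate model is
    from being arbitrage-free.  The left-endpoint cumulative sum is a
    deliberately crude discretization of the inner integral.
    """
    cum_sigma = np.cumsum(sigma, axis=1) * dT       # approx. int_t^T sigma(t, u) du
    residual = alpha - sigma * cum_sigma
    return np.mean(residual ** 2)

def regularized_selection_loss(alpha, sigma, alpha_ref, sigma_ref, lam, dT):
    """Distance to a reference factor model plus the arbitrage penalty,
    mirroring the 'closest model subject to a no-arbitrage penalty' idea."""
    fit = np.mean((alpha - alpha_ref) ** 2) + np.mean((sigma - sigma_ref) ** 2)
    return fit + lam * hjm_drift_penalty(alpha, sigma, dT)
```

    In the paper the candidate models are parameterized by neural networks and the penalty enters through a 1-parameter family of regularized problems; here lam is just a fixed weight.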