A Unifying Review of Linear Gaussian Models
Factor analysis, principal component analysis, mixtures of gaussian clusters, vector quantization, Kalman filter models, and hidden Markov models can all be unified as variations of unsupervised learning under a single basic generative model. This is achieved by collecting together disparate observations and derivations made by many previous authors and by introducing a new way of linking discrete and continuous state models using a simple nonlinearity. Through the use of other nonlinearities, we show how independent component analysis is also a variation of the same basic generative model. We show that factor analysis and mixtures of gaussians can be implemented in autoencoder neural networks and learned using squared error plus the same regularization term. We introduce a new model for static data, known as sensible principal component analysis, as well as a novel concept of spatially adaptive observation noise. We also review some of the literature involving global and local mixtures of the basic models and provide pseudocode for inference and learning for all the basic models.
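The single basic generative model underlying all of these variants is a linear Gaussian state-space model: a latent state evolves linearly with Gaussian noise and is observed through a linear map with Gaussian noise. A minimal sketch of sampling from it follows; the particular matrices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Basic linear Gaussian generative model:
#   x_t = A x_{t-1} + w_t,  w_t ~ N(0, Q)   (latent dynamics)
#   y_t = C x_t + v_t,      v_t ~ N(0, R)   (observations)
# Setting A = 0 recovers factor analysis / PCA-style static models;
# a nonzero A gives the Kalman filter (linear dynamical system) model.

rng = np.random.default_rng(0)

def sample_lgm(A, C, Q, R, T):
    """Draw a length-T trajectory of latent states and observations."""
    k, p = A.shape[0], C.shape[0]
    x = np.zeros((T, k))
    y = np.zeros((T, p))
    for t in range(T):
        w = rng.multivariate_normal(np.zeros(k), Q)
        x[t] = (A @ x[t - 1] if t > 0 else np.zeros(k)) + w
        y[t] = C @ x[t] + rng.multivariate_normal(np.zeros(p), R)
    return x, y

A = np.array([[0.9, 0.1], [0.0, 0.8]])               # state dynamics
C = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])   # observation map
Q = 0.1 * np.eye(2)                                  # process noise
R = 0.2 * np.eye(3)                                  # observation noise

x, y = sample_lgm(A, C, Q, R, T=100)
print(x.shape, y.shape)  # (100, 2) (100, 3)
```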
Structure Learning in Coupled Dynamical Systems and Dynamic Causal Modelling
Identifying a coupled dynamical system out of many plausible candidates, each
of which could serve as the underlying generator of some observed measurements,
is a profoundly ill-posed problem that commonly arises when modelling
real-world phenomena. In this review, we detail a set of statistical procedures for
inferring the structure of nonlinear coupled dynamical systems (structure
learning), which has proved useful in neuroscience research. A key focus here
is the comparison of competing models of (i.e., hypotheses about) network
architectures and implicit coupling functions in terms of their Bayesian model
evidence. These methods are collectively referred to as dynamic causal
modelling (DCM). We focus on a relatively new approach that is proving
remarkably useful; namely, Bayesian model reduction (BMR), which enables rapid
evaluation and comparison of models that differ in their network architecture.
We illustrate the usefulness of these techniques through modelling
neurovascular coupling (cellular pathways linking neuronal and vascular
systems), whose function is an active focus of research in neurobiology and the
imaging of coupled neuronal systems.
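For orientation, the identity at the heart of Bayesian model reduction can be stated compactly. For a reduced model that shares the full model's likelihood but differs only in its priors (written here with tildes, e.g., priors that switch individual connections off), the reduced evidence follows from the full posterior without refitting:

```latex
\tilde{p}(y) \;=\; p(y) \int p(\theta \mid y)\,
\frac{\tilde{p}(\theta)}{p(\theta)}\, d\theta .
```

Scoring a large family of candidate network architectures therefore reduces to posterior expectations under the one fitted full model, which is what makes the rapid model comparison described above possible.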
Revealing networks from dynamics: an introduction
What can we learn from the collective dynamics of a complex network about its
interaction topology? Taking the perspective from nonlinear dynamics, we
briefly review recent progress on how to infer structural connectivity (direct
interactions) from accessing the dynamics of the units. Potential applications
range from interaction networks in physics, to chemical and metabolic
reactions, protein and gene regulatory networks as well as neural circuits in
biology and electric power grids or wireless sensor networks in engineering.
Moreover, we briefly mention some standard ways of inferring effective or
functional connectivity. (Topical review, 48 pages, 7 figures.)
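One of the simplest instances of this programme, inferring direct interactions by regressing observed derivatives on observed states, can be sketched as follows. The network and dynamics here are hypothetical, chosen purely for illustration, and the sketch assumes noise-free access to state derivatives.

```python
import numpy as np

rng = np.random.default_rng(1)
n, dt = 5, 0.01

# Hypothetical ground-truth network: each unit relaxes (-I) and receives
# a few directed couplings J; the dynamics are dx/dt = A x with A = -I + J.
J = np.zeros((n, n))
J[0, 1] = J[1, 2] = J[3, 0] = 0.5
A = -np.eye(n) + J

# Simulate many short trajectories (Euler steps) from random initial
# states, recording state/derivative pairs.
states, derivs = [], []
for _ in range(50):
    x = rng.normal(size=n)
    for _ in range(20):
        dx = A @ x
        states.append(x.copy())
        derivs.append(dx)
        x = x + dt * dx

X = np.array(states)
dX = np.array(derivs)

# Least-squares regression of derivatives on states recovers the
# interaction matrix, including which direct links are present.
A_hat = np.linalg.lstsq(X, dX, rcond=None)[0].T
print(np.allclose(A_hat, A, atol=1e-6))  # True
```

Real data adds observation noise, unobserved units, and nonlinearity, which is exactly where the methods surveyed in the review depart from this naive regression.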
Complexity Characterization in a Probabilistic Approach to Dynamical Systems Through Information Geometry and Inductive Inference
Information geometric techniques and inductive inference methods hold great
promise for solving computational problems of interest in classical and quantum
physics, especially with regard to complexity characterization of dynamical
systems in terms of their probabilistic description on curved statistical
manifolds. In this article, we investigate the possibility of describing the
macroscopic behavior of complex systems in terms of the underlying statistical
structure of their microscopic degrees of freedom by use of statistical
inductive inference and information geometry. We review the Maximum Relative
Entropy (MrE) formalism and the theoretical structure of the information
geometrodynamical approach to chaos (IGAC) on statistical manifolds. Special
focus is devoted to the description of the roles played by the sectional
curvature, the Jacobi field intensity and the information geometrodynamical
entropy (IGE). These quantities serve as powerful information geometric
complexity measures of information-constrained dynamics associated with
arbitrary chaotic and regular systems defined on the statistical manifold.
Finally, the application of such information geometric techniques to several
theoretical models is presented. (29 pages.)
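As a brief reminder of the MrE formalism this line of work builds on: one selects the posterior p that maximizes the relative entropy to a prior q subject to the available constraints,

```latex
S[p, q] \;=\; -\int p(x)\,\ln\frac{p(x)}{q(x)}\,dx ,
\qquad \text{subject to}\quad \int p(x)\,dx = 1,\quad
\int p(x)\, f_i(x)\, dx = F_i ,
```

whose maximizer is the exponential tilt of the prior, \(p(x) \propto q(x)\, e^{\sum_i \lambda_i f_i(x)}\), with the Lagrange multipliers \(\lambda_i\) fixed by the constraints.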