
    Mathematical control of complex systems


    A Unifying Review of Linear Gaussian Models

    Factor analysis, principal component analysis, mixtures of gaussian clusters, vector quantization, Kalman filter models, and hidden Markov models can all be unified as variations of unsupervised learning under a single basic generative model. This is achieved by collecting together disparate observations and derivations made by many previous authors and by introducing a new way of linking discrete and continuous state models using a simple nonlinearity. Through the use of other nonlinearities, we show how independent component analysis is also a variation of the same basic generative model. We show that factor analysis and mixtures of gaussians can be implemented in autoencoder neural networks and learned using squared error plus the same regularization term. We introduce a new model for static data, known as sensible principal component analysis, as well as a novel concept of spatially adaptive observation noise. We also review some of the literature involving global and local mixtures of the basic models and provide pseudocode for inference and learning for all the basic models.
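
    The single generative model behind all of these methods is the linear Gaussian state space model: a hidden state evolves linearly with Gaussian noise, and observations are linear, Gaussian-noise-corrupted projections of that state. Below is a minimal NumPy sketch of sampling from such a model; the dimensions, the matrices A and C, and the noise covariances are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def sample_linear_gaussian(A, C, Q, R, x0, T, rng=None):
    """Sample a trajectory from a linear Gaussian state space model:
        x[t+1] = A @ x[t] + w[t],  w ~ N(0, Q)   (state dynamics)
        y[t]   = C @ x[t] + v[t],  v ~ N(0, R)   (observations)
    Factor analysis, PCA, and the Kalman filter arise as special cases
    of this generative structure (e.g. static state or restricted noise).
    """
    rng = np.random.default_rng() if rng is None else rng
    k, p = A.shape[0], C.shape[0]
    xs, ys = np.zeros((T, k)), np.zeros((T, p))
    x = x0
    for t in range(T):
        xs[t] = x
        ys[t] = C @ x + rng.multivariate_normal(np.zeros(p), R)
        x = A @ x + rng.multivariate_normal(np.zeros(k), Q)
    return xs, ys

# Illustrative two-state, one-observation example (all values are assumptions).
A = np.array([[0.99, 0.10], [0.00, 0.95]])
C = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.1]])
states, observations = sample_linear_gaussian(A, C, Q, R, x0=np.zeros(2), T=100)
```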

    Kalman Filter and its Economic Applications

    This paper is an eclectic study of the uses of the Kalman filter in the existing econometric literature. An effort is made to introduce the various extensions to the linear filter first developed by Kalman (1960) through examples of their uses in economics. The basic filter is first derived and then some applications are reviewed.
    Keywords: Kalman Filter; Time-varying Parameters; Stochastic Volatility; Markov Switching
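
    For reference, the basic filter derived in such treatments alternates a predict step, which propagates the state estimate through the linear dynamics, with an update step, which corrects it using the new observation weighted by the Kalman gain. The NumPy sketch below shows one predict/update cycle; the matrix and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def kalman_step(x, P, y, A, C, Q, R):
    """One predict/update cycle of the standard linear Kalman filter.

    x, P : state mean and covariance after the previous observation
    y    : new observation
    A, C : state transition and observation matrices
    Q, R : state and observation noise covariances
    """
    # Predict: propagate mean and covariance through the dynamics.
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q

    # Update: weight the innovation (y - C @ x_pred) by the Kalman gain.
    S = C @ P_pred @ C.T + R              # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new
```

    Time-varying parameter regressions, one of the econometric applications flagged by the keywords, apply exactly this recursion with the regression coefficients playing the role of the hidden state.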

    Maximum-likelihood estimation of delta-domain model parameters from noisy output signals

    Fast sampling is desirable to describe signal transmission through wide-bandwidth systems. The delta operator provides an ideal discrete-time modeling description for such fast-sampled systems. However, estimates of delta-domain model parameters are usually biased if the delta transformations are applied directly to a sampled signal corrupted by additive measurement noise. This problem is solved here by expectation-maximization, in which the delta transformations of the true signal are estimated and then used to obtain the model parameters. A numerical example demonstrates that the method improves on the accuracy of a shift-operator approach when the sample rate is fast.
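
    For context, the delta operator is defined from the forward-shift operator q and the sample period Δ as δ = (q − 1)/Δ, so that δx[k] = (x[k+1] − x[k])/Δ and delta-domain models approach their continuous-time counterparts as Δ → 0. The sketch below illustrates the bias source rather than the paper's estimator: applying the delta transformation directly to a noise-corrupted signal divides the measurement noise by the small Δ, inflating its variance at fast sample rates. The signal, noise level, and sample period are assumptions chosen for illustration.

```python
import numpy as np

def delta_transform(x, delta):
    """First-order delta transformation: (x[k+1] - x[k]) / delta."""
    return np.diff(x) / delta

# Illustrative setup: a slow true signal, fast sampling, additive noise.
rng = np.random.default_rng(0)
Delta = 1e-3                                   # fast sample period
t = np.arange(0.0, 1.0, Delta)
true_signal = np.sin(2 * np.pi * t)
noisy_signal = true_signal + 0.01 * rng.standard_normal(t.size)

d_true = delta_transform(true_signal, Delta)
d_noisy = delta_transform(noisy_signal, Delta)

# The noise difference term is divided by Delta, so its variance is inflated
# by a factor of order 1/Delta**2 (here roughly 2 * 0.01**2 / Delta**2 = 200).
# This amplification is why estimates from directly delta-transformed noisy
# data are biased, and why the paper instead estimates the delta
# transformations of the true signal within an EM scheme.
print(np.var(d_noisy - d_true))
```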

    The Neural Particle Filter

    The robust estimation of dynamically changing features, such as the position of prey, is one of the hallmarks of perception. On an abstract, algorithmic level, nonlinear Bayesian filtering, i.e. the estimation of temporally changing signals based on the history of observations, provides a mathematical framework for dynamic perception in real time. Since the general nonlinear filtering problem is analytically intractable, particle filters are considered among the most powerful approaches to approximating the solution numerically. Yet these algorithms prevalently rely on importance weights, and it remains an unresolved question how the brain could implement such an inference strategy with a neuronal population. Here, we propose the Neural Particle Filter (NPF), a weight-less particle filter that can be interpreted as the neuronal dynamics of a recurrently connected neural network that receives feed-forward input from sensory neurons and represents the posterior probability distribution in terms of samples. Specifically, this algorithm bridges the gap between the computational task of online state estimation and an implementation that allows networks of neurons in the brain to perform nonlinear Bayesian filtering. The model not only captures the properties of temporal and multisensory integration according to Bayesian statistics, but also allows online learning with a maximum likelihood approach. With an example from multisensory integration, we demonstrate that the numerical performance of the model is adequate to account for both filtering and identification problems. Because it dispenses with importance weights, our algorithm alleviates the 'curse of dimensionality' and thus outperforms conventional, weighted particle filters in higher dimensions for a limited number of particles.
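
    To illustrate the flavor of a weight-less filter of this kind: each particle follows the prior dynamics and is additionally nudged toward the data by a gain-weighted innovation term, and the posterior is represented by the unweighted particle cloud rather than by importance weights. The discrete-time sketch below is an assumption-laden illustration, not the paper's algorithm; it uses a hand-set gain and a simple one-dimensional model, whereas the paper's model learns its parameters online with a maximum likelihood approach.

```python
import numpy as np

def weightless_filter_step(particles, y, f, g, gain, proc_std, rng):
    """One step of a weight-less, innovation-driven particle filter
    (a sketch in the spirit of the Neural Particle Filter).

    Each particle is propagated through the prior dynamics f plus process
    noise, then pulled toward the observation y by gain * (y - g(particle)).
    No importance weights are used; the unweighted particle cloud
    approximates the posterior.
    """
    proposed = f(particles) + proc_std * rng.standard_normal(particles.shape)
    return proposed + gain * (y - g(proposed))

# Illustrative one-dimensional nonlinear example (model and gain are assumptions).
rng = np.random.default_rng(1)
f = lambda x: 0.9 * x + 0.5 * np.sin(x)   # prior (hidden-state) dynamics
g = lambda x: x                           # observation model
particles = rng.standard_normal(100)

x_true = 0.0
for _ in range(50):
    x_true = f(x_true) + 0.1 * rng.standard_normal()
    y = x_true + 0.2 * rng.standard_normal()
    particles = weightless_filter_step(particles, y, f, g,
                                       gain=0.5, proc_std=0.1, rng=rng)

print("posterior mean estimate:", particles.mean(), "true state:", x_true)
```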