Inference of stochastic nonlinear oscillators with applications to physiological problems
A new method for the inference of coupled stochastic nonlinear oscillators is
described. The technique does not require extensive global optimization,
provides optimal compensation for noise-induced errors, and is robust across a
broad range of dynamical models. We illustrate the main ideas of the technique
by inferring a model of five globally and locally coupled noisy oscillators.
Specific modifications of the technique for inferring hidden degrees of
freedom of coupled nonlinear oscillators are discussed in the context of
physiological applications.
Comment: 11 pages, 10 figures, 2 tables. Fluctuations and Noise 2004, SPIE Conference, 25-28 May 2004, Gran Hotel Costa Meloneras, Maspalomas, Gran Canaria, Spain
A Metropolis-Hastings algorithm for extracting periodic gravitational wave signals from laser interferometric detector data
Markov chain Monte Carlo methods offer practical procedures for detecting
signals characterized by a large number of parameters under conditions of
low signal-to-noise ratio. We present a Metropolis-Hastings algorithm capable
of inferring the spin and orientation parameters of a neutron star from its
periodic gravitational wave signature seen by laser interferometric detectors.
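The abstract above names the Metropolis-Hastings algorithm but gives no code. As a minimal illustrative sketch of the core accept/reject step, the following runs a random-walk Metropolis-Hastings sampler on a toy one-dimensional Gaussian target; the step size, sample count, and target are arbitrary choices for illustration, not the multi-parameter gravitational-wave posterior discussed in the paper.

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples, step=0.5, seed=0):
    """Minimal random-walk Metropolis-Hastings sampler (illustrative sketch).

    log_post: log of the (unnormalized) target density.
    Proposes x' = x + N(0, step^2) and accepts with probability
    min(1, p(x')/p(x)), evaluated in log space for numerical stability.
    """
    rng = random.Random(seed)
    x = x0
    lp = log_post(x)
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        lp_new = log_post(proposal)
        # Accept with probability min(1, exp(lp_new - lp))
        if math.log(rng.random()) < lp_new - lp:
            x, lp = proposal, lp_new
        samples.append(x)
    return samples

# Toy target: standard normal log-density (up to an additive constant).
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
```

In a realistic detection setting the scalar `x` would be replaced by the full parameter vector (spin, orientation, frequency), and the proposal distribution tuned per component; the accept/reject logic is unchanged.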
Statistical Models of Reconstructed Phase Spaces for Signal Classification
This paper introduces a novel approach to the analysis and classification of time series signals using statistical models of reconstructed phase spaces. With sufficient dimension, such reconstructed phase spaces are, with probability one, guaranteed to be topologically equivalent to the state dynamics of the generating system and therefore may contain information that is absent in analysis and classification methods rooted in linear assumptions. Parametric and nonparametric distributions are introduced as statistical representations over the multidimensional reconstructed phase space, with classification accomplished through methods such as Bayes maximum likelihood and artificial neural networks (ANNs). The technique is demonstrated on heart arrhythmia classification and speech recognition. This new approach is shown to be a viable and effective alternative to traditional signal classification approaches, particularly for signals with strong nonlinear characteristics.
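The phase-space reconstruction the abstract relies on is conventionally done by time-delay embedding (Takens-style). A minimal sketch of that step, with an arbitrary illustrative dimension and delay (the paper's own embedding parameters are not given here):

```python
import math

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series x into a dim-dimensional
    reconstructed phase space: point i is [x[i], x[i+tau], ..., x[i+(dim-1)*tau]].
    """
    n = len(x) - (dim - 1) * tau
    return [[x[i + j * tau] for j in range(dim)] for i in range(n)]

# Illustrative scalar observation: a sampled sinusoid.
series = [math.sin(0.1 * t) for t in range(200)]
points = delay_embed(series, dim=3, tau=8)
```

The statistical models described in the abstract (parametric or nonparametric densities, ANN classifiers) would then be fit over these multidimensional `points` rather than over the raw scalar series.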
A Tutorial on Time-Evolving Dynamical Bayesian Inference
In view of the current availability and variety of measured data, there is an
increasing demand for powerful signal processing tools that can cope
successfully with the associated problems that often arise when data are being
analysed. In practice many of the data-generating systems are not only
time-variable, but also influenced by neighbouring systems and subject to
random fluctuations (noise) from their environments. To encompass problems of
this kind, we present a tutorial about the dynamical Bayesian inference of
time-evolving coupled systems in the presence of noise. It includes the
necessary theoretical description and the algorithms for its implementation.
For general programming purposes, a pseudocode description is also given.
Examples based on coupled phase and limit-cycle oscillators illustrate the
salient features of phase dynamics inference. State domain inference is
illustrated with an example of coupled chaotic oscillators. The applicability
of the latter example to secure communications based on the modulation of
coupling functions is outlined. MATLAB codes for implementation of the method,
as well as for the explicit examples, accompany the tutorial.
Comment: MATLAB codes can be found at http://py-biomedical.lancaster.ac.uk
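The tutorial's examples are built on noisy coupled phase oscillators. As a hedged sketch of the kind of data-generating system such inference targets, the following Euler-Maruyama simulation of globally coupled, noisy Kuramoto-type phase oscillators uses illustrative parameter values of my own choosing, not those from the tutorial (whose MATLAB codes are linked above):

```python
import math
import random

def simulate_coupled_phases(omega, K, noise, dt, steps, seed=1):
    """Euler-Maruyama integration of noisy, globally coupled phase oscillators:
    dtheta_i = (omega_i + (K/N) * sum_j sin(theta_j - theta_i)) dt + noise * dW_i.
    Returns the full trajectory of phase vectors.
    """
    rng = random.Random(seed)
    n = len(omega)
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    traj = [theta[:]]
    for _ in range(steps):
        new = []
        for i in range(n):
            coupling = K / n * sum(math.sin(theta[j] - theta[i]) for j in range(n))
            dW = rng.gauss(0.0, math.sqrt(dt))  # Wiener increment
            new.append(theta[i] + (omega[i] + coupling) * dt + noise * dW)
        theta = new
        traj.append(theta[:])
    return traj

traj = simulate_coupled_phases([1.0, 1.1, 0.9], K=2.0, noise=0.05, dt=0.01, steps=1000)

# Kuramoto order parameter of the final state (near 1 when phases lock).
last = traj[-1]
r = math.hypot(sum(math.cos(t) for t in last),
               sum(math.sin(t) for t in last)) / len(last)
```

Dynamical Bayesian inference, as described in the tutorial, would take time series like `traj` as input and recover the natural frequencies, coupling strength, and noise intensity that generated them.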
Functional Structure and Approximation in Econometrics (book front matter)
This is the front matter from the book, William A. Barnett and Jane Binner (eds.), Functional Structure and Approximation in Econometrics, published in 2004 by Elsevier in its Contributions to Economic Analysis monograph series. The front matter includes the Table of Contents, Volume Introduction, and Section Introductions by Barnett and Binner and the Preface by W. Erwin Diewert. The volume contains a unified collection and discussion of W. A. Barnett's most important published papers on applied and theoretical econometric modelling.
Keywords: consumer demand, production, flexible functional form, functional structure, asymptotics, nonlinearity, systemwide models
Data based identification and prediction of nonlinear and complex dynamical systems
We thank Dr. R. Yang (formerly at ASU), Dr. R.-Q. Su (formerly at ASU), and Mr. Zhesi Shen for their contributions to a number of original papers on which this Review is partly based. This work was supported by ARO under Grant No. W911NF-14-1-0504. W.-X. Wang was also supported by NSFC under Grants No. 61573064 and No. 61074116, as well as by the Fundamental Research Funds for the Central Universities, Beijing Nova Programme.
Neural Information Processing: between synchrony and chaos
The brain performs many different processing tasks, ranging from elaborate processes such as pattern recognition, memory, or decision-making to simpler functionalities such as linear filtering in image processing. Understanding the mechanisms by which the brain is able to produce such a diverse range of cortical operations remains a fundamental problem in neuroscience. Some recent empirical and theoretical results support the notion that the brain is naturally poised between ordered and chaotic states. As the largest number of metastable states exists at a point near the transition, the brain therefore has access to a larger repertoire of behaviours. Consequently, it is of high interest to know which type of processing can be associated with ordered and disordered states. Here we explain which processes are related to chaotic and synchronized states, based on the study of an in-silico implementation of biologically plausible neural systems. The measurements obtained reveal that synchronized cells (which can be understood as ordered states of the brain) are related to non-linear computations, while uncorrelated neural ensembles are excellent information transmission systems that are able to implement linear transformations (such as the realization of convolution products) and to parallelize neural processes. From these results we propose a plausible meaning for Hebbian and non-Hebbian learning rules, as the biophysical mechanisms by which the brain creates ordered or chaotic ensembles depending on the desired functionality. The measurements obtained from the hardware implementation of different neural systems support the view that the brain works with two different states, ordered and chaotic, with complementary functionalities that imply non-linear processing (synchronized states) and information transmission and convolution (chaotic states).