A State Space Approach for Piecewise-Linear Recurrent Neural Networks for Reconstructing Nonlinear Dynamics from Neural Measurements
The computational properties of neural systems are often thought to be
implemented in terms of their network dynamics. Hence, recovering the system
dynamics from experimentally observed neuronal time series, like multiple
single-unit (MSU) recordings or neuroimaging data, is an important step toward
understanding its computations. Ideally, one would not only seek a state space
representation of the dynamics, but would wish to have access to its governing
equations for in-depth analysis. Recurrent neural networks (RNNs) are a
computationally powerful and dynamically universal formal framework which has
been extensively studied from both the computational and the dynamical systems
perspective. Here we develop a semi-analytical maximum-likelihood estimation
scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of
state space models, which accounts for noise in both the underlying latent
dynamics and the observation process. The Expectation-Maximization algorithm is
used to infer the latent state distribution, through a global Laplace
approximation, and the PLRNN parameters iteratively. After validating the
procedure on toy examples, the approach is applied to MSU recordings from the
rodent anterior cingulate cortex obtained during performance of a classical
working memory task, delayed alternation. A model with 5 states turned out to
be sufficient to capture the essential computational dynamics underlying task
performance, including stimulus-selective delay activity. The estimated models
were rarely multi-stable, but rather were tuned to exhibit slow dynamics in the
vicinity of a bifurcation point. In summary, the present work advances a
semi-analytical (thus reasonably fast) maximum-likelihood estimation framework
for PLRNNs that may enable recovery of the relevant dynamics underlying observed
neuronal time series and directly link them to computational properties.
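The generative model described above can be sketched in a few lines. This is a minimal illustration of the PLRNN state-space structure (piecewise-linear latent dynamics plus a noisy linear observation model), not the estimation scheme itself; all dimensions, parameter values, and noise levels below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: M latent states, N observed units, T time steps
M, N, T = 5, 20, 200

# PLRNN parameters (randomly initialized here purely for demonstration)
A = np.diag(rng.uniform(0.5, 0.9, M))    # diagonal auto-regressive weights
W = 0.1 * rng.standard_normal((M, M))    # off-diagonal coupling through the ReLU
np.fill_diagonal(W, 0.0)
h = rng.standard_normal(M)               # bias term
B = rng.standard_normal((N, M))          # linear observation matrix
s_lat, s_obs = 0.05, 0.1                 # latent and observation noise std devs

z = np.zeros(M)
Z, X = [], []
for t in range(T):
    # piecewise-linear latent update: linear term + ReLU-coupled term + noise
    z = A @ z + W @ np.maximum(z, 0.0) + h + s_lat * rng.standard_normal(M)
    # noisy linear observation of the latent state
    x = B @ z + s_obs * rng.standard_normal(N)
    Z.append(z)
    X.append(x)

Z, X = np.array(Z), np.array(X)
print(Z.shape, X.shape)
```

In the actual framework, only `X` would be observed, and the EM algorithm with a global Laplace approximation would be used to infer the latent trajectory `Z` and the parameters `A`, `W`, `h`, `B`.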
The complete mitochondrial genome of Yarrowia lipolytica
We report here the complete nucleotide sequence of the 47.9 kb mitochondrial (mt) genome from the obligate aerobic yeast Yarrowia lipolytica. It encodes, all on the same strand, seven subunits of NADH:ubiquinone oxidoreductase (ND1-6, ND4L), apocytochrome b (COB), three subunits of cytochrome oxidase (COX1, 2, 3), three subunits of ATP synthase (ATP6, 8 and 9), small and large ribosomal RNAs, and an incomplete set of tRNAs. The Y. lipolytica mt genome is very similar to the Hansenula wingei mt genome, as judged from blocks of conserved gene order and from sequence homology. The extra DNA in the Y. lipolytica mt genome consists of 17 group I introns and stretches of A+T-rich sequence, interspersed with potentially transposable GC clusters. The usual mould mt genetic code is used. Interestingly, there is no tRNA able to read CGN (arginine) codons. CGN codons could not be found in exonic open reading frames, whereas they do occur in intronic open reading frames. However, several of the intronic open reading frames have accumulated mutations and must be regarded as pseudogenes. We propose that this may have been triggered by the presence of untranslatable CGN codons. This sequence is available under EMBL Accession No. AJ307410.
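The codon-usage observation above amounts to a simple in-frame scan. The sketch below shows the kind of check involved, counting CGN (arginine) codons in an open reading frame; the function name and the toy sequences are hypothetical, not taken from the actual genome.

```python
def count_cgn_codons(orf):
    """Count CGN codons (CGA, CGC, CGG, CGT) in an in-frame ORF sequence."""
    orf = orf.upper()
    # step through the sequence codon by codon; a CGN codon is any
    # codon whose first two bases are "CG"
    return sum(1 for i in range(0, len(orf) - 2, 3)
               if orf[i:i + 2] == "CG")

# Toy sequences for illustration only
exonic   = "ATGGCTAAAGAATTATAA"   # exon-like ORF: no CGN codons
intronic = "ATGCGATTACGGAAATAA"   # intron-like ORF: two CGN codons
print(count_cgn_codons(exonic), count_cgn_codons(intronic))
```

Applied genome-wide, a count of zero in exonic frames alongside nonzero counts in intronic frames would reproduce the pattern reported in the abstract.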
Detecting Multiple Change Points Using Adaptive Regression Splines With Application to Neural Recordings
Time series, as frequently the case in neuroscience, are rarely stationary, but often exhibit abrupt changes due to attractor transitions or bifurcations in the dynamical systems producing them. A plethora of methods for detecting such change points in time series statistics have been developed over the years, in addition to test criteria to evaluate their significance. Issues to consider when developing change point analysis methods include computational demands, difficulties arising from either a limited amount of data or a large number of covariates, and arriving at statistical tests with sufficient power to detect as many changes as contained in potentially high-dimensional time series. Here, a general method called Paired Adaptive Regressors for Cumulative Sum is developed for detecting multiple change points in the mean of multivariate time series. The method's advantages over alternative approaches are demonstrated through a series of simulation experiments. This is followed by a real data application to neural recordings from rat medial prefrontal cortex during learning. Finally, the method's flexibility to incorporate useful features from state-of-the-art change point detection techniques is discussed, along with potential drawbacks and suggestions to remedy them.
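To illustrate the cumulative-sum idea that the method builds on, here is a minimal single-change-point detector for the mean of a 1-D series. This is the classic CUSUM statistic, not the paired-adaptive-regressor method of the paper itself; the function name and parameters are illustrative.

```python
import numpy as np

def cusum_change_point(x):
    """Locate a single change in the mean of a 1-D series by maximizing
    the absolute centered cumulative sum (classic CUSUM statistic)."""
    x = np.asarray(x, dtype=float)
    s = np.cumsum(x - x.mean())          # centered cumulative sum
    k = int(np.argmax(np.abs(s[:-1])))   # last index before the change
    return k + 1, float(np.abs(s[k]))    # change point index, statistic value

# Synthetic example: mean shifts from 0 to 2 at index 100
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(2.0, 1.0, 100)])
cp, stat = cusum_change_point(x)
print(cp)
```

Detecting multiple change points, as the paper's method does, requires additional machinery (e.g. segmentation or penalized model selection) on top of this basic statistic.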
Psychiatric Illnesses as Disorders of Network Dynamics
This review provides a dynamical systems perspective on psychiatric symptoms and disease, and discusses its potential implications for diagnosis, prognosis, and treatment. After a brief introduction to the theory of dynamical systems, we will focus on the idea that cognitive and emotional functions are implemented in terms of dynamical systems phenomena in the brain, a common assumption in theoretical and computational neuroscience. Specific computational models, anchored in biophysics, for generating different types of network dynamics, and with a relation to psychiatric symptoms, will be briefly reviewed, as well as methodological approaches for reconstructing the system dynamics from observed time series (like fMRI or EEG recordings). We then attempt to outline how psychiatric phenomena, associated with schizophrenia, depression, PTSD, ADHD, phantom pain, and others, could be understood in dynamical systems terms. Most importantly, we will try to convey that the dynamical systems level may provide a central, hub-like level of convergence which unifies and links multiple biophysical and behavioral phenomena, in the sense that diverse biophysical changes can give rise to the same dynamical phenomena and, vice versa, similar changes in dynamics may yield different behavioral symptoms depending on the brain area where these changes manifest. If this assessment is correct, it may have profound implications for the diagnosis, prognosis, and treatment of psychiatric conditions, as it puts the focus on dynamics. We therefore argue that consideration of dynamics should play an important role in the choice and target of interventions.
A biophysical model of decision making in an antisaccade task through variable climbing activity
We present a biophysical model of saccade initiation based on
competitive integration of planned and reactive cortical saccade decision signals
in the intermediate layer of the superior colliculus. In the model, the variable
slopes of the climbing activities of the input cortical decision signals are
produced from variability in the conductances of Na+, K+, Ca2+-activated K+,
NMDA and GABA currents. These cortical decision signals are integrated in
the activities of buildup neurons in the intermediate layer of the superior
colliculus, whose activities grow nonlinearly towards a preset criterion level.
When the level is crossed, a movement is initiated. The resultant model
reproduces the unimodal distributions of saccade reaction times (SRTs) for
correct antisaccades and erroneous prosaccades, as well as the variability of
SRTs (ranging from 80 ms to 600 ms) and the overall 25% rate of erroneous
prosaccade responses, in a large sample of 2006 young men performing an
antisaccade task.
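The core rise-to-threshold idea can be sketched very simply: if a decision signal climbs linearly toward a fixed criterion, trial-to-trial variability in its slope translates into a skewed reaction-time distribution. The sketch below is not the conductance-based model of the paper; all parameter values (baseline, threshold, slope distribution, afferent delay) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

n_trials = 5000
baseline, threshold = 0.0, 1.0

# Trial-to-trial variable climbing rate (slope), standing in for the
# variability in ionic conductances of the cortical decision signal
slopes = rng.normal(3.0, 1.0, n_trials)   # illustrative units: threshold/s
slopes = slopes[slopes > 0.5]             # discard near-flat (non-terminating) trials

# Time to criterion for a linear rise, plus a fixed afferent delay (ms)
rts_ms = 1000.0 * (threshold - baseline) / slopes + 50.0

print(round(float(np.median(rts_ms))))
```

Because reaction time is inversely related to slope, a roughly Gaussian spread of slopes yields the long right tail characteristic of empirical SRT distributions.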
- …