Classification-based prediction of effective connectivity between timeseries with a realistic cortical network model
Effective connectivity measures the pattern of causal interactions between brain regions. Traditionally, these patterns of causality are inferred from brain recordings using either non-parametric (model-free) or parametric (model-based) approaches. The latter, when based on biophysically plausible models, have the advantage that they may facilitate the interpretation of causality in terms of underlying neural mechanisms. Recent biophysically plausible neural network models of recurrent microcircuits have shown the ability to reproduce the characteristics of real neural activity well and can be applied to model interacting cortical circuits. It is, however, challenging to invert these models in order to estimate effective connectivity from observed data. Here, we propose a classification-based method to approximate the result of such complex model inversion. The classifier predicts the pattern of causal interactions given a multivariate timeseries as input. It is trained on a large number of pairs, each consisting of a multivariate timeseries and the corresponding pattern of causal interactions, generated by simulation from the neural network model. In simulated experiments, we show that the proposed method is much more accurate in detecting the causal structure of timeseries than current best-practice methods. Additionally, we present further results characterizing the validity of the neural network model and the ability of the classifier to adapt to the generative model of the data.
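The pipeline this abstract describes (simulate labelled timeseries from a generative model, train a classifier on them, evaluate on held-out simulations) can be sketched in miniature. Everything below is an illustrative assumption: a linear VAR(1) surrogate stands in for the paper's cortical network model, a single lagged cross-correlation is the feature, and a nearest-centroid rule stands in for the authors' actual classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_pair(causal, n=2000, c=0.6):
    """Toy 2-channel VAR(1) surrogate: if `causal`, x drives y at lag 1."""
    x, y = np.zeros(n), np.zeros(n)
    ex, ey = rng.normal(size=n), rng.normal(size=n)
    for t in range(1, n):
        x[t] = 0.5 * x[t - 1] + ex[t]
        y[t] = 0.5 * y[t - 1] + (c * x[t - 1] if causal else 0.0) + ey[t]
    return x, y

def feature(x, y, lag=1):
    """Lagged cross-correlation x(t-lag) vs y(t): a crude causality feature."""
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

# Training set: labelled simulations drawn from the generative model.
train_labels = rng.integers(0, 2, 200)
train_feats = np.array([feature(*simulate_pair(bool(l))) for l in train_labels])

# Nearest-centroid classifier in the 1-D feature space.
c0 = train_feats[train_labels == 0].mean()
c1 = train_feats[train_labels == 1].mean()
predict = lambda f: int(abs(f - c1) < abs(f - c0))

# Held-out evaluation on fresh simulations.
test_labels = rng.integers(0, 2, 100)
test_feats = np.array([feature(*simulate_pair(bool(l))) for l in test_labels])
acc = np.mean([predict(f) == l for f, l in zip(test_feats, test_labels)])
```

The point of the sketch is the supervised framing: the "labels" (causal structure) are free because the training data are simulated, so no explicit model inversion is needed at test time.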
Inferring Synaptic Structure in presence of Neural Interaction Time Scales
Biological networks display a variety of activity patterns reflecting a web
of interactions that is complex both in space and time. Yet inference methods
have mainly focused on reconstructing, from the network's activity, the spatial
structure, by assuming equilibrium conditions or, more recently, a
probabilistic dynamics with a single arbitrary time-step. Here we show that,
under this latter assumption, the inference procedure fails to reconstruct the
synaptic matrix of a network of integrate-and-fire neurons when the chosen time
scale of interaction does not closely match the synaptic delay or when no
single time scale for the interaction can be identified; this failure,
moreover, exposes a distinctive bias of the inference method, which can cause
excitatory synapses with interaction time scales longer than the model's
time-step to be inferred as inhibitory. We therefore introduce a new two-step method that
first infers through cross-correlation profiles the delay-structure of the
network and then reconstructs the synaptic matrix, and successfully test it on
networks with different topologies and in different activity regimes. Although
step one is able to accurately recover the delay-structure of the network,
thus removing any a priori guess about the time scales of the interaction,
the inference method nonetheless introduces an arbitrary time scale: the
time-bin used to binarize the spike trains. We therefore study, analytically
and numerically, how the choice of time-bin affects the inference in our
network model, finding that the relationship between the inferred couplings
and the real synaptic efficacies, albeit quadratic in both cases, depends
critically on the time-bin for the excitatory synapses only, whilst being
essentially independent of it for the inhibitory ones.
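Step one of the two-step method, reading the interaction delay off a cross-correlation profile, can be illustrated on toy data. The binary spike trains below are an assumption standing in for the paper's integrate-and-fire network; only the correlogram-peak idea comes from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy binary spike trains (one spike/no-spike per bin): neuron B tends to
# fire `delay` bins after neuron A fires.
n_bins, delay, p_base, p_drive = 20000, 5, 0.02, 0.5
a = rng.random(n_bins) < p_base
b = rng.random(n_bins) < p_base
b[delay:] |= a[:-delay] & (rng.random(n_bins - delay) < p_drive)

def cross_correlogram(pre, post, max_lag=20):
    """Count pre->post spike coincidences at each positive lag."""
    return np.array([np.sum(pre[:-lag] & post[lag:])
                     for lag in range(1, max_lag + 1)])

# Step one: read the delay off the correlogram peak, with no a priori
# guess about the interaction time scale.
cc = cross_correlogram(a, b)
estimated_delay = int(np.argmax(cc)) + 1  # lags start at 1
```

With the delay known, step two (reconstructing the synaptic matrix) can then use the correct time scale of interaction for each connection.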
Synaptic mechanisms of interference in working memory
Information from preceding trials of cognitive tasks can bias performance in
the current trial, a phenomenon referred to as interference. Subjects
performing visual working memory tasks exhibit interference in their
trial-to-trial response correlations: the recalled target location in the
current trial is biased in the direction of the target presented on the
previous trial. We present modeling work that (a) develops a probabilistic
inference model of this history-dependent bias, and (b) links our probabilistic
model to computations of a recurrent network wherein short-term facilitation
accounts for the dynamics of the observed bias. Network connectivity is
reshaped dynamically during each trial, providing a mechanism for generating
predictions from prior trial observations. Applying timescale separation
methods, we can obtain a low-dimensional description of the trial-to-trial bias
based on the history of target locations. The model has response statistics
whose mean is centered at the true target location across many trials, typical
of such visual working memory tasks. Furthermore, we demonstrate task protocols
for which the plastic model performs better than a model with static
connectivity: repetitively presented targets are better retained in working
memory than targets drawn from uncorrelated sequences.
Comment: 28 pages, 7 figures
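A minimal sketch of the low-dimensional trial-to-trial description: the recalled location is attracted a fixed fraction toward the previous target, yet the mean response stays centred on the true target across trials. The single parameter `gamma` and the linear update are illustrative assumptions, not the paper's facilitation dynamics.

```python
import numpy as np

def biased_recall(targets, gamma=0.2):
    """Recall on each trial is pulled a fraction `gamma` toward the previous
    trial's target (gamma stands in for the facilitation-derived bias)."""
    recalls = [targets[0]]
    for prev, cur in zip(targets[:-1], targets[1:]):
        recalls.append(cur + gamma * (prev - cur))
    return np.array(recalls)

rng = np.random.default_rng(2)
targets = rng.uniform(-1.0, 1.0, 10000)   # uncorrelated target sequence
recalls = biased_recall(targets)

# Each response is attracted toward the previous target ...
attracted = np.sign(recalls[1:] - targets[1:]) == np.sign(targets[:-1] - targets[1:])
# ... yet the mean response error stays centred at zero across trials.
mean_error = float(np.mean(recalls - targets))
```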
Pairwise Ising model analysis of human cortical neuron recordings
During wakefulness and deep-sleep brain states, cortical neural networks show
different behavior, with the latter characterized by transients of high
network activity. To investigate the impact of these transients on neuronal behavior, we apply a
pairwise Ising model analysis by inferring the maximum entropy model that
reproduces single and pairwise moments of the neurons' spiking activity. In
this work, we first review the inference algorithm introduced in Ferrari, Phys.
Rev. E (2016). We then succeed in applying the algorithm to infer the model
from a large ensemble of neurons recorded by multi-electrode array in human
temporal cortex. We compare the Ising model performance in capturing the
statistical properties of the network activity during wakefulness and deep
sleep. For the latter, the pairwise model misses relevant transients of high
network activity, suggesting that additional constraints are necessary to
accurately model the data.
Comment: 8 pages, 3 figures, Geometric Science of Information 2017 conference
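The maximum-entropy fit itself can be demonstrated exactly on a tiny system: Boltzmann learning adjusts fields and couplings until the model's single and pairwise moments match the data's. Here the "data" moments come from a known ground-truth model (an assumption replacing the recorded spikes), and the small size lets the partition function be enumerated; the algorithm reviewed in the paper is what makes this tractable for large recorded ensembles.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)
n = 4  # small enough to enumerate all 2**n spin states exactly
states = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)

def model_moments(h, J):
    """Exact moments <s_i> and <s_i s_j> under P(s) ~ exp(h.s + s.J.s/2)."""
    E = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
    p = np.exp(E - E.max())
    p /= p.sum()
    return p @ states, np.einsum('k,ki,kj->ij', p, states, states)

# "Data" moments from a known ground-truth model (replacing recorded spikes).
h_true = rng.normal(0, 0.3, n)
J_true = rng.normal(0, 0.3, (n, n))
J_true = (J_true + J_true.T) / 2
np.fill_diagonal(J_true, 0)
m_data, C_data = model_moments(h_true, J_true)

# Boltzmann learning: gradient ascent on the likelihood = moment matching.
h, J, lr = np.zeros(n), np.zeros((n, n)), 0.1
for _ in range(5000):
    m, C = model_moments(h, J)
    h += lr * (m_data - m)
    dJ = lr * (C_data - C)
    np.fill_diagonal(dJ, 0)
    J += dJ

m_fit, C_fit = model_moments(h, J)
```

Because the maximum-entropy problem is convex, matching the moments recovers the ground-truth parameters uniquely; this is the sense in which the pairwise model "captures" (or, during deep sleep, fails to capture) the recorded statistics.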
Dynamics and Performance of Susceptibility Propagation on Synthetic Data
We study the performance and convergence properties of the Susceptibility
Propagation (SusP) algorithm for solving the Inverse Ising problem. We first
study how the temperature parameter (T) in a Sherrington-Kirkpatrick model
generating the data influences the performance and convergence of the
algorithm. We find that in the high-temperature regime (T>4), the algorithm
performs well and its quality is limited only by the quality of the supplied
data. In the low-temperature regime (T<4), we find that the algorithm typically
does not converge, yielding diverging values for the couplings. However, we
show that by stopping the algorithm at the right time before divergence becomes
serious, good reconstruction can be achieved down to T~2. We then show that
dense connectivity, loopiness of the connectivity, and high absolute
magnetization all have deteriorating effects on the performance of the
algorithm. When absolute magnetization is high, we show that other methods can
work better than SusP. Finally, we show that for neural data with high
absolute magnetization, SusP performs less well than TAP inversion.
Comment: 9 pages, 7 figures
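For contrast with SusP, the simplest mean-field inversion (naive mean field, a cousin of the TAP inversion mentioned above) can be sketched: in the high-temperature, weak-coupling regime, the couplings are read off the inverse correlation matrix. The small, exactly enumerable system below is an illustrative assumption standing in for SK-model data.

```python
import itertools
import numpy as np

rng = np.random.default_rng(4)
n = 5
# Weak-coupling, SK-like ground truth (the high-temperature regime in which
# the abstract reports good reconstruction).
J_true = rng.normal(0, 0.15 / np.sqrt(n), (n, n))
J_true = (J_true + J_true.T) / 2
np.fill_diagonal(J_true, 0)
h_true = rng.normal(0, 0.1, n)

# Exact connected correlations by enumeration (standing in for sampled data).
states = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)
E = states @ h_true + 0.5 * np.einsum('ki,ij,kj->k', states, J_true, states)
p = np.exp(E - E.max())
p /= p.sum()
m = p @ states
C = np.einsum('k,ki,kj->ij', p, states, states) - np.outer(m, m)

# Naive mean-field inversion: J_ij ~ -(C^-1)_ij for i != j.
J_nmf = -np.linalg.inv(C)
np.fill_diagonal(J_nmf, 0)
```

At low temperature or high magnetization this linear-response shortcut degrades, which is the regime where the comparisons in the paper (SusP vs. TAP inversion) become interesting.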
Markovian Dynamics on Complex Reaction Networks
Complex networks, comprised of individual elements that interact with each
other through reaction channels, are ubiquitous across many scientific and
engineering disciplines. Examples include biochemical, pharmacokinetic,
epidemiological, ecological, social, neural, and multi-agent networks. A common
approach to modeling such networks is by a master equation that governs the
dynamic evolution of the joint probability mass function of the underlying
population process, and naturally leads to Markovian dynamics for that process.
Due however to the nonlinear nature of most reactions, the computation and
analysis of the resulting stochastic population dynamics is a difficult task.
This review article provides a coherent and comprehensive coverage of recently
developed approaches and methods to tackle this problem. After reviewing a
general framework for modeling Markovian reaction networks and giving specific
examples, the authors present numerical and computational techniques capable of
evaluating or approximating the solution of the master equation, discuss a
recently developed approach for studying the stationary behavior of Markovian
reaction networks using a potential energy landscape perspective, and provide
an introduction to the emerging theory of thermodynamic analysis of such
networks. Three representative problems of opinion formation, transcription
regulation, and neural network dynamics are used as illustrative examples.
Comment: 52 pages, 11 figures; for freely available MATLAB software, see
http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.htm
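A concrete instance of such a Markovian reaction network is the birth-death process, whose master equation has a Poisson stationary law; the Gillespie algorithm below draws one exact sample path of that Markov process. The rate values are illustrative assumptions.

```python
import random

def gillespie_birth_death(k_birth=10.0, k_death=1.0, x0=0, t_end=50.0, seed=5):
    """One exact Gillespie sample path of the two-reaction network:
    birth: {} -> X at rate k_birth;  death: X -> {} at rate k_death * x."""
    rng = random.Random(seed)
    t, x, path = 0.0, x0, []
    while t < t_end:
        a_birth, a_death = k_birth, k_death * x
        a_total = a_birth + a_death
        t += rng.expovariate(a_total)         # exponential waiting time
        if rng.random() * a_total < a_birth:  # pick the next reaction
            x += 1
        else:
            x -= 1
        path.append((t, x))
    return path

path = gillespie_birth_death()
# The master equation's stationary law is Poisson(k_birth / k_death),
# so late-time copy numbers should fluctuate around 10.
tail = [x for t, x in path if t > 25.0]
mean_tail = sum(tail) / len(tail)
```

Simulation sidesteps the difficulty the review emphasizes: for nonlinear reactions the master equation cannot in general be solved in closed form, but exact sample paths remain cheap to draw.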