Neuronal Synchronization Can Control the Energy Efficiency of Inter-Spike Interval Coding
The role of synchronous firing in sensory coding and cognition remains
controversial. While studies focusing on its mechanistic consequences in
attentional tasks suggest that synchronization dynamically boosts sensory
processing, others have failed to find significant synchronization levels in
such tasks. We attempt to understand both lines of evidence within a coherent
theoretical framework. We conceptualize synchronization as an independent
control parameter to study how the postsynaptic neuron transmits the average
firing activity of a presynaptic population, in the presence of
synchronization. We apply the Berger-Levy theory of energy efficient
information transmission to interpret simulations of a Hodgkin-Huxley-type
postsynaptic neuron model, where we varied the firing rate and synchronization
level in the presynaptic population independently. We find that for a fixed
presynaptic firing rate the simulated postsynaptic interspike interval
distribution depends on the synchronization level and is well-described by a
generalized extreme value distribution. For synchronization levels between 15%
and 50%, we compute the optimal distribution of presynaptic firing rates that
maximizes the mutual information per unit cost, and find that this efficiency
peaks at a synchronization level of ~30%. These results suggest that the
statistics and energy efficiency of neuronal communication channels, through
which the input rate is communicated, can be dynamically adapted by the
synchronization level.
Comment: 47 pages, 14 figures, 2 Tables
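The reported generalized extreme value (GEV) fit of ISI distributions can be illustrated on synthetic data with SciPy. This is a minimal sketch, not the paper's pipeline; the shape, location, scale, and sample size below are assumptions chosen for demonstration.

```python
import numpy as np
from scipy import stats

# Synthetic ISI-like samples (ms). All parameter values here are
# illustrative assumptions, not values reported in the paper.
rng = np.random.default_rng(0)
isis = stats.genextreme.rvs(c=-0.1, loc=20.0, scale=5.0,
                            size=5000, random_state=rng)

# Maximum-likelihood fit of the GEV shape, location, and scale.
c_hat, loc_hat, scale_hat = stats.genextreme.fit(isis)

# A rough goodness-of-fit check via the Kolmogorov-Smirnov statistic.
ks_stat, _ = stats.kstest(isis, 'genextreme',
                          args=(c_hat, loc_hat, scale_hat))
print(f"shape={c_hat:.3f} loc={loc_hat:.2f} "
      f"scale={scale_hat:.2f} KS={ks_stat:.4f}")
```

In the paper's setting one would fit such a GEV separately for each presynaptic synchronization level and compare the resulting parameters.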
Dynamics and spike trains statistics in conductance-based Integrate-and-Fire neural networks with chemical and electric synapses
We investigate the effect of electric synapses (gap junctions) on collective
neuronal dynamics and spike statistics in a conductance-based
Integrate-and-Fire neural network, driven by a Brownian noise, where
conductances depend upon spike history. We compute explicitly the time
evolution operator and show that, given the spike-history of the network and
the membrane potentials at a given time, the further dynamical evolution can be
written in a closed form. We show that spike train statistics are described by
a Gibbs distribution whose potential can be approximated with an explicit
formula when the noise is weak. This potential form encompasses existing
models for spike train statistics analysis, such as maximum entropy models and
Generalized Linear Models (GLMs). We also discuss the different types of
correlations: those induced by a shared stimulus and those induced by neuron
interactions.
Comment: 42 pages, 1 figure, submitted
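One of the model classes the Gibbs-potential form is said to encompass is the spike-history GLM. The sketch below is an illustrative Bernoulli GLM with a linear spike-history filter, fitted by gradient ascent on the log-likelihood; the filter values, history length, and learning rate are assumptions, not the paper's model.

```python
import numpy as np

# Simulate a Bernoulli GLM: the spike probability in each time bin
# depends on the recent spike history through a linear filter.
rng = np.random.default_rng(1)
T, H = 20000, 5                                     # time bins, history length
w_true = np.array([-2.0, -1.0, -0.5, -0.2, -0.1])   # history filter (assumed)
b_true = -2.0                                       # baseline log-odds (assumed)

spikes = np.zeros(T)
for t in range(H, T):
    # Most recent bin first, matching the filter ordering.
    drive = b_true + w_true @ spikes[t-H:t][::-1]
    p = 1.0 / (1.0 + np.exp(-drive))
    spikes[t] = rng.random() < p

# Design matrix of lagged spike history: column k holds the spike at lag k+1.
X = np.column_stack([spikes[H-1-k:T-1-k] for k in range(H)])
y = spikes[H:]

# Fit by gradient ascent on the Bernoulli log-likelihood.
w, b = np.zeros(H), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w += 0.5 * (X.T @ (y - p)) / len(y)
    b += 0.5 * np.mean(y - p)
```

The fitted filter `w` approximately recovers the strongly suppressive short-lag structure of `w_true`, which is the kind of history dependence the Gibbs potential generalizes.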
Synchrony in Neuronal Communications: An Energy Efficient Scheme
We are interested in understanding the neural correlates of attentional
processes using first principles. Here we apply a recently developed first
principles approach that uses transmitted information in bits per joule to
quantify the energy efficiency of information transmission for an
inter-spike-interval (ISI) code that can be modulated by means of the synchrony
in the presynaptic population. We simulate a single compartment
conductance-based model neuron driven by excitatory and inhibitory spikes from
a presynaptic population, where the rate and synchrony in the presynaptic
excitatory population may vary independently from the average rate. We find
that for a fixed input rate, the ISI distribution of the postsynaptic neuron
depends on the level of synchrony and is well-described by a Gamma distribution
for synchrony levels less than 50%. For levels of synchrony between 15% and 50%
(restricted for technical reasons), we compute the optimum input distribution
that maximizes the mutual information per unit energy. This optimum
distribution shows that an increased level of synchrony, as has been reported
experimentally in attention-demanding conditions, reduces both the mode of the
input distribution and the excitability threshold of the postsynaptic neuron.
This facilitates more energy-efficient neuronal communication.
Comment: 6 pages, 5 figures, Accepted for publication to IWCIT 201
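The Gamma description of the postsynaptic ISI distribution at sub-50% synchrony can be sketched with a maximum-likelihood fit in SciPy. The shape, scale, and sample size below are illustrative assumptions, not numbers from the paper.

```python
import numpy as np
from scipy import stats

# Synthetic ISI-like samples (ms); parameters are assumptions.
rng = np.random.default_rng(2)
isis = stats.gamma.rvs(a=3.0, scale=8.0, size=4000, random_state=rng)

# Maximum-likelihood Gamma fit with location pinned at zero,
# since ISIs are strictly positive.
a_hat, _, scale_hat = stats.gamma.fit(isis, floc=0.0)

# Mean ISI and the implied mean firing rate (assuming ms units).
mean_isi = a_hat * scale_hat
rate_hz = 1000.0 / mean_isi
print(f"shape={a_hat:.2f} scale={scale_hat:.2f} rate={rate_hz:.1f} Hz")
```

Repeating such a fit across synchrony levels would expose how the Gamma shape parameter, and hence the ISI code, shifts with synchrony.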
Proceedings of the 2011 New York Workshop on Computer, Earth and Space Science
The purpose of the New York Workshop on Computer, Earth and Space Sciences is
to bring together the New York area's finest astronomers, statisticians,
computer scientists, and space and Earth scientists to explore potential
synergies
between their respective fields. The 2011 edition (CESS2011) was a great
success, and we would like to thank all of the presenters and participants for
attending. This year was also special as it included authors from the upcoming
book titled "Advances in Machine Learning and Data Mining for Astronomy". Over
two days, the latest advanced techniques used to analyze the vast amounts of
information now available for the understanding of our universe and our planet
were presented. These proceedings attempt to provide a small window into the
current state of research in this vast interdisciplinary field, and we thank
the speakers who took the time to contribute to this volume.
Comment: Author lists modified. 82 pages. Workshop Proceedings from CESS 2011
in New York City, Goddard Institute for Space Studies
Data-based analysis of extreme events: inference, numerics and applications
The concept of extreme events describes the above-average behavior of a process, for instance, heat waves in climate or weather research, earthquakes in geology, and financial crashes in economics. It is important to study the behavior of extremes in order to reduce their negative impacts. Key objectives include the identification of the appropriate mathematical/statistical model, description of the underlying dependence structure in the multivariate or the spatial case, and the investigation of the most relevant external factors. Extreme value analysis (EVA), based on Extreme Value Theory, provides the necessary statistical tools. Assuming that all relevant covariates are known and observed, EVA often deploys statistical regression analysis to study the changes in the model parameters. Modeling of the dependence structure implies a priori assumptions such as Gaussian, locally stationary, or isotropic behavior. Based on EVA and advanced time-series analysis methodology, this thesis introduces a semiparametric, nonstationary and non-homogeneous framework for statistical regression analysis of spatio-temporal extremes. The involved regression analysis accounts explicitly for systematically missing covariates; their influence was reduced to an additive nonstationary offset. The nonstationarity was resolved by the Finite Element Time Series Analysis Methodology (FEM). FEM approximates the underlying nonstationarity by a set of locally stationary models and a nonstationary hidden switching process with bounded variation (BV). The resulting FEM-BV-EVA approach goes beyond the a priori assumptions of standard methods based, for instance, on Bayesian statistics, Hidden Markov Models, or local kernel smoothing. The multivariate/spatial extension of FEM-BV-EVA describes the underlying spatial variability by the model parameters, referring to hierarchical modeling.
The spatio-temporal behavior of the model parameters was approximated by locally stationary models and a spatial nonstationary switching process. Further, it was shown that the resulting spatial FEM-BV-EVA formulation is consistent with the max-stability postulate and describes the underlying dependence structure in a nonparametric way. The proposed FEM-BV-EVA methodology was integrated into the existing FEM MATLAB toolbox. The FEM-BV-EVA framework is computationally efficient, as it deploys gradient-free MCMC-based optimization methods and numerical solvers for constrained, large, structured quadratic and linear problems. To demonstrate its performance, FEM-BV-EVA was applied to various test cases and real data and compared to standard methods. It was shown that parametric approaches lead to biased results if significant covariates are unresolved. Comparison to nonparametric methods based on smoothing regression revealed their weaknesses: the locality property and the inability to resolve discontinuous functions. Spatial FEM-BV-EVA was applied to study the dynamics of extreme precipitation over Switzerland. The analysis identified, among others, three major spatially dependent regions.
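The stationary starting point of such an EVA, a block-maxima GEV fit with a return-level estimate, can be sketched as follows. This is an illustrative example on synthetic data, not the thesis's FEM-BV-EVA method, which additionally makes the GEV parameters nonstationary in time and space; all numerical values are assumptions.

```python
import numpy as np
from scipy import stats

# Synthetic daily series (e.g. mm of precipitation); Gumbel noise and
# the 40-year record length are assumptions for demonstration.
rng = np.random.default_rng(3)
years, days = 40, 365
daily = rng.gumbel(loc=10.0, scale=3.0, size=(years, days))

# Block maxima: one annual maximum per year.
block_maxima = daily.max(axis=1)

# Stationary GEV fit to the annual maxima.
c, loc, scale = stats.genextreme.fit(block_maxima)

# m-year return level: the quantile exceeded on average once every m years.
m = 50
return_level = stats.genextreme.ppf(1.0 - 1.0 / m, c, loc, scale)
print(f"shape={c:.3f} 50-year return level={return_level:.1f}")
```

In the nonstationary setting of the thesis, `loc`, `scale`, and `c` would instead be regression functions of covariates and of the hidden FEM-BV switching process.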
Modern applications of machine learning in quantum sciences
In these Lecture Notes, we provide a comprehensive introduction to the most
recent advances in the application of machine learning methods in quantum
sciences. We cover the use of deep learning and kernel methods in supervised,
unsupervised, and reinforcement learning algorithms for phase classification,
representation of many-body quantum states, quantum feedback control, and
quantum circuit optimization. Moreover, we introduce and discuss more
specialized topics such as differentiable programming, generative models,
statistical approaches to machine learning, and quantum machine learning.
Comment: 268 pages, 87 figures. Comments and feedback are very welcome.
Figures and tex files are available at
https://github.com/Shmoo137/Lecture-Note