
    Signatures of criticality arise in simple neural population models with correlations

    Large-scale recordings of neuronal activity make it possible to gain insights into the collective activity of neural ensembles. It has been hypothesized that neural populations might be optimized to operate at a 'thermodynamic critical point', and that this property has implications for information processing. Support for this notion has come from a series of studies which identified statistical signatures of criticality in the ensemble activity of retinal ganglion cells. What are the underlying mechanisms that give rise to these observations? Here we show that signatures of criticality arise even in simple feed-forward models of retinal population activity. In particular, they occur whenever neural population data exhibit correlations and are randomly sub-sampled during analysis. These results show that signatures of criticality are not necessarily indicative of an optimized coding strategy, and challenge the utility of analysis approaches based on equilibrium thermodynamics for understanding partially observed biological systems. (Comment: 36 pages, LaTeX; added journal reference on page 1, added link to code repository)
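    A minimal sketch of the mechanism described above, assuming a dichotomized-Gaussian population (shared latent factor, thresholded to binary spikes) as the source of correlations and a population-count maximum-entropy model as the heat-capacity estimator; the model choice, parameter values, and function names are illustrative assumptions, not the paper's exact analysis. Randomly sub-sampling a correlated population and computing the specific heat of the count model shows the per-neuron specific heat growing with population size, one of the usual "signatures of criticality"; rerunning with rho = 0 (no correlations) keeps it roughly flat.

    import numpy as np
    from scipy.special import gammaln
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    def dichotomized_gaussian(n_neurons, n_samples, rate=0.05, rho=0.1):
        """Correlated binary 'spikes': threshold a latent Gaussian with a shared common factor."""
        thresh = norm.ppf(1.0 - rate)
        common = rng.standard_normal((n_samples, 1))
        private = rng.standard_normal((n_samples, n_neurons))
        latent = np.sqrt(rho) * common + np.sqrt(1.0 - rho) * private
        return (latent > thresh).astype(int)

    def heat_capacity_count_model(spikes):
        """Specific heat (at unit temperature) of the population-count maximum-entropy model:
        all patterns with K active cells share probability P(K) / binom(n, K), so the heat
        capacity equals the variance of -log P(pattern) under that model."""
        n_samples, n = spikes.shape
        counts = spikes.sum(axis=1)
        p_k = np.bincount(counts, minlength=n + 1) / n_samples
        k = np.arange(n + 1)
        log_binom = gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
        mask = p_k > 0
        log_p_state = np.log(p_k[mask]) - log_binom[mask]  # log-prob of one pattern with count k
        mean = np.sum(p_k[mask] * log_p_state)
        return np.sum(p_k[mask] * (log_p_state - mean) ** 2)

    data = dichotomized_gaussian(n_neurons=120, n_samples=50_000)
    for n in (10, 20, 40, 80, 120):
        subset = rng.choice(data.shape[1], size=n, replace=False)  # random sub-sampling
        c = heat_capacity_count_model(data[:, subset])
        print(f"n = {n:3d}   specific heat per neuron = {c / n:.3f}")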

    Artificial Neural Network Application in Environmental Engineering

    The objective of this thesis research is to apply two artificial neural network (ANN) methods, the back-propagation neural network (BPN) and the radial basis function generalized regression neural network (RBFGRNN), to two environmental engineering case studies in order to explore their ability to model complex environmental engineering systems. Traditionally, such systems have frequently been modeled using physics-based methods.
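    For context, the RBF-GRNN named above is, in its generic form (Specht, 1991), a kernel-weighted average of stored training targets governed by a single smoothing parameter. The sketch below is not taken from the thesis; the class, the smoothing value, and the synthetic data standing in for environmental measurements are assumptions for illustration.

    import numpy as np

    class GRNN:
        """Generalized regression neural network: a Nadaraya-Watson style kernel
        regressor with Gaussian radial basis functions and smoothing parameter sigma."""

        def __init__(self, sigma=0.7):
            self.sigma = sigma

        def fit(self, X, y):
            # A GRNN has no iterative training; it simply stores the training patterns.
            self.X_ = np.asarray(X, dtype=float)
            self.y_ = np.asarray(y, dtype=float)
            return self

        def predict(self, X):
            X = np.asarray(X, dtype=float)
            # Squared Euclidean distances between queries and stored patterns.
            d2 = ((X[:, None, :] - self.X_[None, :, :]) ** 2).sum(axis=-1)
            w = np.exp(-d2 / (2.0 * self.sigma ** 2))
            # Kernel-weighted average of the stored targets.
            return (w @ self.y_) / w.sum(axis=1)

    # Toy usage with synthetic data standing in for environmental measurements.
    rng = np.random.default_rng(1)
    X_train = rng.uniform(0, 10, size=(200, 2))
    y_train = np.sin(X_train[:, 0]) + 0.1 * X_train[:, 1] + 0.05 * rng.standard_normal(200)
    model = GRNN(sigma=0.7).fit(X_train, y_train)
    print(model.predict(X_train[:5]))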

    Power-Law Inter-Spike Interval Distributions Infer a Conditional Maximization of Entropy in Cortical Neurons

    Get PDF
    The brain is thought to carry out its efficient information processing with a relatively small energy budget. Under a severe restriction on energy consumption, the maximization of mutual information (MMI), which is adequate for designing artificial processing machines, may not be suitable for the brain. MMI attempts to transmit information as accurately as possible, which usually requires a sufficient energy supply to establish clearly discretized communication bands. Here, we derive an alternative hypothesis for the neural code from neuronal activity recorded juxtacellularly in the sensorimotor cortex of behaving rats. Our hypothesis states that in vivo cortical neurons maximize the entropy of neuronal firing under two constraints: one limiting the energy consumption (as assumed previously) and one restricting the uncertainty in output spike sequences at a given firing rate. Thus, the conditional maximization of firing-rate entropy (CMFE) resolves a tradeoff between energy cost and noise in the neuronal response. In short, the CMFE sends a rich variety of information through broader communication bands (i.e., widely distributed firing rates) at the cost of accuracy. We demonstrate that the CMFE is reflected in the long-tailed, typically power-law, distributions of inter-spike intervals obtained for the majority of recorded neurons. In other words, the power-law tails are more consistent with the CMFE than with the MMI. Thus, we propose a mathematical principle by which cortical neurons may represent information about their synaptic input in their output spike trains.
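    As an illustration (not the authors' derivation; the CMFE imposes its own pair of constraints on energy and on output uncertainty at a fixed rate), a generic maximum-entropy calculation shows how a logarithmic constraint alone already produces power-law interval statistics. Maximize the differential entropy of an inter-spike-interval density p(t) on [t_0, \infty),

        h[p] = -\int_{t_0}^{\infty} p(t)\,\ln p(t)\,dt,

    subject to normalization and a fixed value of \mathbb{E}[\ln(t/t_0)] = \mu. The stationarity condition of the Lagrangian is

        -\ln p(t) - 1 + \lambda_0 + \lambda_1 \ln(t/t_0) = 0
        \;\Longrightarrow\; p(t) \propto (t/t_0)^{\lambda_1}.

    Normalizability on [t_0, \infty) requires \lambda_1 = -\alpha with \alpha > 1, giving

        p(t) = \frac{\alpha - 1}{t_0}\left(\frac{t}{t_0}\right)^{-\alpha},
        \qquad \mathbb{E}[\ln(t/t_0)] = \frac{1}{\alpha - 1} = \mu,

    a pure power law whose exponent is fixed by the constraint. Adding an energy-like constraint on the mean interval, \mathbb{E}[t] = \tau, changes the solution to p(t) \propto t^{-\alpha} e^{-\beta t}: a power-law body with an exponential cutoff, i.e., the kind of long-tailed inter-spike-interval distribution discussed in the abstract.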