    Power-Law Inter-Spike Interval Distributions Infer a Conditional Maximization of Entropy in Cortical Neurons

    The brain is considered to use a relatively small amount of energy for its efficient information processing. Under a severe restriction on energy consumption, the maximization of mutual information (MMI), which is adequate for designing artificial processing machines, may not suit the brain. The MMI attempts to send information as accurately as possible, which usually requires a sufficient energy supply for establishing clearly discretized communication bands. Here, we derive an alternative hypothesis for the neural code from neuronal activities recorded juxtacellularly in the sensorimotor cortex of behaving rats. Our hypothesis states that in vivo cortical neurons maximize the entropy of neuronal firing under two constraints, one limiting the energy consumption (as assumed previously) and one restricting the uncertainty in output spike sequences at a given firing rate. Thus, the conditional maximization of firing-rate entropy (CMFE) solves a tradeoff between the energy cost and noise in neuronal responses. In short, the CMFE sends a rich variety of information through broader communication bands (i.e., widely distributed firing rates) at the cost of accuracy. We demonstrate that the CMFE is reflected in the long-tailed, typically power-law, distributions of inter-spike intervals obtained for the majority of recorded neurons. In other words, the power-law tails are more consistent with the CMFE than with the MMI. Thus, we propose a mathematical principle by which cortical neurons may represent information about synaptic input in their output spike trains.
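
    The exponent of such a long-tailed ISI distribution is typically estimated by maximum likelihood. Below is a minimal Python sketch of the standard continuous (Hill) estimator on synthetic data; the sample, exponent, and x_min cutoff are hypothetical stand-ins, not the authors' recordings or their CMFE derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ISI sample with a Pareto tail above x_min (in seconds).
# NumPy's pareto(a) draws, shifted as below, have density exponent a + 1,
# so the estimator should recover alpha_hat close to 2.5 here.
a, x_min = 1.5, 0.01
isi = x_min * (1.0 + rng.pareto(a, size=10_000))

# Continuous maximum-likelihood (Hill) estimator for the tail exponent:
# alpha_hat = 1 + n / sum(ln(x_i / x_min)), over samples x_i >= x_min.
tail = isi[isi >= x_min]
alpha_hat = 1.0 + tail.size / np.log(tail / x_min).sum()
print(f"estimated power-law exponent: {alpha_hat:.2f}")
```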

    Universal Organization of Resting Brain Activity at the Thermodynamic Critical Point

    Thermodynamic criticality describes emergent phenomena in a wide variety of complex systems. In the mammalian brain, the complex dynamics that spontaneously emerge from neuronal interactions have been characterized as neuronal avalanches, a form of critical branching dynamics. Here, we show that neuronal avalanches also reflect an organization of brain dynamics close to a thermodynamic critical point. We recorded spontaneous cortical activity in monkeys and humans at rest using high-density intracranial microelectrode arrays and magnetoencephalography, respectively. By numerically changing a control parameter equivalent to thermodynamic temperature, we observed typical critical behavior in cortical activities near the actual physiological condition, including the phase transition of an order parameter as well as the divergence of susceptibility and specific heat. Finite-size scaling of these quantities allowed us to derive robust critical exponents, highly consistent across monkeys and humans, that uncover a distinct yet universal organization of brain dynamics.
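
    The quantities named here, order parameter, susceptibility, and specific heat, are standard observables of thermodynamic criticality. As a generic textbook illustration (not the paper's MEG/microelectrode analysis), the Python sketch below runs Metropolis dynamics on a small 2D Ising lattice and shows the susceptibility peaking near the critical temperature; the lattice size, sweep counts, and temperatures are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def metropolis_sweep(spins, beta):
    """One Metropolis sweep of a 2D Ising lattice with periodic boundaries."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb            # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1

L, n_eq, n_meas = 16, 200, 400
for T in (1.8, 2.0, 2.269, 2.6, 3.0):          # exact T_c is about 2.269 in 2D
    beta = 1.0 / T
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(n_eq):                      # equilibrate before measuring
        metropolis_sweep(spins, beta)
    m = []
    for _ in range(n_meas):                    # measure after each sweep
        metropolis_sweep(spins, beta)
        m.append(abs(spins.mean()))
    m = np.array(m)
    chi = beta * L * L * m.var()               # susceptibility from fluctuations
    print(f"T = {T:.3f}   <|m|> = {m.mean():.3f}   chi = {chi:.2f}")
```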

    Bayesian inference in neural circuits and synapses

    Bayesian inference describes how to reason optimally under uncertainty. As the brain faces considerable uncertainty, it may be possible to understand aspects of neural computation using Bayesian inference. In this thesis, I address several questions within this broad theme. First, I show that confidence reports may, in some circumstances, be Bayes optimal, by taking a "doubly Bayesian" strategy: computing the Bayesian model evidence for several different models of participants' behaviour, one of which is itself Bayesian. Second, I address a related question concerning features of the probability distributions realised by neural activity. In particular, it has been shown that neural activity obeys Zipf's law, as do many other statistical distributions. We show that the emergence of Zipf's law is in fact unsurprising, as it follows from the existence of an underlying latent variable: firing rate. Third, I show that synaptic plasticity can be formulated as a Bayesian inference problem, and I give neural evidence in support of this proposition, based on the hypothesis that neurons sample from the resulting posterior distributions. Fourth, I consider how oscillatory excitatory-inhibitory circuits might perform inference by relating these circuits to a highly effective method for probabilistic inference: Hamiltonian Monte Carlo.
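
    The Zipf's-law point, that a shared latent variable such as firing rate is enough to produce Zipf-like statistics, can be illustrated numerically. The Python sketch below is a hypothetical toy, not the thesis's model: conditionally independent binary units share a broadly distributed latent rate, and the rank-frequency curve of the resulting population patterns is checked for a log-log slope near -1. The agreement is asymptotic in the number of units, so a small simulation is only suggestive.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)

# Hypothetical sizes: N binary "neurons", T time bins.
N, T = 10, 200_000
# Broadly distributed latent firing rate, shared across neurons in each bin.
p = rng.beta(0.5, 0.5, size=T)
# Spikes are conditionally independent given the latent rate.
spikes = (rng.random((T, N)) < p[:, None]).astype(np.uint8)

# Rank-frequency statistics of the observed population activity patterns.
counts = Counter(map(bytes, spikes))
freq = np.sort(np.fromiter(counts.values(), dtype=float))[::-1]
rank = np.arange(1, freq.size + 1)

# Zipf's law predicts freq ~ 1/rank, i.e. a log-log slope near -1.
slope = np.polyfit(np.log(rank), np.log(freq), 1)[0]
print(f"log-log rank-frequency slope: {slope:.2f} (Zipf: about -1)")
```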

    Testing the Power-Law Hypothesis of the Inter-Conflict Interval

    The severity of war, measured by battle deaths, follows a power-law distribution. Here, we demonstrate that a power law also holds in the temporal aspects of interstate conflicts. The critical quantity is the inter-conflict interval (ICI): the interval between the end of one conflict in a dyad and the start of the subsequent conflict in the same dyad. Using a battery of statistical tests, we confirmed that ICI samples compiled from the history of interstate conflicts from 1816 to 2014 follow a power-law distribution. We propose an information-theoretic model to account for the power-law properties of ICIs. The model predicts that the series of ICIs in each dyad is independently generated from an identical power-law distribution, which we confirmed by statistical examination of the autocorrelation of the ICI series. Our findings help us understand the nature of wars between normal states, the significance of which has increased since the Russian invasion of Ukraine in 2022.
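
    Power-law tests of this kind are commonly run with the Clauset-Shalizi-Newman procedure. The Python sketch below applies it via the powerlaw package (Alstott et al., 2014) to synthetic stand-in intervals; the actual ICI dataset and the authors' exact battery of tests are not reproduced here.

```python
import numpy as np
import powerlaw  # Alstott, Bullmore & Plenz (2014); pip install powerlaw

rng = np.random.default_rng(3)

# Synthetic stand-in for the ICI data (e.g. days between dyadic conflicts);
# the paper's analysis uses the 1816-2014 interstate conflict record.
icis = 30.0 * (1.0 + rng.pareto(1.5, size=2_000))

# MLE fit of a power-law tail, with x_min chosen by minimizing the
# Kolmogorov-Smirnov distance (the Clauset-Shalizi-Newman recipe).
fit = powerlaw.Fit(icis)
print(f"alpha = {fit.power_law.alpha:.2f}, x_min = {fit.power_law.xmin:.1f}")

# Log-likelihood-ratio comparison against a lognormal alternative:
# R > 0 favors the power law; p is the significance of the sign of R.
R, p = fit.distribution_compare('power_law', 'lognormal')
print(f"power law vs. lognormal: R = {R:.2f}, p = {p:.3f}")
```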