150 research outputs found

    Evaluating performance of neural codes in neural communication networks

    Information needs to be appropriately encoded to be reliably transmitted over a physical medium. Similarly, neurons have their own codes to convey information in the brain. Even though it is well-known that neurons exchange information using a pool of several protocols of spatio-temporal encodings, the suitability of each code and its performance as a function of the network parameters and external stimuli is still one of the great mysteries in neuroscience. This paper sheds light on this problem by considering small networks of chemically and electrically coupled Hindmarsh-Rose spiking neurons. We focus on the fundamental mathematical aspects of a class of temporal and firing-rate codes that result from the neurons' action potentials and phases, and quantify their performance by measuring the Mutual Information Rate, i.e. the rate of information exchange. A particularly interesting result concerns the performance of the codes with respect to the way neurons are connected. We show that pairs of neurons that have the largest rate of information exchange using the interspike-interval and firing-rate codes are not adjacent in the network, whereas the spiking-time and phase codes promote a large rate of information exchange between adjacent neurons. This result, if it could be extended to larger neural networks, would suggest that small microcircuits of fully connected neurons, also known as cliques, would preferably exchange information using temporal codes (spiking-time and phase codes), whereas on the macroscopic scale, where typically there will be pairs of neurons that are not directly connected due to the brain's sparsity, the most efficient codes would be the firing-rate and interspike-interval codes, with the latter being closely related to the former.
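The interspike-interval and firing-rate codes discussed above, together with a histogram-based mutual-information estimate of the kind commonly used to approximate the Mutual Information Rate, can be sketched as follows (a minimal illustration, not the authors' actual estimator; the function names and the bin count are our own choices):

```python
import numpy as np

def interspike_intervals(spike_times):
    """ISI code: differences between consecutive spike times."""
    return np.diff(spike_times)

def firing_rate(spike_times, window):
    """Firing-rate code: spike count per observation window."""
    return len(spike_times) / window

def mutual_information(x, y, bins=8):
    """Plug-in estimate of the mutual information (in bits) between
    two symbol sequences, computed from their joint histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                      # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])).sum())
```

Dividing an estimate like this by the observation time gives a rate in bits per unit time; identical sequences yield a large value, while independent ones yield a value near zero (up to a small positive estimation bias).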


    Editorial: Advancing our understanding of structure and function in the brain: Developing novel approaches for network inference and emergent phenomena

    The work published in this Research Topic advances our understanding of complex systems at large and of the inner workings of the brain. It sheds light on structural and functional brain properties, and on emergent and synchronization phenomena. It includes analytical and computational approaches as well as the development of mathematical-modeling methods pertaining to complex systems and network neuroscience. We expect that the reader will find them helpful in understanding complex systems in general, and also in understanding the inner workings of the brain and its states.

    Singular Monopoles from Cheshire Bows

    Singular monopoles are nonabelian monopoles with prescribed Dirac-type singularities. All of them are delivered by Nahm's construction. In practice, however, the effectiveness of the latter is limited to the cases of one or two singularities. We present an alternative construction of singular monopoles formulated in terms of Cheshire bows. To illustrate the advantages of our bow construction, we obtain an explicit expression for a U(2) gauge-group monopole with any given number of Dirac-type singularities.

    Production and transfer of energy and information in Hamiltonian systems

    We present novel results that relate energy and information transfer with sensitivity to initial conditions in chaotic multi-dimensional Hamiltonian systems. We show the relation among the Kolmogorov-Sinai entropy, the Lyapunov exponents, and upper bounds for the Mutual Information Rate calculated in the Hamiltonian phase space and on bi-dimensional subspaces. Our main result is that the net amount of energy transferred from kinetic to potential energy per unit of time is a power law of the upper bound for the Mutual Information Rate between kinetic and potential energies, and also a power law of the Kolmogorov-Sinai entropy. Therefore, transfer of energy is related to both transfer and production of information. However, the power-law nature of this relation means that a small increment of energy transferred leads to a relatively much larger increase in the information exchanged. We then propose an "experimental" implementation of a one-dimensional communication channel based on a Hamiltonian system, and calculate the actual rate with which information is exchanged between the first and last particles of the channel. Finally, a relation between our results and important quantities of thermodynamics is presented.
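The link between the Kolmogorov-Sinai entropy and the Lyapunov exponents invoked above is, via Pesin's identity, simply the sum of the positive exponents; for a Hamiltonian flow the exponents come in (+l, -l) pairs. A minimal sketch (illustrative only; the spectrum values below are hypothetical):

```python
def ks_entropy(lyapunov_spectrum):
    """Kolmogorov-Sinai entropy via Pesin's identity:
    the sum of the positive Lyapunov exponents (per unit time)."""
    return sum(l for l in lyapunov_spectrum if l > 0)

# For a Hamiltonian flow the exponents come in (+l, -l) pairs, and two
# of them vanish (along the flow and along the energy surface).
spectrum = [0.5, 0.1, 0.0, 0.0, -0.1, -0.5]   # hypothetical values
h_ks = ks_entropy(spectrum)                   # 0.5 + 0.1
```

Estimating such an entropy from the computed spectrum is what allows the energy-transfer rate to be compared against it as a power law.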

    Maintaining extensivity in evolutionary multiplex networks

    In this paper, we explore the role of network topology in maintaining the extensive property of entropy. We study analytically and numerically how the topology contributes to maintaining extensivity of entropy in multiplex networks, i.e. networks of subnetworks (layers), by means of the sum of the positive Lyapunov exponents, H_KS, a quantity related to entropy. We show that extensivity relies not only on the interplay between the coupling strengths of the dynamics associated with the intra (short-range) and inter (long-range) interactions, but also on the sum of the intra-degrees of the nodes of the layers. For the analytically treated networks of size N, among several other results, we show that if the sum of the intra-degrees (and the sum of inter-degrees) scales as N^(α+1), α > 0, extensivity can be maintained if the intra-coupling (and the inter-coupling) strength scales as N^(−α), when evolution is driven by the maximisation of H_KS. We then verify our analytical results by performing numerical simulations in multiplex networks formed by electrically and chemically coupled neurons.
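The scaling argument admits a quick dimensional check (a sketch under our reading of the abstract; `alpha` and the network sizes below are arbitrary): if the degree sum grows as N^(α+1) while the coupling strength shrinks as N^(−α), their product, which sets the total interaction strength, grows linearly in N, as extensivity of H_KS requires.

```python
def total_interaction(n, alpha):
    """Product of the degree-sum scaling n**(alpha + 1) and the
    coupling-strength scaling n**(-alpha): grows linearly in n."""
    degree_sum = n ** (alpha + 1)
    coupling = n ** (-alpha)
    return degree_sum * coupling

# Doubling the network size doubles the total interaction strength,
# so a quantity proportional to it (here, H_KS) remains extensive.
```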

    Bistable Firing Pattern in a Neural Network Model

    Excessively high neural synchronization has been associated with epileptic seizures, one of the most common brain diseases worldwide. A better understanding of neural synchronization mechanisms can thus help control or even treat epilepsy. In this paper, we study neural synchronization in a random network where nodes are neurons with excitatory and inhibitory synapses, and the neural activity of each node is described by the adaptive exponential integrate-and-fire model. In this framework, we verify that a decrease in the influence of inhibition can generate synchronization originating from a pattern of desynchronized spikes. The transition from desynchronized spikes to synchronized bursts of activity, induced by varying the synaptic coupling, emerges in a hysteresis loop due to bistability, where abnormal (excessively high synchronous) regimes exist. We verify that, for parameters in the bistability regime, a square current pulse can trigger excessively high (abnormal) synchronization, a process that can reproduce features of epileptic seizures. We then show that it is possible to suppress such abnormal synchronization by applying a small-amplitude external current to > 10% of the neurons in the network. Our results demonstrate that external electrical stimulation not only can trigger synchronous behavior but, more importantly, can be used to reduce abnormal synchronization and thus effectively control or treat epileptic seizures.
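The adaptive exponential integrate-and-fire (AdEx) model mentioned above can be sketched with a simple Euler integration (a minimal single-neuron illustration, not the authors' network code; the parameter values are a standard published set, and the input current is our arbitrary choice):

```python
import math

def adex_spikes(i_ext, t_max=200.0, dt=0.05):
    """Euler integration of a single AdEx neuron (Brette & Gerstner 2005).
    Units: mV, ms, pA, pF, nS. Returns the spike times."""
    c, g_l, e_l = 281.0, 30.0, -70.6        # capacitance and leak
    v_t, d_t = -50.4, 2.0                   # threshold and slope factor
    a, b, tau_w = 4.0, 80.5, 144.0          # adaptation parameters
    v_peak, v_reset = 0.0, -70.6            # spike cut-off and reset
    v, w, t, spikes = e_l, 0.0, 0.0, []
    while t < t_max:
        dv = (-g_l * (v - e_l) + g_l * d_t * math.exp((v - v_t) / d_t)
              - w + i_ext) / c
        dw = (a * (v - e_l) - w) / tau_w
        v += dt * dv
        w += dt * dw
        t += dt
        if v >= v_peak:                     # spike: reset and adapt
            spikes.append(t)
            v = v_reset
            w += b
    return spikes
```

With a sustained suprathreshold current the neuron spikes repeatedly, and each spike increments the adaptation variable w by b, which tends to lengthen the interspike intervals; this spike-frequency adaptation is the "adaptive" part of the model.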

    Evaluating performance of neural codes in model neural communication networks

    Information needs to be appropriately encoded to be reliably transmitted over physical media. Similarly, neurons have their own codes to convey information in the brain. Even though it is well-known that neurons exchange information using a pool of several protocols of spatio-temporal encodings, the suitability of each code and its performance as a function of network parameters and external stimuli is still one of the great mysteries in neuroscience. This paper sheds light on this problem by modeling small-size networks of chemically and electrically coupled Hindmarsh-Rose spiking neurons. We focus on a class of temporal and firing-rate codes that result from the neurons' membrane potentials and phases, and quantify their performance numerically by estimating the Mutual Information Rate, i.e. the rate of information exchange. Our results suggest that the firing-rate and interspike-interval codes are more robust to additive Gaussian white noise. In a network of four interconnected neurons and in the absence of such noise, pairs of neurons that have the largest rate of information exchange using the interspike-interval and firing-rate codes are not adjacent in the network, whereas the spike-timing and phase codes (temporal) promote a large rate of information exchange for adjacent neurons. If that result could be extended to larger neural networks, it would suggest that small microcircuits would preferably exchange information using temporal codes (spike-timing and phase codes), whereas on the macroscopic scale, where there would typically be pairs of neurons not directly connected due to the brain's sparsity, the firing-rate and interspike-interval codes would be the most efficient.