
    Information processing in biological complex systems: a view to bacterial and neural complexity

    This thesis is a study of information processing in biological complex systems, seen from the perspective of dynamical complexity (the degree of statistical independence of a system as a whole with respect to its components, due to its causal structure). In particular, we investigate the influence of signaling functions in cell-to-cell communication in bacterial and neural systems. For each case, we determine the spatial and causal dependencies in the system dynamics from an information-theoretic point of view and relate them to the system's physiological capabilities. The main research content is presented in three chapters. First, we revisit previous theoretical work on synchronization, multi-stability, and clustering of a population of coupled synthetic genetic oscillators via quorum sensing. We provide an extensive numerical analysis of the spatio-temporal interactions and determine conditions in which the causal structure of the system leads to high dynamical complexity in terms of the associated metrics. Our results indicate that this complexity peaks at transitions between dynamical regimes and is maximized for transient multi-cluster oscillations associated with chaotic behaviour. Next, we introduce a model of a neuron-astrocyte network with bidirectional coupling via glutamate-induced calcium signaling. This study focuses on the impact of astrocyte-mediated potentiation on synaptic transmission. Our findings suggest that the information generated by the joint activity of the population of neurons is irreducible to its independent contributions because of the role of astrocytes. We relate these results to the shared information modulated by the spike synchronization imposed by the bidirectional feedback between neurons and astrocytes. We show that dynamical complexity is maximized when there is a balance between spike correlation and spontaneous spiking activity. Finally, the previous observations on neuron-glial signaling are extended to a large-scale system with community structure. Here we use a multi-scale approach to account for the spatio-temporal features of astrocytic signaling coupled with clusters of neurons. We investigate the interplay of astrocytes and spike-timing-dependent plasticity, at local and global scales, in the emergence of complexity and neuronal synchronization. We demonstrate the utility of astrocytes and learning in improving the encoding of external stimuli, as well as their ability to favour the integration of information at synaptic timescales and so exhibit a high intrinsic causal structure at the system level. Our approach and observations point to potential effects of astrocytes in sustaining more complex information processing in the neural circuitry.
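    The abstract does not spell out which complexity metric is used, but measures of this family typically build on multivariate integration: the sum of the components' marginal entropies minus their joint entropy, which is zero exactly when the components are statistically independent. Below is a minimal Python sketch of that quantity under a Gaussian assumption on the unit activities; the function names and example data are illustrative, not taken from the thesis.

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (nats) of a zero-mean Gaussian with covariance cov."""
    n = cov.shape[0]
    return 0.5 * np.log(((2 * np.pi * np.e) ** n) * np.linalg.det(cov))

def integration(X):
    """Integration I(X): sum of marginal entropies minus the joint entropy.
    Zero iff the units are statistically independent; grows with shared structure."""
    cov = np.cov(X)                            # X has shape (n_units, n_samples)
    marginals = sum(0.5 * np.log(2 * np.pi * np.e * cov[i, i])
                    for i in range(cov.shape[0]))
    return marginals - gaussian_entropy(cov)

# Example: independent noise gives I(X) near 0; a shared drive raises it.
rng = np.random.default_rng(0)
shared = rng.standard_normal(10_000)
X = rng.standard_normal((4, 10_000)) + 0.8 * shared
print(integration(X))
```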

    Computing with Synchrony


    Continuous attractor working memory and provenance of channel models

    The brain is a complex biological system composed of a multitude of microscopic processes, which together give rise to the computational abilities observed in everyday behavior. Neuronal modeling, comprising models of single neurons and neuronal networks at varying levels of biological detail, can bridge the gaps that are currently hard to constrain experimentally and provide mechanistic explanations of how these computations might arise. In this thesis, I present two parallel lines of research on neuronal modeling, situated at different levels of biological detail. First, I assess the provenance of voltage-gated ion channel models in an integrative meta-analysis that investigates a backlog of nearly 50 years of published research. To cope with the ever-increasing volume of research produced in the field of neuroscience, we need to develop methods for the systematic assessment and comparison of published work. As we demonstrate, neuronal models offer the intriguing possibility of performing automated quantitative analyses across studies via standardized simulated experiments. We developed protocols for the quantitative comparison of voltage-gated ion channels and applied them to a large body of published models, allowing us to assess the variety and temporal development of different models of the same ion channels over years of research. Beyond a systematic classification of the existing body of research, made available in an online platform, we show that our approach extends to large-scale comparisons of ion channel models with experimental data, thereby facilitating field-wide standardization of experimentally constrained modeling. Second, I investigate neuronal models of working memory (WM). How can cortical networks bridge the short time scales of their microscopic components, which operate on the order of milliseconds, to the behaviorally relevant time scales of seconds observed in WM experiments? I consider a candidate model: continuous attractor networks. These can implement WM for a continuum of possible spatial locations over several seconds and have been proposed as a model for the organization of prefrontal cortical networks. I first present a novel method for efficiently predicting the network-wide steady states from the underlying microscopic network properties. The method can be applied to predict and tune the "bump" shapes of continuous attractors implemented in networks of spiking neuron models connected by nonlinear synapses, which we demonstrate for saturating synapses involving NMDA receptors. In a second part, I investigate the computational role of short-term synaptic plasticity as a synaptic nonlinearity. Continuous attractor models are sensitive to the inevitable variability of biological neurons: variable neuronal firing and heterogeneous networks decrease the time over which memories are accurately retained, eventually leading to a loss of memory functionality on behaviorally relevant time scales. In theory and simulations, I show that short-term plasticity can control the time scale of memory retention, with facilitation and depression playing antagonistic roles in controlling the drift and diffusion of locations in memory. Finally, we place quantitative constraints on the combination of synaptic and network parameters under which continuous attractor networks can implement reliable WM in cortical settings.
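    The claim that facilitation and depression play antagonistic roles can be made concrete with the Tsodyks-Markram model, the standard formalism for short-term synaptic plasticity. A minimal per-spike sketch follows; the parameter values are illustrative and not taken from the thesis.

```python
import numpy as np

def tm_efficacies(spike_times, U=0.2, tau_f=1.5, tau_d=0.2):
    """Per-spike synaptic efficacy under Tsodyks-Markram short-term plasticity.
    u facilitates with repeated spikes and decays back to U (time constant tau_f);
    x is the available resource, depleted on release, recovering with tau_d."""
    u, x, last = U, 1.0, None
    out = []
    for t in spike_times:
        if last is not None:
            dt = t - last
            u = U + (u - U) * np.exp(-dt / tau_f)      # facilitation decays to U
            x = 1.0 + (x - 1.0) * np.exp(-dt / tau_d)  # resources recover to 1
        u += U * (1.0 - u)     # spike-triggered facilitation jump
        out.append(u * x)      # efficacy transmitted by this spike
        x *= (1.0 - u)         # resource depletion after release
        last = t
    return out

# A 50 Hz burst: facilitation first boosts efficacy, then depression wins.
print(tm_efficacies(np.arange(0.0, 0.2, 0.02)))
```

    With long tau_f and short tau_d (as above) the synapse is facilitation-dominated; swapping the magnitudes makes it depression-dominated, which is the antagonism the abstract exploits for controlling drift and diffusion.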

    Characterization and Compensation of Network-Level Anomalies in Mixed-Signal Neuromorphic Modeling Platforms

    Advancing the size and complexity of neural network models leads to an ever-increasing demand for computational resources for their simulation. Neuromorphic devices offer a number of advantages over conventional computing architectures, such as high emulation speed or low power consumption, but this usually comes at the price of reduced configurability and precision. In this article, we investigate the consequences of several such factors that are common to neuromorphic devices, specifically limited hardware resources, limited parameter configurability, and parameter variations. Our final aim is to provide an array of methods for coping with such inevitable distortion mechanisms. As a platform for testing our proposed strategies, we use an executable system specification (ESS) of the BrainScaleS neuromorphic system, which has been designed as a universal emulation back-end for neuroscientific modeling. We address the most essential limitations of this device in detail and study their effects on three prototypical benchmark network models within a well-defined, systematic workflow. For each network model, we start by defining quantifiable functionality measures, by which we then assess the effects of typical hardware-specific distortion mechanisms, both in idealized software simulations and on the ESS. For those effects that cause unacceptable deviations from the original network dynamics, we suggest generic compensation mechanisms and demonstrate their effectiveness. Both the suggested workflow and the investigated compensation mechanisms are largely back-end independent and do not require additional hardware configurability beyond that required to emulate the benchmark networks in the first place. We hereby provide a generic methodological environment for configurable neuromorphic devices that are targeted at emulating large-scale, functional neural networks.
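    The article's concrete compensation mechanisms are not detailed in the abstract, but a common core of such methods is closed-loop calibration: measure each unit's response, then adjust a configurable parameter to cancel fixed-pattern variation. A toy Python sketch of that idea follows; the threshold-linear "hardware" stand-in and all numbers are assumptions for illustration, not the BrainScaleS/ESS procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def measured_rate(gain, drive):
    """Stand-in for a hardware measurement: each unit carries a fixed-pattern
    gain mismatch and responds threshold-linearly to its input drive."""
    return np.maximum(gain * drive - 5.0, 0.0)

n_neurons = 100
gain = rng.normal(1.0, 0.2, n_neurons)   # ~20% device-to-device variation
drive = np.full(n_neurons, 20.0)         # per-neuron input parameter we can tune
target = 15.0                            # desired firing rate for every unit

for _ in range(50):                      # closed-loop per-neuron calibration
    drive += 0.1 * (target - measured_rate(gain, drive))

# Residual rate spread across units is small after calibration.
print(measured_rate(gain, drive).std())
```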

    Biologically plausible attractor networks

    Attractor networks have shown much promise as a neural network architecture that can describe many aspects of brain function. Much of the field of study around these networks has coalesced around pioneering work done by John Hopfield, and therefore many approaches have been strongly linked to the field of statistical physics. In this thesis I use existing theoretical and statistical notions of attractor networks, and introduce several biologically inspired extensions to an attractor network for which a mean-field solution has previously been derived. This attractor network is a computational neuroscience model that accounts for decision-making in the situation of two competing stimuli. By basing our simulation studies on such a network, we are able to study situations where mean-field solutions have been derived and use these as the starting case, which we then extend with large-scale integrate-and-fire attractor network simulations. The simulations are large enough to provide evidence that the results apply to networks of the size found in the brain. One factor that previous research has highlighted as very important to brain function is noise. Spiking-related noise influences processes such as decision-making, signal detection, short-term memory, and memory recall, even in the quite large networks found in the cerebral cortex, and this thesis aims to measure the effects of noise on biologically plausible attractor networks. Our results are obtained using a spiking neural network made up of integrate-and-fire neurons, and we focus on the stochastic transitions that this network undergoes. We examine two biologically relevant processes for which no mean-field solutions yet exist: graded firing rates and diluted connectivity. Representations in the cortex are often graded, and we find that noise in such networks may be larger than with binary representations. Further investigations showed that diluted connectivity reduces the effects of noise when the number of synapses onto each neuron is held constant. We also use the same attractor network framework to investigate the Communication through Coherence hypothesis, which states that synchronous oscillations, especially in the gamma range, can facilitate communication between neural systems. We show that information transfer from one network to a second occurs at a much lower strength of synaptic coupling between the networks than is required to produce coherence; information transmission can thus occur before any coherence is produced. This indicates that coherence is not needed for information transmission between coupled networks and raises a major question about the Communication through Coherence hypothesis. Overall, the results provide substantial contributions towards understanding the operation of attractor networks in the brain.
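    Since the results hinge on spiking-related noise in integrate-and-fire networks, a single noisy leaky integrate-and-fire unit makes the mechanism concrete: with sub-threshold drive, every spike is noise-triggered, and it is fluctuations of this kind that drive the stochastic transitions between attractor states. A minimal sketch, with illustrative parameters that are not the thesis's:

```python
import numpy as np

def lif_spikes(I, dt=1e-4, tau=0.02, v_rest=-0.070, v_th=-0.050,
               v_reset=-0.060, sigma=0.003):
    """Leaky integrate-and-fire neuron driven by input I (effective drive in
    volts) plus additive noise; returns spike times in seconds."""
    rng = np.random.default_rng(1)
    v, spikes = v_rest, []
    for step, drive in enumerate(I):
        noise = sigma * np.sqrt(dt / tau) * rng.standard_normal()
        v += (dt / tau) * (v_rest - v + drive) + noise  # Euler step of the LIF ODE
        if v >= v_th:
            spikes.append(step * dt)                    # threshold crossing: spike
            v = v_reset                                 # reset after the spike
    return spikes

# One second of constant sub-threshold drive: all spikes here are noise-driven.
print(len(lif_spikes(np.full(10_000, 0.018))))
```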

    27th Annual Computational Neuroscience Meeting (CNS*2018): Part One
