56 research outputs found

    Metastability: an emergent phenomenon in networks of spiking neurons

    No full text
    It is widely recognised that different brain areas perform different specialised functions. However, it remains an open question how different brain areas coordinate with each other and give rise to global brain states and high-level cognition. Recent theories suggest that transient periods of synchronisation and desynchronisation provide a mechanism for dynamically integrating and forming coalitions of functionally related neural areas, and that at these times conditions are optimal for information transfer. Empirical evidence from human resting state networks has shown a tendency for multiple brain areas to synchronise for short amounts of time, and for different synchronous groups to appear at different times. In dynamical systems terms, this behaviour resembles metastability, an intrinsically driven movement between transient, attractor-like states. However, the underlying mechanism that gives rise to these observed phenomena remains unknown. The thesis first establishes that oscillating neural populations display a great amount of spectral complexity, with several rhythms temporally coexisting in the same and different structures. The thesis next explores inter-band frequency modulation between neural oscillators. The results show that oscillations in different neural populations, and in different frequency bands, modulate each other so as to change frequency. Further to this, the interaction of these fluctuating frequencies in the network as a whole is able to drive different neural populations towards episodes of synchrony. Finally, a symbiotic relationship between metastability and underlying network structure is elucidated, in which the presence of plasticity, responding to the interactions between different neural areas, will naturally form modular small-world networks that in turn further promote metastability. This seemingly inevitable drive towards metastability in simulation suggests that it should also be present in biological brains. The conclusion drawn is that these key network characteristics, and the metastable dynamics they promote, facilitate versatile exploration, integration, and communication between functionally related neural areas, and thereby support sophisticated cognitive processing in the brain. Open Access
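    The abstract's central notion, metastability as an intrinsically driven movement between transient synchronous states, is commonly operationalised in this literature as the variance over time of a phase-synchrony index such as the Kuramoto order parameter. The minimal Python sketch below illustrates that convention; it is a generic illustration under that assumption, not code from the thesis, and the toy phase data are invented.

        import numpy as np

        def kuramoto_order_parameter(phases):
            """Instantaneous phase synchrony R(t) for a set of oscillators.

            phases: array of shape (n_oscillators, n_timesteps), in radians.
            Returns R(t) with values in [0, 1] for each timestep.
            """
            return np.abs(np.mean(np.exp(1j * phases), axis=0))

        def metastability_index(phases):
            """Variance of R(t) over time: near zero for steady full synchrony
            or steady asynchrony, large when the system switches between
            transient synchronous episodes."""
            return np.var(kuramoto_order_parameter(phases))

        # Toy usage with hypothetical drifting oscillator phases (illustration only).
        rng = np.random.default_rng(0)
        phases = np.cumsum(rng.normal(0.1, 0.05, size=(8, 5000)), axis=1)
        print(metastability_index(phases))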

    Criticality and its effect on other cortical phenomena

    Get PDF
    Neuronal avalanches are a cortical phenomenon defined by bursts of neuronal firing encapsulated by periods of quiescence. It has been found both in vivo and in vitro that neuronal avalanches follow a power law distribution, which is indicative of the system being within or near a critical state. A system is critical if it is poised between order and disorder, with the possibility of a minor event leading to a large chain reaction. This is also observed as a diverging correlation length between the system's components as it approaches the critical point. It has been shown that neuronal criticality is a scale-free phenomenon observed throughout the entire system as well as within each module of the system. At a small scale, neuronal networks produce avalanches which conform to power law-like distributions. At a larger scale, these systems consist of modules exhibiting long-range temporal correlations identifiable via Detrended Fluctuation Analysis (DFA). This phenomenon is hypothesised to affect network behaviour with regard to information processing, information storage, computational power, and stability: the Criticality Hypothesis. This thesis attempts to better understand critical neuronal networks and how criticality may link with other neuronal phenomena. The work begins by investigating the interplay of network connectivity, synaptic plasticity, and criticality. Using different network construction algorithms, the thesis demonstrates that Hebbian learning and Spike Timing Dependent Plasticity (STDP) robustly drive small networks towards a critical state. Moreover, the thesis shows that, while the initial distribution of synaptic weights plays a significant role in attaining criticality, the network's topology at the modular level has little or no impact. Using an expanded eight-module oscillatory spiking neural network, the thesis then shows the link between the different critical markers used when attempting to observe critical behaviour at different scales. The findings demonstrate that modules exhibiting power law-like behaviour also display long-range temporal correlations throughout the system. Furthermore, when modules no longer exhibit power law-like behaviour, they become uncorrelated or noisy. This shows a correlation between the power law-like behaviour observed within each module and the long-range temporal correlations between the modules. The thesis concludes by demonstrating how criticality may be linked with other related phenomena, namely metastability and dynamical complexity. Metastability is a global property of neuronal populations that migrate between attractor-like states. Metastability can be quantified by the variance of synchrony, a measure that has been hypothesised to capture the varying influence neuronal populations have over one another and over the system as a whole. The thesis shows a correlation between critical behaviour and metastability, where the latter is most reliably maximised only when the system is near the critical state. This conclusion is expected, as metastability, like criticality, reflects the interplay between the integrating and segregating tendencies of the system's components. Agreeing with previous findings, this suggests that metastable dynamics may be another marker of critical behaviour. A neural system is said to exhibit dynamical complexity if a balance of integrated and segregated activity occurs within the system.
A common attribute of critical systems is a balance between excitation and inhibition. The final part of the thesis attempts to understand how criticality may be linked with dynamical complexity. This work shows a possible connection between these phenomena, providing a foundation for further analysis. The thesis concludes with a discussion of the significant role criticality plays in determining the behaviour of neuronal networks. Open Access
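    The abstract names two of its markers explicitly: power law-like avalanche statistics and long-range temporal correlations identified via Detrended Fluctuation Analysis (DFA). The sketch below implements standard first-order DFA as a generic illustration of the latter; it follows the textbook formulation rather than the thesis's own code, and the example signal is synthetic.

        import numpy as np

        def dfa_exponent(signal, window_sizes):
            """Detrended Fluctuation Analysis with first-order (linear) detrending.

            Returns the scaling exponent alpha: alpha near 0.5 indicates
            uncorrelated noise; 0.5 < alpha < 1 indicates long-range
            temporal correlations.
            """
            profile = np.cumsum(signal - np.mean(signal))
            fluctuations = []
            for n in window_sizes:
                n_windows = len(profile) // n
                rms = []
                for k in range(n_windows):
                    segment = profile[k * n:(k + 1) * n]
                    t = np.arange(n)
                    # Remove the best linear fit (the local trend) from each window.
                    coeffs = np.polyfit(t, segment, 1)
                    detrended = segment - np.polyval(coeffs, t)
                    rms.append(np.sqrt(np.mean(detrended ** 2)))
                fluctuations.append(np.mean(rms))
            # alpha is the slope of log F(n) versus log n.
            alpha, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
            return alpha

        # Toy usage on a hypothetical amplitude-envelope time series (illustration only).
        rng = np.random.default_rng(1)
        envelope = 0.01 * np.cumsum(rng.normal(size=10000)) + rng.normal(size=10000)
        print(dfa_exponent(envelope, window_sizes=[16, 32, 64, 128, 256, 512]))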

    Dynamical principles in neuroscience

    Full text link
    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience? This work was supported by NSF Grant No. NSF/EIA-0130708, and Grant No. PHY 0414174; NIH Grant No. 1 R01 NS50945 and Grant No. NS40110; MEC BFI2003-07276, and Fundación BBVA

    A Survey on Reservoir Computing and its Interdisciplinary Applications Beyond Traditional Machine Learning

    Full text link
    Reservoir computing (RC), first applied to temporal signal processing, is a recurrent neural network in which neurons are randomly connected. Once initialized, the connection strengths remain unchanged. Such a simple structure turns RC into a non-linear dynamical system that maps low-dimensional inputs into a high-dimensional space. The model's rich dynamics, linear separability, and memory capacity then enable a simple linear readout to generate adequate responses for various applications. RC spans areas far beyond machine learning, since it has been shown that the complex dynamics can be realized in various physical hardware implementations and biological devices. This yields greater flexibility and shorter computation time. Moreover, the neuronal responses triggered by the model's dynamics shed light on understanding brain mechanisms that also exploit similar dynamical processes. While the literature on RC is vast and fragmented, here we conduct a unified review of RC's recent developments from machine learning to physics, biology, and neuroscience. We first review the early RC models, and then survey the state-of-the-art models and their applications. We further introduce studies on modeling the brain's mechanisms by RC. Finally, we offer new perspectives on RC development, including reservoir design, the unification of coding frameworks, physical RC implementations, and the interaction between RC, cognitive neuroscience and evolution. Comment: 51 pages, 19 figures, IEEE Access
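    The survey's description of RC, a random recurrent reservoir with fixed weights feeding a trained linear readout, can be made concrete with a minimal echo state network. The sketch below is a generic illustration of that architecture, not code from the paper; the hyperparameters (reservoir size, spectral radius, leak rate, ridge penalty) and the sine-wave prediction task are illustrative assumptions.

        import numpy as np

        class EchoStateNetwork:
            """Minimal echo state network: a random, fixed recurrent reservoir
            plus a linear readout trained by ridge regression."""

            def __init__(self, n_inputs, n_reservoir=200, spectral_radius=0.9,
                         leak=0.3, ridge=1e-6, seed=0):
                rng = np.random.default_rng(seed)
                self.W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
                W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
                # Rescale so the largest eigenvalue magnitude equals spectral_radius,
                # a common heuristic for keeping reservoir dynamics stable.
                W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
                self.W = W
                self.leak = leak
                self.ridge = ridge

            def _run(self, inputs):
                states = np.zeros((len(inputs), self.W.shape[0]))
                x = np.zeros(self.W.shape[0])
                for t, u in enumerate(inputs):
                    pre = self.W_in @ np.atleast_1d(u) + self.W @ x
                    x = (1 - self.leak) * x + self.leak * np.tanh(pre)
                    states[t] = x
                return states

            def fit(self, inputs, targets):
                X = self._run(inputs)
                # Only the readout weights are learned; the reservoir stays fixed.
                A = X.T @ X + self.ridge * np.eye(X.shape[1])
                self.W_out = np.linalg.solve(A, X.T @ targets)
                return self

            def predict(self, inputs):
                return self._run(inputs) @ self.W_out

        # Toy usage: one-step-ahead prediction of a sine wave (illustration only).
        t = np.linspace(0, 20 * np.pi, 2000)
        u, y = np.sin(t)[:-1], np.sin(t)[1:]
        esn = EchoStateNetwork(n_inputs=1).fit(u, y)
        print(np.mean((esn.predict(u) - y) ** 2))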

    25th annual computational neuroscience meeting: CNS-2016

    Get PDF
    The same neuron may play different functional roles in the neural circuits to which it belongs. For example, neurons in the Tritonia pedal ganglia may participate in variable phases of the swim motor rhythms [1]. While such neuronal functional variability is likely to play a major role in the delivery of the functionality of neural systems, it is difficult to study in most nervous systems. We work on the pyloric rhythm network of the crustacean stomatogastric ganglion (STG) [2]. Typically, network models of the STG treat neurons of the same functional type as a single model neuron (e.g. PD neurons), assuming the same conductance parameters for these neurons and implying their synchronous firing [3, 4]. However, simultaneous recording of PD neurons shows differences between the timings of spikes of these neurons. This may indicate functional variability of these neurons. Here we modelled separately the two PD neurons of the STG in a multi-neuron model of the pyloric network. Our neuron models comply with known correlations between conductance parameters of ionic currents. Our results reproduce the experimental finding of increasing spike time distance between spikes originating from the two model PD neurons during their synchronised burst phase. The PD neuron with the larger calcium conductance generates its spikes before the other PD neuron. Larger potassium conductance values in the follower neuron imply longer delays between spikes, see Fig. 17. Neuromodulators change the conductance parameters of neurons and maintain the ratios of these parameters [5]. Our results show that such changes may shift the individual contribution of two PD neurons to the PD-phase of the pyloric rhythm altering their functionality within this rhythm. Our work paves the way towards an accessible experimental and computational framework for the analysis of the mechanisms and impact of functional variability of neurons within the neural circuits to which they belong.
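    One plausible reading of the reported measure, the spike time distance between spikes of the two model PD neurons during their synchronised burst, is a simple pairing of the i-th spikes of each train. The sketch below illustrates that reading only; the pairing rule and the spike times are assumptions for illustration, not taken from the abstract or the underlying model.

        import numpy as np

        def paired_spike_time_distances(spikes_a, spikes_b):
            """Pair the i-th spikes of two neurons within a shared burst and
            return the absolute timing differences between the pairs."""
            n = min(len(spikes_a), len(spikes_b))
            return np.abs(np.asarray(spikes_a[:n]) - np.asarray(spikes_b[:n]))

        # Hypothetical spike times (ms) for two PD-like neurons in one burst;
        # the numbers are invented for illustration.
        pd1 = [100.0, 112.0, 125.0, 139.0]
        pd2 = [101.5, 115.0, 130.5, 147.0]
        print(paired_spike_time_distances(pd1, pd2))  # distances grow across the burst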

    25th Annual Computational Neuroscience Meeting: CNS-2016

    Get PDF
    Abstracts of the 25th Annual Computational Neuroscience Meeting: CNS-2016. Seogwipo City, Jeju-do, South Korea, 2–7 July 2016