    Markov Blankets in the Brain

    Recent characterisations of self-organising systems depend upon the presence of a Markov blanket: a statistical boundary that mediates the interactions between what is inside and what is outside a system. We leverage this idea to provide an analysis of partitions in neuronal systems. This is applicable to brain architectures at multiple scales, enabling partitions into single neurons, brain regions, and brain-wide networks. This treatment is based upon the canonical micro-circuitry used in empirical studies of effective connectivity, so as to speak directly to practical applications. This depends upon the dynamic coupling between functional units, whose form recapitulates that of a Markov blanket at each level. The nuance afforded by partitioning neural systems in this way highlights certain limitations of modular perspectives of brain function that only consider a single level of description. (Comment: 25 pages, 5 figures, 1 table, glossary.)
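    For readers unfamiliar with the term, the conditional-independence statement usually taken to define a Markov blanket can be written as below. The notation (internal states μ, external states η, and blanket states b split into sensory s and active a states) follows the convention of this literature and is not quoted from the abstract itself.

```latex
% Internal states \mu and external states \eta are rendered conditionally
% independent by the blanket states b, often further partitioned into
% sensory (s) and active (a) states:
p(\mu, \eta \mid b) \;=\; p(\mu \mid b)\, p(\eta \mid b),
\qquad b = \{s, a\}.
```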

    Dynamics and network structure in neuroimaging data

    Bits from Biology for Computational Intelligence

    Computational intelligence is broadly defined as biologically inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyse neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). The material covered includes the necessary introduction to information theory and the estimation of information-theoretic quantities from neural data. We then show how to analyse the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment uniquely, redundantly, or synergistically together with others. Finally, we introduce the framework of local information dynamics, where information processing is decomposed into component processes of information storage, transfer, and modification, locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
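    As a rough illustration of two of the component measures named above (active information storage and transfer entropy), the following is a minimal plug-in estimator for discrete, e.g. binarised, time series. It assumes a history length of one step and simple frequency counts; it is a sketch of the idea, not the estimators used in the article, which also covers bias correction and continuous data.

```python
# Minimal sketch: plug-in estimators of two information-dynamics quantities
# for discrete time series, assuming history length k = 1 (illustrative only).
from collections import Counter
import numpy as np

def _probs(samples):
    """Empirical probability of each observed symbol (or tuple of symbols)."""
    counts = Counter(samples)
    total = sum(counts.values())
    return {s: c / total for s, c in counts.items()}

def active_information_storage(y):
    """I(y_t ; y_{t+1}): how much the next state is predicted by the immediate past."""
    p_joint = _probs(list(zip(y[:-1], y[1:])))
    p_past, p_next = _probs(y[:-1]), _probs(y[1:])
    return sum(p * np.log2(p / (p_past[a] * p_next[b]))
               for (a, b), p in p_joint.items())

def transfer_entropy(x, y):
    """TE(X -> Y): information x_t adds about y_{t+1} beyond y_t's own past."""
    p_nyx = _probs(list(zip(y[1:], y[:-1], x[:-1])))   # p(y_{t+1}, y_t, x_t)
    p_yx = _probs(list(zip(y[:-1], x[:-1])))           # p(y_t, x_t)
    p_ny = _probs(list(zip(y[1:], y[:-1])))            # p(y_{t+1}, y_t)
    p_y = _probs(y[:-1])                               # p(y_t)
    te = 0.0
    for (yn, yp, xp), p in p_nyx.items():
        p_cond_full = p / p_yx[(yp, xp)]               # p(y_{t+1} | y_t, x_t)
        p_cond_self = p_ny[(yn, yp)] / p_y[yp]         # p(y_{t+1} | y_t)
        te += p * np.log2(p_cond_full / p_cond_self)
    return te

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = tuple(rng.integers(0, 2, 10_000))
    y = tuple(np.roll(x, 1))                 # y copies x with a one-step delay
    print(transfer_entropy(x, y))            # ~1 bit: x fully determines y's next state
    print(active_information_storage(y))     # ~0 bits: y's own past is uninformative
```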

    Annotated Bibliography: Anticipation

    Sixty years of cybernetics: cybernetics still alive

    This informal essay, written on the occasion of the 60th anniversary of Wienerian cybernetics, presents a series of themes and ideas that have emerged during the last several decades and that have direct or indirect relationships to the principal concepts of cybernetics. Moreover, they share with the original cybernetics the same transdisciplinary character.

    Neuronal oscillations, information dynamics, and behaviour: an evolutionary robotics study

    Oscillatory neural activity is closely related to cognition and behaviour, with synchronisation mechanisms playing a key role in the integration and functional organisation of different cortical areas. Nevertheless, its informational content and relationship with behaviour, and hence cognition, are still to be fully understood. This thesis is concerned with better understanding the role of neuronal oscillations and information dynamics in the generation of embodied cognitive behaviours and with investigating the efficacy of such systems as practical robot controllers. To this end, we develop a novel model based on the Kuramoto model of coupled phase oscillators and perform three minimally cognitive evolutionary robotics experiments. The analyses focus both on a behavioural-level description, investigating the robot's trajectories, and on a mechanism-level description, exploring the variables' dynamics and the information transfer properties within and between the agent's body and the environment. The first experiment demonstrates that in an active categorical perception task under normal and inverted vision, networks with a definite, but not too strong, propensity for synchronisation are more able to reconfigure, to organise themselves functionally, and to adapt to different behavioural conditions. The second experiment relates assembly constitution and phase reorganisation dynamics to performance in supervised and unsupervised learning tasks. We demonstrate that assembly dynamics facilitate the evolutionary process, can account for varying degrees of stimulus modulation of the sensorimotor interactions, and can contribute to solving different tasks without recourse to other plasticity mechanisms. The third experiment explores an associative learning task considering a more realistic connectivity pattern between neurons. We demonstrate that networks with travelling waves as a default solution perform poorly compared to networks that are normally synchronised in the absence of stimuli. Overall, this thesis shows that neural synchronisation dynamics, when suitably flexible and reconfigurable, produce an asymmetric flow of information and can generate minimally cognitive embodied behaviours.
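    For context, the Kuramoto model referred to above describes phase oscillators coupled through the sine of their phase differences, with synchrony summarised by an order parameter r between 0 (incoherent) and 1 (fully synchronised). The sketch below uses an illustrative all-to-all coupling and arbitrary parameter values; the thesis embeds the oscillators in sensorimotor loops rather than simulating them in isolation.

```python
# Bare-bones Kuramoto model of coupled phase oscillators (illustrative parameters).
import numpy as np

def simulate_kuramoto(n=8, coupling=1.5, dt=0.01, steps=5000, seed=0):
    """Euler integration of d(theta_i)/dt = omega_i + (K/n) * sum_j sin(theta_j - theta_i)."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 1.0, n)          # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, n)     # initial phases
    sync = np.empty(steps)
    for t in range(steps):
        diffs = theta[None, :] - theta[:, None]          # theta_j - theta_i
        theta = theta + dt * (omega + (coupling / n) * np.sin(diffs).sum(axis=1))
        sync[t] = np.abs(np.exp(1j * theta).mean())      # order parameter r in [0, 1]
    return sync

if __name__ == "__main__":
    print("final synchrony r =", round(float(simulate_kuramoto()[-1]), 2))
```

    Varying the coupling strength K relative to the spread of natural frequencies reproduces the transition between weakly and strongly synchronised regimes that the first experiment exploits.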

    A view of Neural Networks as dynamical systems

    We consider neural networks from the point of view of dynamical systems theory. In this spirit we review recent results dealing with the following questions, addressed in the context of specific models: 1. Characterising the collective dynamics; 2. Statistical analysis of spike trains; 3. Interplay between dynamics and network structure; 4. Effects of synaptic plasticity. (Comment: review paper, 51 pages, 10 figures, submitted.)
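    To make the "networks as dynamical systems" viewpoint concrete, the sketch below integrates a small continuous-time firing-rate network and records its trajectory through state space. The review itself concerns spiking models; this rate model, with hypothetical parameter values, is only an illustrative stand-in for treating network activity as a trajectory governed by the coupling matrix.

```python
# Illustrative sketch: a firing-rate network dr/dt = -r + tanh(W r),
# integrated with Euler steps and viewed as a trajectory in state space.
import numpy as np

def simulate_rate_network(n=50, gain=1.5, dt=0.05, steps=2000, seed=1):
    rng = np.random.default_rng(seed)
    W = gain * rng.normal(0, 1 / np.sqrt(n), (n, n))   # random coupling matrix
    r = rng.normal(0, 0.1, n)                          # initial state
    trajectory = np.empty((steps, n))
    for t in range(steps):
        r = r + dt * (-r + np.tanh(W @ r))
        trajectory[t] = r
    return trajectory                                  # collective dynamics over time

if __name__ == "__main__":
    traj = simulate_rate_network()
    print("range visited by unit 0:", traj[:, 0].min(), "to", traj[:, 0].max())
```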