
    Intrinsic adaptation in autonomous recurrent neural networks

    A massively recurrent neural network responds on one side to input stimuli and is, on the other side, autonomously active in the absence of sensory inputs. Stimulus and information processing depend crucially on the qualia of the autonomous-state dynamics of the ongoing neural activity. This default neural activity may be dynamically structured in time and space, showing regular, synchronized, bursting or chaotic activity patterns. We study the influence of non-synaptic plasticity on the default dynamical state of recurrent neural networks. The non-synaptic adaptation considered acts on intrinsic neural parameters, such as the threshold and the gain, and is driven by the optimization of the information entropy. In the presence of the intrinsic adaptation processes we observe three distinct and globally attracting dynamical regimes: a regular synchronized regime, an overall chaotic regime, and an intermittent bursting regime. The intermittent bursting regime is characterized by intervals of regular flow, which are quite insensitive to external stimuli, interspersed with chaotic bursts that respond sensitively to input signals. We discuss these findings in the context of self-organized information processing and critical brain dynamics. Comment: 24 pages, 8 figures
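
    The adaptation rule itself is not spelled out in the abstract. As a loose illustration of entropy-driven intrinsic plasticity, the sketch below uses the well-known single-neuron rule of Triesch, which adapts the gain a and threshold b of a sigmoidal unit so that its output distribution approaches an entropy-maximizing exponential with mean mu; all parameter values are illustrative, and the paper's rule for recurrent networks may differ.

        import numpy as np

        def intrinsic_update(x, y, a, b, eta=0.01, mu=0.1):
            """One step of Triesch-style intrinsic plasticity for y = sigmoid(a*x + b).

            Gradient step that drives the output distribution toward an
            exponential with mean `mu` (the maximum-entropy distribution
            for a fixed mean); `eta` is the learning rate.
            """
            db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y ** 2) / mu)
            da = eta / a + db * x
            return a + da, b + db

        # Toy usage: adapt gain and threshold of one unit driven by Gaussian input.
        rng = np.random.default_rng(0)
        a, b = 1.0, 0.0
        for _ in range(10000):
            x = rng.normal()
            y = 1.0 / (1.0 + np.exp(-(a * x + b)))
            a, b = intrinsic_update(x, y, a, b)
        print(f"adapted gain a={a:.2f}, threshold b={b:.2f}")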

    Applications of Biological Cell Models in Robotics

    In this paper I present some of the most representative biological models applied to robotics. In particular, this work surveys models inspired by, or making use of concepts from, gene regulatory networks (GRNs): these networks describe the complex interactions that affect gene expression and, consequently, cell behaviour.

    Cross frequency coupling in next generation inhibitory neural mass models

    Coupling among neural rhythms is one of the most important mechanisms at the basis of cognitive processes in the brain. In this study we consider a neural mass model, rigorously derived from the microscopic dynamics of an inhibitory spiking network with exponential synapses, that is able to autonomously generate collective oscillations (COs). These oscillations emerge via a super-critical Hopf bifurcation, and their frequencies are controlled by the synaptic time scale, the synaptic coupling and the excitability of the neural population. Furthermore, we show that two inhibitory populations in a master-slave configuration with different synaptic time scales can display various collective dynamical regimes: damped oscillations towards a stable focus, periodic and quasi-periodic oscillations, and chaos. Finally, when bidirectionally coupled, the two inhibitory populations can exhibit different types of theta-gamma cross-frequency coupling (CFC), namely phase-phase and phase-amplitude CFC. The coupling between theta and gamma COs is enhanced in the presence of an external theta forcing, reminiscent of the type of modulation induced in hippocampal and cortical circuits via optogenetic drive. Comment: 14 pages, 10 figures
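
    Models of this class have an exact mean-field form (Montbrió-Pazó-Roxin-type rate equations) for a population of quadratic integrate-and-fire neurons. A minimal sketch for a single inhibitory population with exponential synapses is given below; the equations follow the standard next-generation neural mass form, while the parameter values are assumptions rather than the paper's.

        import numpy as np

        def rhs(state, tau_m=0.02, tau_s=0.01, delta=0.3, eta=5.0, J=-10.0):
            """Mean-field of an inhibitory QIF population with exponential
            synapses: r is the population firing rate, v the mean membrane
            potential, s the synaptic activity (J < 0 for inhibition)."""
            r, v, s = state
            dr = (delta / (np.pi * tau_m) + 2.0 * r * v) / tau_m
            dv = (v ** 2 + eta - (np.pi * tau_m * r) ** 2 + J * tau_m * s) / tau_m
            ds = (-s + r) / tau_s
            return np.array([dr, dv, ds])

        # Euler integration; depending on the coupling and the synaptic time
        # scale, the stable focus can lose stability via a Hopf bifurcation,
        # giving rise to collective oscillations.
        dt, state = 1e-4, np.array([1.0, -1.0, 1.0])
        rates = []
        for _ in range(50000):
            state = state + dt * rhs(state)
            rates.append(state[0])
        print(f"rate over the last second: min={min(rates[-10000:]):.2f}, max={max(rates[-10000:]):.2f}")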

    Rhythmogenic neuronal networks, pacemakers, and k-cores

    Neuronal networks are controlled by a combination of the dynamics of individual neurons and the connectivity of the network that links them together. We study a minimal model of the preBötzinger complex, a small neuronal network that controls the breathing rhythm of mammals through periodic firing bursts. We show that the properties of such a randomly connected network of identical excitatory neurons are fundamentally different from those of uniformly connected neuronal networks as described by mean-field theory. We show that (i) the connectivity properties of the network determine the location of emergent pacemakers that trigger the firing bursts, and (ii) the collective desensitization that terminates the firing bursts is again determined by the network connectivity, through k-core clusters of neurons. Comment: 4+ pages, 4 figures, submitted to Phys. Rev. Lett.
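
    For reference, the k-core of a graph is the maximal subgraph in which every node has at least k neighbours, and it can be computed directly with networkx. The sketch below only illustrates the concept on an undirected random graph; the paper's preBötzinger model is a directed network of excitatory neurons, so this is not a reproduction of its setup.

        import networkx as nx

        # Sparse random connectivity, a stand-in for a randomly connected network.
        G = nx.gnp_random_graph(600, p=0.01, seed=1)

        # Nodes inside a high-k core have many mutually connected neighbours and
        # can sustain mutual excitation; in the paper's picture such clusters
        # determine where bursts are triggered and how they terminate.
        for k in range(1, 6):
            core = nx.k_core(G, k=k)
            print(f"k={k}: {core.number_of_nodes()} nodes in the k-core")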

    Nanophotonic reservoir computing with photonic crystal cavities to generate periodic patterns

    Reservoir computing (RC) is a technique in machine learning inspired by neural systems. RC has been used successfully to solve complex problems such as signal classification and signal generation. These systems are mainly implemented in software, and are thereby limited in speed and power efficiency. Several optical and optoelectronic implementations have been demonstrated, in which the system carries signals with both an amplitude and a phase. It has been shown that this enriches the dynamics of the system, which is beneficial for performance. In this paper, we introduce a novel optical architecture based on nanophotonic crystal cavities. This allows us to integrate many neurons on one chip, which, compared with other photonic solutions, most closely resembles a classical neural network. Furthermore, the components are passive, which simplifies the design and reduces the power consumption. To assess the performance of this network, we train a photonic network to generate periodic patterns, using an alternative online learning rule called first-order reduced and controlled error (FORCE). For this, we first train a classical hyperbolic tangent reservoir, but then vary some of its properties to incorporate typical aspects of a photonic reservoir, such as the use of continuous-time versus discrete-time signals and the use of complex-valued versus real-valued signals. Then, the nanophotonic reservoir is simulated, and we explore the role of relevant parameters such as the topology, the phases between the resonators, the number of nodes that are biased, and the delay between the resonators. It is important that these parameters are chosen such that no strong self-oscillations occur. Finally, our results show that for a signal-generation task a complex-valued, continuous-time nanophotonic reservoir outperforms a classical (i.e., discrete-time, real-valued) leaky hyperbolic tangent reservoir (normalized root-mean-square error NRMSE = 0.030 versus NRMSE = 0.127).
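
    As a point of reference, the reported figure of merit can be computed as in the sketch below. NRMSE conventions vary (normalization by the target's standard deviation or by its range); the convention assumed here is normalization by the standard deviation, which may differ from the paper's.

        import numpy as np

        def nrmse(y_pred, y_true):
            """Root-mean-square error normalized by the standard deviation of
            the target signal (one common convention)."""
            err = np.sqrt(np.mean((y_pred - y_true) ** 2))
            return err / np.std(y_true)

        # Toy check: a slightly noisy sine against the clean periodic target.
        t = np.linspace(0.0, 4.0 * np.pi, 1000)
        target = np.sin(t)
        pred = target + 0.05 * np.random.default_rng(0).normal(size=t.size)
        print(f"NRMSE = {nrmse(pred, target):.3f}")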

    Annotated Bibliography: Anticipation


    Autonomous Learning by Simple Dynamical Systems with Delayed Feedbacks

    We propose a general scheme for constructing dynamical systems that can learn to generate desired kinds of dynamics by adjusting their internal structure. The scheme involves intrinsic time-delayed feedback to steer the dynamics towards the target performance. As an example, we consider and investigate a system of coupled phase oscillators that, by changing the weights of the connections between its elements, can evolve to a dynamical state with a prescribed (low or high) level of synchronization.
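
    The abstract leaves the learning loop implicit. The paper's mechanism embeds a time-delayed feedback in the system's own dynamics; the sketch below externalizes it as a perturb-evaluate-keep loop over the coupling weights of a Kuramoto-type system, so it conveys the idea of learning a target synchronization level rather than the paper's exact scheme, and every parameter value is illustrative.

        import numpy as np

        rng = np.random.default_rng(2)
        N, dt, R_target = 10, 0.05, 0.9   # R_target: desired order parameter
        omega = rng.normal(0.0, 0.5, N)   # natural frequencies
        W = rng.normal(0.0, 0.5, (N, N))  # coupling weights to be learned
        theta = rng.uniform(0.0, 2.0 * np.pi, N)

        def order_parameter(theta):
            """Kuramoto order parameter R in [0, 1]; R near 1 means synchrony."""
            return abs(np.mean(np.exp(1j * theta)))

        def run_epoch(theta, W, steps=400):
            """Integrate the weighted Kuramoto dynamics for one epoch and
            return the final phases and the resulting coherence."""
            for _ in range(steps):
                coupling = (W * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
                theta = theta + dt * (omega + coupling / N)
            return theta, order_parameter(theta)

        # Delayed evaluation: a weight change is judged only after an epoch of
        # dynamics, and is kept only if the mismatch with the target shrank.
        theta, R = run_epoch(theta, W)
        for _ in range(200):
            trial = W + 0.1 * rng.normal(size=W.shape)
            theta_new, R_new = run_epoch(theta, trial)
            if abs(R_new - R_target) < abs(R - R_target):
                W, theta, R = trial, theta_new, R_new
        print(f"final order parameter R = {R:.2f} (target {R_target})")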

    Generating Coherent Patterns of Activity from Chaotic Neural Networks

    Neural circuits display complex activity patterns both spontaneously and when responding to a stimulus or generating a motor output. How are these two forms of activity related? We develop a procedure called FORCE learning for modifying synaptic strengths either external to or within a model neural network, to change chaotic spontaneous activity into a wide variety of desired activity patterns. FORCE learning works even though the networks we train are spontaneously chaotic and we leave feedback loops intact and unclamped during learning. Using this approach, we construct networks that produce a wide variety of complex output patterns, input-output transformations that require memory, multiple outputs that can be switched by control inputs, and motor patterns matching human motion-capture data. Our results reproduce data on premovement activity in motor and premotor cortex, and suggest that synaptic plasticity may be a more rapid and powerful modulator of network activity than generally appreciated.
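
    The core of FORCE learning is a recursive least-squares (RLS) update of the readout weights, applied while the output is continuously fed back into the recurrent network. The sketch below is a generic rate-network implementation of that idea with illustrative parameters (network size, time constants, target signal); it is not the authors' exact setup.

        import numpy as np

        rng = np.random.default_rng(0)
        N, dt, tau = 300, 1e-3, 1e-2
        g = 1.5                                            # g > 1: spontaneously chaotic
        J = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # recurrent weights
        w_fb = rng.uniform(-1.0, 1.0, N)                   # feedback from the readout
        w = np.zeros(N)                                    # readout weights, trained by RLS
        P = np.eye(N)                                      # running inverse correlation

        x = 0.5 * rng.normal(size=N)
        t = np.arange(0.0, 5.0, dt)
        target = np.sin(2.0 * np.pi * 2.0 * t)             # teach a 2 Hz sine

        for i, f in enumerate(target):
            r = np.tanh(x)
            z = w @ r                                      # network output
            x += dt / tau * (-x + J @ r + w_fb * z)        # feedback stays unclamped
            if i % 2 == 0:                                 # RLS update every other step
                Pr = P @ r
                k = Pr / (1.0 + r @ Pr)
                P -= np.outer(k, Pr)
                w -= (z - f) * k                           # first-order error correction
        print(f"final output error: {abs(w @ np.tanh(x) - target[-1]):.3f}")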