
    Analog Realization of Arbitrary One-Dimensional Maps

    An increasing number of applications of the one-dimensional (1-D) map as an information-processing element are found in the literature on artificial neural networks, image-processing systems, and secure communication systems. In search of an efficient hardware implementation of a 1-D map, we discovered that the bifurcating neuron (BN), introduced elsewhere as a mathematical model of a biological neuron under the influence of an external sinusoidal signal, could provide a compact solution. The original work on the BN indicated that its firing-time sequence, when it was subject to a sinusoidal driving signal, was related to the sine-circle map, suggesting that the BN can compute the sine-circle map. Despite its rich array of dynamical properties, the mathematical description of the BN is simple enough to lend itself to a compact circuit implementation. In this paper, we generalize the original work and show that the computational power of the BN can be extended to compute an arbitrary 1-D map. We also describe two possible circuit models of the BN: the programmable unijunction transistor oscillator neuron, which was introduced in the original work as a circuit model of the BN, and the integrated-circuit relaxation oscillator neuron (IRON), which was developed for more precise modeling of the BN. To demonstrate the computational power of the BN, we use the IRON to generate the bifurcation diagrams of the sine-circle map, the logistic map, and the tent map, and then compare them with exact numerical versions. Programming the BN to compute an arbitrary map is done simply by changing the waveform of the externally supplied driving signal; this feature makes the circuit models of the BN especially useful in the circuit implementation of a network of 1-D maps.
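The "exact numerical versions" of the bifurcation diagrams against which the IRON is compared can be sketched with a few lines of code. The following is a minimal numerical sketch for the logistic map, not a model of the BN circuit itself; all parameter values are illustrative assumptions.

```python
import numpy as np

def bifurcation_points(f, param_values, n_transient=500, n_keep=100, x0=0.5):
    """For each parameter value, iterate the 1-D map f(x, p), discard the
    transient, and collect the asymptotic orbit points."""
    points = []
    for p in param_values:
        x = x0
        for _ in range(n_transient):  # let the orbit settle onto its attractor
            x = f(x, p)
        orbit = []
        for _ in range(n_keep):       # record the attractor
            x = f(x, p)
            orbit.append(x)
        points.append(orbit)
    return np.array(points)  # shape (len(param_values), n_keep)

# Logistic map: x_{n+1} = r * x_n * (1 - x_n)
logistic = lambda x, r: r * x * (1.0 - x)

rs = np.linspace(2.5, 4.0, 300)
orbits = bifurcation_points(logistic, rs)
```

Plotting each column of `orbits` against `rs` reproduces the familiar period-doubling cascade; swapping in the sine-circle or tent map requires only a different `f`.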

    Synchrony-induced modes of oscillation of a neural field model

    We investigate the modes of oscillation of heterogeneous ring networks of quadratic integrate-and-fire (QIF) neurons with non-local, space-dependent coupling. Perturbations of the equilibrium state with a particular wave number produce transient standing waves with a specific temporal frequency, analogous to those in a tense string. In the neuronal network, the equilibrium corresponds to a spatially homogeneous, asynchronous state. Perturbations of this state excite the network’s oscillatory modes, which reflect the interplay of episodes of synchronous spiking with the excitatory-inhibitory spatial interactions. In the thermodynamic limit, an exact low-dimensional neural field model (QIF-NFM) describing the macroscopic dynamics of the network is derived. This allows us to obtain formulas for the Turing eigenvalues of the spatially homogeneous state, and hence to obtain its stability boundary. We find that the frequency of each Turing mode depends on the corresponding Fourier coefficient of the synaptic pattern of connectivity. The decay rate, in contrast, is identical for all oscillation modes, a consequence of the heterogeneity-induced desynchronization of the neurons. Finally, we numerically compute the spectrum of spatially inhomogeneous solutions branching from the Turing bifurcation, showing that similar oscillatory modes operate in neural bump states and are maintained away from onset.
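The network's building block, a single QIF neuron, can be simulated in a few lines. This is an illustrative Euler-integration sketch with the standard peak-and-reset rule; the parameter values are assumptions, not those of the paper.

```python
import numpy as np

def simulate_qif(I=1.0, v_peak=100.0, v_reset=-100.0, dt=1e-4, T=100.0):
    """Euler simulation of one quadratic integrate-and-fire neuron,
    dV/dt = V^2 + I, with V reset to v_reset on crossing v_peak."""
    v = 0.0
    spike_times = []
    for step in range(int(T / dt)):
        v += dt * (v * v + I)
        if v >= v_peak:            # spike: record time and reset
            spike_times.append(step * dt)
            v = v_reset
    return np.array(spike_times)

spikes = simulate_qif()
rate = len(spikes) / 100.0
# For I > 0 the QIF fires tonically; in the limit v_peak -> infinity the
# firing rate approaches sqrt(I) / pi.
```

With finite reset values the rate is slightly below the ideal sqrt(I)/pi, which the simulation reproduces.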

    Mechanisms of Zero-Lag Synchronization in Cortical Motifs

    Zero-lag synchronization between distant cortical areas has been observed in a diversity of experimental data sets and between many different regions of the brain. Several computational mechanisms have been proposed to account for such isochronous synchronization in the presence of long conduction delays: of these, the phenomenon of "dynamical relaying" - a mechanism that relies on a specific network motif - has proven to be the most robust with respect to parameter mismatch and system noise. Surprisingly, despite a contrary belief in the community, the common driving motif is an unreliable means of establishing zero-lag synchrony. Although dynamical relaying has been validated in empirical and computational studies, the deeper dynamical mechanisms and a comparison to dynamics on other motifs are lacking. By systematically comparing synchronization on a variety of small motifs, we establish that the presence of a single reciprocally connected pair - a "resonance pair" - plays a crucial role in disambiguating those motifs that foster zero-lag synchrony in the presence of conduction delays (such as dynamical relaying) from those that do not (such as the common driving triad). Remarkably, minor structural changes to the common driving motif that incorporate a reciprocal pair recover robust zero-lag synchrony. The findings are observed in computational models of spiking neurons, populations of spiking neurons, and neural mass models, and arise whether the oscillatory systems are periodic, chaotic, noise-free, or driven by stochastic inputs. The influence of the resonance pair is also robust to parameter mismatch and asymmetrical time delays amongst the elements of the motif. We call this manner of facilitating zero-lag synchrony resonance-induced synchronization, outline the conditions for its occurrence, and propose that it may be a general mechanism to promote zero-lag synchrony in the brain.
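The dynamical-relaying effect can be caricatured with three delay-coupled chaotic logistic maps rather than the paper's spiking or neural-mass models: two outer nodes coupled only through a relay node lock at zero lag despite the conduction delay, while neither locks to the relay itself. All parameters here are illustrative assumptions.

```python
import numpy as np

def simulate_relay(eps=0.8, tau=5, n_steps=3000, seed=0):
    """Relay motif x <-> r <-> y of delay-coupled logistic maps: the outer
    nodes x and y interact only through the relay node r, with a conduction
    delay of tau steps on every link."""
    f = lambda s: 4.0 * s * (1.0 - s)   # fully chaotic logistic map
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps); y = np.empty(n_steps); r = np.empty(n_steps)
    x[:tau + 1], y[:tau + 1], r[:tau + 1] = rng.uniform(0.1, 0.9, (3, tau + 1))
    for n in range(tau, n_steps - 1):
        x[n + 1] = (1 - eps) * f(x[n]) + eps * f(r[n - tau])
        y[n + 1] = (1 - eps) * f(y[n]) + eps * f(r[n - tau])
        r[n + 1] = (1 - eps) * f(r[n]) + eps * 0.5 * (f(x[n - tau]) + f(y[n - tau]))
    return x, y, r

x, y, r = simulate_relay()
# Outer nodes receive identical (delayed) drive from the relay and lock at
# zero lag; the relay node follows a distinct trajectory.
zero_lag_error = np.max(np.abs(x[-200:] - y[-200:]))
relay_mismatch = np.max(np.abs(x[-200:] - r[-200:]))
```

The zero-lag locking of x and y follows because their difference contracts under the local term alone, while the common delayed drive from r cancels out of the difference dynamics.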


    Research on Distributed Environment-Tolerant Nanoelectronic Devices


    Corticonic models of brain mechanisms underlying cognition and intelligence

    The concern of this review is brain theory or, more specifically, in its first part, a model of the cerebral cortex and the way it: (a) interacts with subcortical regions like the thalamus and the hippocampus to provide higher-level brain functions that underlie cognition and intelligence, (b) handles and represents dynamical sensory patterns imposed by a constantly changing environment, (c) copes with the enormous number of such patterns encountered in a lifetime by means of a dynamic memory that offers an immense number of stimulus-specific attractors for input patterns (stimuli) to select from, (d) selects an attractor through a process of “conjugation” of the input pattern with the dynamics of the thalamo–cortical loop, (e) distinguishes between redundant (structured) and non-redundant (random) inputs that are void of information, (f) can perform categorical perception when there is access to vast associative memory laid out in the association cortex with the help of the hippocampus, and (g) makes use of “computation” at the edge of chaos and information-driven annealing to achieve all this. Other features and implications of the concepts presented for the design of computational algorithms and machines with brain-like intelligence are also discussed. The material and results presented suggest that a Parametrically Coupled Logistic Map Network (PCLMN) is a minimal model of the thalamo–cortical complex, and that marrying such a network to a suitable associative memory with re-entry or feedback forms a useful, albeit abstract, model of a cortical module of the brain that could facilitate building a simple artificial brain. In the second part of the review, the results of numerical simulations and the conclusions drawn in the first part are linked to the most directly relevant works and views of other workers.
What emerges is a picture of brain dynamics on the mesoscopic and macroscopic scales that gives a glimpse of the nature of the long-sought brain code underlying intelligence and other higher-level brain functions. Physics of Life Reviews 4 (2007) 223–252.
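A minimal sketch of a network of parametrically coupled logistic maps, in the spirit of the PCLMN, is easy to write down: each unit iterates a logistic map whose parameter is modulated by the activity of the other units. The specific update rule, coupling matrix, and parameter values below are assumptions for illustration; the review's exact formulation differs.

```python
import numpy as np

def pclmn_step(x, W, r0=3.5, g=0.5):
    """One update of a parametrically coupled logistic map network: each
    unit's logistic parameter r_i is modulated by the weighted activity of
    the other units (illustrative rule, not the review's exact form).
    W is a row-normalized coupling matrix with zero diagonal."""
    r = np.clip(r0 + g * (W @ x - 0.5), 0.0, 4.0)  # keep r in the valid range
    return r * x * (1.0 - x)

rng = np.random.default_rng(1)
N = 16
W = rng.uniform(0, 1, (N, N))
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)   # row-normalize the coupling

x = rng.uniform(0.1, 0.9, N)
for _ in range(500):
    x = pclmn_step(x, W)
```

Because r stays in [0, 4], every unit's state remains confined to the unit interval while the parametric coupling lets the units modulate each other's local dynamics.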

    Analyzing the competition of gamma rhythms with delayed pulse-coupled oscillators in phase representation


    Dimension Reduction of Neural Models Across Multiple Spatio-temporal Scales

    In general, reducing the dimensionality of a complex model is a natural first step to gaining insight into the system. In this dissertation, we reduce the dimensions of models at three different scales: first at the scale of microscopic single neurons, second at the macroscopic scale of infinitely many neurons, and third at an in-between spatial scale of finite neural populations. Each model also exhibits a separation of timescales, making them amenable to the method of multiple timescales, which is the primary dimension-reduction tool of this dissertation. In the first case, the method of multiple timescales reduces the dynamics of two coupled n-dimensional neurons into one scalar differential equation representing the slow-timescale phase-locking properties of the oscillators as a function of an exogenous slowly varying parameter. This result extends the classic theory of weakly coupled oscillators. In the second case, the method reduces the spatio-temporal dynamics of "bump" solutions of a neural field model to their scalar coordinates, which are much easier to analyze analytically. This result generalizes existing studies on neural field spatio-temporal dynamics to the case of a smooth firing rate function and a general even kernel. In the third case, we reduce the dimension of the oscillators at the spiking level -- similar to the first case -- but with additional slowly varying synaptic variables. This result generalizes existing studies that use scalar oscillators and the Ott-Antonsen ansatz to reduce the dimensionality and determine the synchronization properties of large neural populations.
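The kind of scalar phase-difference equation the first case produces can be illustrated with the textbook reduction of two weakly coupled phase oscillators; the sinusoidal interaction function and all parameter values here are assumptions, not the dissertation's model.

```python
import numpy as np

def phase_difference(dw=0.2, K=0.5, dt=1e-3, T=200.0, phi0=3.0):
    """Slow dynamics of the phase difference phi = theta_2 - theta_1 of two
    weakly coupled phase oscillators with sinusoidal coupling, reduced (as in
    the classic theory of weakly coupled oscillators) to one scalar ODE:
        dphi/dt = dw - 2*K*sin(phi),
    where dw is the frequency mismatch. Locking requires |dw| <= 2*K."""
    phi = phi0
    for _ in range(int(T / dt)):       # forward Euler on the scalar ODE
        phi += dt * (dw - 2.0 * K * np.sin(phi))
    return phi

phi_star = phase_difference()
# The trajectory settles at a stable fixed point with sin(phi*) = dw / (2*K)
# and cos(phi*) > 0, i.e. the oscillators phase-lock at a constant lag.
```

Changing `dw` so that |dw| > 2K destroys the fixed point and phi drifts, which is the reduced-equation picture of losing phase locking.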