
    Symmetric sequence processing in a recurrent neural network model with a synchronous dynamics

    The synchronous dynamics and the stationary states of a recurrent attractor neural network model with competing synapses between symmetric sequence processing and Hebbian pattern reconstruction are studied in this work, allowing for the presence of a self-interaction for each unit. Phase diagrams of stationary states are obtained, exhibiting phases of retrieval, symmetric and period-two cyclic states, as well as correlated and frozen-in states, in the absence of noise. The frozen-in states are destabilised by synaptic noise, and well-separated regions of correlated and cyclic states are obtained. Excitatory or inhibitory self-interactions yield enlarged phases of fixed-point or cyclic behaviour. Comment: Accepted for publication in Journal of Physics A: Mathematical and Theoretical
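    The kind of model described above can be sketched in a few lines. This is an illustrative toy only (the parameter values, pattern count, and cyclic pattern ordering are assumptions, not taken from the paper): binary units updated synchronously under couplings that combine a Hebbian term, a symmetric sequence term coupling each pattern to its successor, and a self-interaction on the diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 500, 3                 # units and stored patterns (illustrative)
nu, b, J0 = 1.0, 1.5, 0.5     # Hebbian weight, sequence weight, self-interaction

xi = rng.choice([-1, 1], size=(p, N))           # random binary patterns
J = (nu / N) * xi.T @ xi                        # Hebbian part
xin = np.roll(xi, -1, axis=0)                   # successor of each pattern (cyclic)
J += (b / (2 * N)) * (xin.T @ xi + xi.T @ xin)  # symmetric sequence term
np.fill_diagonal(J, J0)                         # uniform self-interaction

s = xi[0].copy()                                # start on pattern 0
for t in range(10):
    s = np.where(J @ s >= 0, 1, -1)             # synchronous (parallel) update
    m = xi @ s / N                              # overlaps with the stored patterns
    print(t, np.round(m, 2))
```

    With a dominant sequence term the overlaps tend to hop between consecutive patterns (period-two cyclic behaviour when the sequence term is symmetric), while a strong self-interaction J0 tends to freeze the current configuration.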

    Instability of frozen-in states in synchronous Hebbian neural networks

    The full dynamics of a synchronous recurrent neural network model with Ising binary units and a Hebbian learning rule with a finite self-interaction is studied in order to determine the stability to synaptic and stochastic noise of the frozen-in states that appear in the absence of both kinds of noise. Both the numerical simulation procedure of Eissfeller and Opper and a new alternative procedure that allows one to follow the dynamics over larger time scales have been used in this work. It is shown that synaptic noise destabilizes the frozen-in states and yields either retrieval or paramagnetic states for not-too-large stochastic noise. The indications are that the same results may follow in the absence of synaptic noise, for low stochastic noise. Comment: 14 pages and 4 figures; accepted for publication in J. Phys. A: Math. Gen.

    Period-two cycles in a feed-forward layered neural network model with symmetric sequence processing

    The effects of dominant sequential interactions are investigated in an exactly solvable feed-forward layered neural network model of binary units and patterns near saturation, in which the interaction consists of a Hebbian part and a symmetric sequential term. Phase diagrams of stationary states are obtained, and a new phase of cyclic correlated states of period two is found for a weak Hebbian term, independently of the number of condensed patterns c. Comment: 8 pages and 5 figures

    Spatially structured oscillations in a two-dimensional excitatory neuronal network with synaptic depression

    We study the spatiotemporal dynamics of a two-dimensional excitatory neuronal network with synaptic depression. Coupling between populations of neurons is taken to be nonlocal, while depression is taken to be local and presynaptic. We show that the network supports a wide range of spatially structured oscillations, which are suggestive of phenomena seen in cortical slice experiments and in vivo. The particular form of the oscillations depends on initial conditions and the level of background noise. Given an initial, spatially localized stimulus, activity evolves to a spatially localized oscillating core that periodically emits target waves. Low levels of noise can spontaneously generate several pockets of oscillatory activity that interact via their target patterns. Periodic activity in space can also organize into spiral waves, provided that there is some source of rotational symmetry breaking due to external stimuli or noise. In the high gain limit, no oscillatory behavior exists, but a transient stimulus can lead to a single, outward propagating target wave.

    Short term synaptic depression improves information transfer in perceptual multistability

    Competitive neural networks are often used to model the dynamics of perceptual bistability. Switching between percepts can occur through fluctuations and/or a slow adaptive process. Here, we analyze switching statistics in competitive networks with short term synaptic depression and noise. We start by analyzing a ring model that yields spatially structured solutions and complement this with a study of a space-free network whose populations are coupled with mutual inhibition. Dominance times arising from depression-driven switching can be approximated using a separation of timescales in both the ring and the space-free model. For purely noise-driven switching, we use energy arguments to justify how dominance times are exponentially related to input strength. We also show that a combination of depression and noise generates realistic distributions of dominance times. Unimodal functions of dominance times are more easily differentiated from one another using Bayesian sampling, suggesting that depression-induced switching transfers more information about stimuli than noise-driven switching. Finally, we analyze a competitive network model of perceptual tristability, showing that depression generates a memory of previous percepts based on the ordering of percepts. Comment: 26 pages, 15 figures
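    A space-free version of such a competitive network can be sketched as two mutually inhibiting populations with depressing synapses and additive noise (the sigmoid gain and all parameter values below are illustrative assumptions, not the paper's):

```python
import numpy as np

def f(x, gain=15.0, theta=0.2):
    """Sigmoidal firing-rate function (illustrative choice)."""
    return 1.0 / (1.0 + np.exp(-gain * (x - theta)))

w, I = 1.0, 0.45           # mutual-inhibition strength, common input
tau_d, beta = 200.0, 0.8   # depression recovery time, depletion rate
dt, T = 0.5, 4000
rng = np.random.default_rng(1)

u = np.array([0.9, 0.1])   # population activities
q = np.array([1.0, 1.0])   # depression variables (available synaptic resources)
dom = []
for _ in range(T):
    inp = I - w * (q * u)[::-1]        # each population inhibited by the other
    u += dt * (-u + f(inp)) + 0.02 * np.sqrt(dt) * rng.standard_normal(2)
    u = np.clip(u, 0.0, 1.0)
    q += dt * ((1.0 - q) / tau_d - beta * q * u)
    dom.append(int(u[0] > u[1]))       # which percept currently dominates
```

    With depression strong enough relative to the noise, the dominant population slowly depletes its synaptic resources until the suppressed one takes over, producing the alternation of dominance analyzed above.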

    Global analysis of parallel analog networks with retarded feedback

    We analyze the retrieval dynamics of analog "neural" networks with clocked sigmoid elements and multiple signal delays. Proving a conjecture by Marcus and Westervelt, we show that for delay-independent symmetric coupling strengths, the only attractors are fixed points and periodic limit cycles. The same result applies to a larger class of asymmetric networks that may be utilized to store temporal associations with a cyclic structure. We discuss implications for various learning schemes in the space-time domain.
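    A minimal discrete-time sketch of such a network (sizes, delays, and gain below are illustrative assumptions): each clocked sigmoid unit reads its inputs through per-connection delays, while the coupling strengths themselves are delay-independent and symmetric.

```python
import numpy as np

rng = np.random.default_rng(3)
n, max_delay, gain = 10, 3, 2.0
J = rng.standard_normal((n, n))
J = (J + J.T) / 2                                  # symmetric, delay-independent weights
tau = rng.integers(0, max_delay + 1, size=(n, n))  # per-connection signal delays

# Keep a short history of past states so delayed values can be read off.
hist = [rng.uniform(-1, 1, n) for _ in range(max_delay + 1)]
for t in range(500):
    # x_del[i, j] is unit j's output as seen by unit i, tau[i, j] ticks ago
    x_del = np.array([[hist[-1 - tau[i, j]][j] for j in range(n)]
                      for i in range(n)])
    x_new = np.tanh(gain * (J * x_del).sum(axis=1))  # clocked sigmoid update
    hist.append(x_new)
    hist = hist[-(max_delay + 1):]
```

    Per the result above, trajectories of this symmetric delayed system settle onto fixed points or periodic limit cycles rather than chaotic attractors.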

    The complexity of dynamics in small neural circuits

    Mean-field theory is a powerful tool for studying large neural networks. However, when the system is composed of a few neurons, macroscopic differences between the mean-field approximation and the real behavior of the network can arise. Here we introduce a study of the dynamics of a small firing-rate network with excitatory and inhibitory populations, in terms of local and global bifurcations of the neural activity. Our approach is analytically tractable in many respects, and sheds new light on the finite-size effects of the system. In particular, we focus on the formation of multiple branching solutions of the neural equations through spontaneous symmetry-breaking, since this phenomenon considerably increases the complexity of the dynamical behavior of the network. For these reasons, branching points may reveal important mechanisms through which neurons interact and process information, which are not accounted for by the mean-field approximation. Comment: 34 pages, 11 figures. Supplementary materials added, colors of figures 8 and 9 fixed, results unchanged
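    A two-population excitatory/inhibitory firing-rate model of the kind studied here can be sketched as follows (the gain function and every parameter value are illustrative assumptions, not the paper's):

```python
import numpy as np

def f(x):
    """Saturating gain function (illustrative choice)."""
    return np.tanh(np.maximum(x, 0.0))

# Excitatory (E) / inhibitory (I) firing-rate equations, forward Euler.
wEE, wEI, wIE, wII = 2.0, 2.5, 2.0, 0.5   # coupling strengths
hE, hI = 0.5, 0.3                          # external inputs
dt = 0.05
E, I = 0.1, 0.1
for _ in range(2000):
    dE = -E + f(wEE * E - wEI * I + hE)
    dI = -I + f(wIE * E - wII * I + hI)
    E, I = E + dt * dE, I + dt * dI
print(round(E, 3), round(I, 3))
```

    Sweeping the couplings or inputs of such a small system and tracking where new solution branches appear is the kind of bifurcation analysis the abstract describes; with only two populations, finite-size effects are the rule rather than the exception.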

    Relaxation, closing probabilities and transition from oscillatory to chaotic attractors in asymmetric neural networks

    Attractors in asymmetric neural networks with deterministic parallel dynamics were shown to present a "chaotic" regime at symmetry eta < 0.5, where the average length of the cycles increases exponentially with system size, and an oscillatory regime at high symmetry, where the typical length of the cycles is 2. We show, both with analytic arguments and numerically, that there is a sharp transition, at a critical symmetry eta_c = 0.33, between a phase where the typical cycles have length 2 and basins of attraction of vanishing weight, and a phase where the typical cycles are exponentially long in the system size and the weights of their attraction basins are distributed as in a Random Map with reversal symmetry. The time scale after which cycles are reached grows exponentially with the system size N, and the exponent vanishes in the symmetric limit, where T ~ N^(2/3). The transition can be related to the dynamics of the infinite system (where cycles are never reached), using the closing probabilities as a tool. We also study the relaxation of the function E(t) = -(1/N) sum_i |h_i(t)|, where h_i is the local field experienced by neuron i. In the symmetric system it plays the role of a Lyapunov function which drives the system towards its minima through steepest descent. This interpretation survives, even if only on average, for small asymmetry as well. Asymmetry acts like an effective temperature: the larger the asymmetry, the faster the relaxation of E, and the higher the asymptotic value reached. E reaches very deep minima at the fixed points of the dynamics, which are reached with vanishing probability, and attains a larger value on the typical attractors, which are cycles of length 2. Comment: 24 pages, 9 figures, accepted in Journal of Physics A: Math. Gen.
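    The quantities discussed above are easy to reproduce in a toy simulation (the network size, asymmetry level, and coupling statistics below are illustrative assumptions): iterate the deterministic parallel dynamics, record E(t), and detect the first repeated state to measure the cycle length.

```python
import numpy as np

def run(J, s, t_max=20000):
    """Iterate s(t+1) = sign(J s(t)) until a state repeats.
    Returns (cycle length, trace of E(t) = -(1/N) sum_i |h_i(t)|)."""
    seen, E_trace = {}, []
    for t in range(t_max):
        key = s.tobytes()
        if key in seen:
            return t - seen[key], E_trace
        seen[key] = t
        h = J @ s
        E_trace.append(-np.abs(h).mean())
        s = np.where(h >= 0, 1, -1)     # deterministic parallel update
    return None, E_trace

rng = np.random.default_rng(2)
N, eps = 32, 0.1                        # small asymmetry -> high symmetry regime
S = rng.standard_normal((N, N)); S = (S + S.T) / 2   # symmetric component
A = rng.standard_normal((N, N)); A = (A - A.T) / 2   # antisymmetric component
J = S + eps * A
np.fill_diagonal(J, 0.0)

cycle, E_trace = run(J, rng.choice([-1, 1], size=N))
```

    At this nearly symmetric setting a short cycle is reached quickly and E(t) relaxes downward along the way; increasing eps past the transition makes the cycles (and the time to reach them) grow exponentially with N, which is why the closing probabilities become the useful diagnostic.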

    Diversity of emergent dynamics in competitive threshold-linear networks: a preliminary report

    Threshold-linear networks consist of simple units interacting in the presence of a threshold nonlinearity. Competitive threshold-linear networks have long been known to exhibit multistability, where the activity of the network settles into one of potentially many steady states. In this work, we find conditions that guarantee the absence of steady states, while maintaining bounded activity. These conditions lead us to define a combinatorial family of competitive threshold-linear networks, parametrized by a simple directed graph. By exploring this family, we discover that threshold-linear networks are capable of displaying a surprisingly rich variety of nonlinear dynamics, including limit cycles, quasiperiodic attractors, and chaos. In particular, several types of nonlinear behaviors can co-exist in the same network. Our mathematical results also enable us to engineer networks with multiple dynamic patterns. Taken together, these theoretical and computational findings suggest that threshold-linear networks may be a valuable tool for understanding the relationship between network connectivity and emergent dynamics. Comment: 12 pages, 9 figures. Preliminary report
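    A minimal instance of a graph-defined competitive threshold-linear network can be sketched as follows. The graph here is a 3-cycle, inhibition is uniform except along graph edges (where it is weaker), and the specific weight values (-1 + eps along edges, -1 - delta otherwise) are an assumption for illustration rather than a quotation of the paper's construction.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Competitive threshold-linear network: dx/dt = -x + [W x + b]_+
n = 3
eps_w, delta = 0.25, 0.5
W = np.full((n, n), -1.0 - delta)    # strong inhibition everywhere...
for i in range(n):
    W[(i + 1) % n, i] = -1.0 + eps_w  # ...weaker along the 3-cycle edges
np.fill_diagonal(W, 0.0)
b = np.ones(n)                        # constant external drive

x = np.array([0.1, 0.0, 0.0])
dt = 0.01
traj = []
for _ in range(5000):                 # forward Euler integration
    x = x + dt * (-x + relu(W @ x + b))
    traj.append(x.copy())
```

    Because all off-diagonal weights are inhibitory, the activity stays bounded even though (for a cyclic graph like this one) no stable steady state exists; in simulations the activity tends to cycle through the three units in graph order.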

    Synchronization of electrically coupled resonate-and-fire neurons

    Electrical coupling between neurons is broadly present across brain areas and is typically assumed to synchronize network activity. However, intrinsic properties of the coupled cells can complicate this simple picture. Many cell types with strong electrical coupling have been shown to exhibit resonant properties, and the subthreshold fluctuations arising from resonance are transmitted through electrical synapses in addition to action potentials. Using the theory of weakly coupled oscillators, we explore the effect of both subthreshold and spike-mediated coupling on synchrony in small networks of electrically coupled resonate-and-fire neurons, a hybrid neuron model with linear subthreshold dynamics and discrete post-spike reset. We calculate the phase response curve using an extension of the adjoint method that accounts for the discontinuity in the dynamics. We find that both spikes and resonant subthreshold fluctuations can jointly promote synchronization. The subthreshold contribution is strongest when the voltage exhibits a significant post-spike elevation, or plateau. Additionally, we show that the geometry of trajectories approaching the spiking threshold causes a "reset-induced shear" effect that can oppose synchrony in the presence of network asymmetry, despite having no effect on the phase-locking of symmetrically coupled pairs.
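    A resonate-and-fire unit of the sort described above is a damped linear oscillator with a discrete post-spike reset; a minimal single-neuron sketch (the parameter values, drive, and reset point are illustrative assumptions):

```python
# Minimal resonate-and-fire unit: damped linear oscillation in (y, v)
# with a discrete reset after each threshold crossing.
b, omega = -0.1, 1.0          # damping rate and resonant frequency
v_th = 1.0                    # spike threshold on the voltage-like variable v
y_reset, v_reset = 1.0, 0.0   # post-spike reset point (assumed)
I_ext = 2.0                   # constant external drive (assumed)
dt, steps = 0.01, 5000

y, v = 0.0, 0.0
spikes = []                   # recorded spike times
for k in range(steps):
    dy = b * y - omega * v    # linear subthreshold dynamics
    dv = omega * y + b * v + I_ext
    y, v = y + dt * dy, v + dt * dv
    if v >= v_th:             # threshold crossing: record spike, apply reset
        spikes.append(k * dt)
        y, v = y_reset, v_reset
```

    The subthreshold trajectory spirals toward the drive-determined fixed point between spikes; the choice of reset point shapes the post-spike voltage course (e.g. a plateau), which is the feature the abstract identifies as strengthening the subthreshold contribution to synchrony.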