
    Effect of dilution in asymmetric recurrent neural networks

    We study with numerical simulations the possible limit behaviors of synchronous discrete-time deterministic recurrent neural networks composed of N binary neurons as a function of the network's level of dilution and asymmetry. The network dilution measures the fraction of neuron couples that are connected, and the network asymmetry measures to what extent the underlying connectivity matrix is asymmetric. For each given neural network, we study the dynamical evolution of all the different initial conditions, thus characterizing the full dynamical landscape without imposing any learning rule. Because the dynamics is deterministic, each trajectory converges to an attractor, which can be either a fixed point or a limit cycle. These attractors form the set of all the possible limit behaviors of the neural network. For each network, we then determine the convergence times, the lengths of the limit cycles, the number of attractors, and the sizes of the attractors' basins. We show that there are two network structures that maximize the number of possible limit behaviors. The first optimal network structure is fully connected and symmetric. By contrast, the second is highly sparse and asymmetric. The latter resembles what is observed in several biological neuronal circuits. These observations lead us to hypothesize that, independently of any given learning model, an efficient and effective biological network that stores a number of limit behaviors close to its maximum capacity tends to develop a connectivity structure similar to one of the optimal networks we found.
    Comment: 31 pages, 5 figures
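
    A minimal sketch of the kind of exhaustive attractor enumeration the abstract describes. The parameterization of dilution and asymmetry, and names such as `random_network` and `attractor_from`, are illustrative assumptions, not taken from the paper:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def random_network(n, dilution, asymmetry):
    """Random couplings: `dilution` = fraction of absent neuron couples,
    `asymmetry` = probability that J[j, i] is redrawn independently of J[i, j].
    (Assumed parameterization, for illustration only.)"""
    upper = np.triu(rng.standard_normal((n, n)), 1)
    J = upper + upper.T                                 # start fully symmetric
    redraw = np.tril(rng.random((n, n)) < asymmetry, -1)
    J[redraw] = rng.standard_normal(int(redraw.sum()))  # break symmetry below the diagonal
    keep = np.triu(rng.random((n, n)) < (1.0 - dilution), 1)
    J = J * (keep | keep.T)                             # dilute whole neuron couples
    np.fill_diagonal(J, 0.0)
    return J

def attractor_from(J, s):
    """Iterate s -> sign(J s) until a state repeats; return the attractor cycle,
    rotated to a canonical form so identical cycles hash equally."""
    seen, history = {}, []
    while tuple(s) not in seen:
        seen[tuple(s)] = len(history)
        history.append(tuple(s))
        s = np.where(J @ s >= 0, 1, -1)
    cycle = history[seen[tuple(s)]:]
    k = min(range(len(cycle)), key=lambda i: cycle[i])  # canonical rotation
    return tuple(cycle[k:] + cycle[:k])

n = 10
J = random_network(n, dilution=0.5, asymmetry=0.5)
attractors = {attractor_from(J, np.array(s)) for s in itertools.product((-1, 1), repeat=n)}
print(len(attractors), "attractors; cycle lengths:", sorted(len(a) for a in attractors))
```

    Enumerating all 2^N initial conditions is only feasible for small N, which is why a sketch like this is limited to N around 10-20; a length-1 cycle is a fixed point.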

    On the number of limit cycles in asymmetric neural networks

    Understanding the mechanisms underlying the functioning of complexly interconnected networks is one of the main goals of neuroscience. In this work, we investigate how the structure of recurrent connectivity influences the ability of a network to store patterns, and in particular limit cycles, by modeling a recurrent neural network with McCulloch-Pitts neurons as a content-addressable memory system. A key role in such models is played by the connectivity matrix, which, for neural networks, corresponds to a schematic representation of the "connectome": the set of chemical synapses and electrical junctions among neurons. The shape of the recurrent connectivity matrix plays a crucial role in the process of storing memories. This relation was already exposed in the work of Tanaka and Edwards, which presents a theoretical approach to evaluating the mean number of fixed points in a fully connected model in the thermodynamic limit. Interestingly, further studies of the same kind of model, but with a finite number of nodes, have shown how the symmetry parameter influences the types of attractors featured in the system. Our study extends the work of Tanaka and Edwards by providing a theoretical evaluation of the mean number of attractors of any given length L for different degrees of symmetry in the connectivity matrices.
    Comment: 35 pages, 12 figures
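
    In Tanaka-Edwards-type models, the symmetry parameter can be realized by drawing Gaussian couplings with a tunable correlation eta between J_ij and J_ji. The sketch below (my construction, not the paper's code) counts attractors by cycle length for a few values of eta on small networks:

```python
import itertools
from collections import Counter

import numpy as np

rng = np.random.default_rng(1)

def coupled_matrix(n, eta):
    """Gaussian couplings with corr(J_ij, J_ji) = eta
    (eta = 1 fully symmetric, eta = -1 fully antisymmetric)."""
    S = rng.standard_normal((n, n)); S = (S + S.T) / np.sqrt(2)   # symmetric part
    A = rng.standard_normal((n, n)); A = (A - A.T) / np.sqrt(2)   # antisymmetric part
    J = np.sqrt((1 + eta) / 2) * S + np.sqrt((1 - eta) / 2) * A
    np.fill_diagonal(J, 0.0)
    return J

def attractors_by_length(J):
    """Run s -> sign(J s) from every initial state; tally distinct cycles by length."""
    n = J.shape[0]
    cycles = set()
    for s0 in itertools.product((-1, 1), repeat=n):
        s, seen, hist = np.array(s0), {}, []
        while tuple(s) not in seen:
            seen[tuple(s)] = len(hist)
            hist.append(tuple(s))
            s = np.where(J @ s >= 0, 1, -1)
        cyc = hist[seen[tuple(s)]:]
        k = min(range(len(cyc)), key=lambda i: cyc[i])            # canonical rotation
        cycles.add(tuple(cyc[k:] + cyc[:k]))
    return Counter(len(c) for c in cycles)

n, trials = 8, 20
for eta in (1.0, 0.5, 0.0, -1.0):
    counts = Counter()
    for _ in range(trials):
        counts += attractors_by_length(coupled_matrix(n, eta))
    avg = {L: c / trials for L, c in sorted(counts.items())}
    print("eta =", eta, "-> mean number of attractors by length:", avg)
```

    At eta = 1 only fixed points and period-two cycles survive, while decreasing eta shifts weight toward longer cycles; the paper's theory addresses exactly this mean count as a function of L and the symmetry.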

    Multiplicative versus additive noise in multi-state neural networks

    The effects of a variable amount of random dilution of the synaptic couplings in Q-Ising multi-state neural networks with Hebbian learning are examined. A fraction of the couplings is explicitly allowed to be anti-Hebbian. Random dilution represents the dying or pruning of synapses and, hence, a static disruption of the learning process, which can be considered as a form of multiplicative noise in the learning rule. Both parallel and sequential updating of the neurons can be treated. Symmetric dilution in the statics of the network is studied using the mean-field-theory approach of statistical mechanics. General dilution, including asymmetric pruning of the couplings, is examined using the generating-functional (path-integral) approach of disordered systems. It is shown that random dilution acts as additive Gaussian noise in the Hebbian learning rule, with zero mean and a variance depending on the connectivity of the network and on the symmetry. Furthermore, a scaling factor appears that essentially measures the average amount of anti-Hebbian couplings.
    Comment: 15 pages, 5 figures; to appear in the proceedings of the Conference on Noise in Complex Systems and Stochastic Dynamics II (SPIE International
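
    A hedged numerical illustration of the claimed equivalence: diluting a Hebbian coupling matrix, with a fraction of the surviving couplings flipped to anti-Hebbian, behaves like rescaling the rule and adding zero-mean noise. The parameter names (`connectivity`, `f_anti`) and the rescaling convention are mine, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)

n, p, Q = 500, 20, 3                        # neurons, stored patterns, Q-Ising states
patterns = rng.choice(np.arange(Q) - Q // 2, size=(p, n))   # states {-1, 0, +1}

# Hebbian rule: J_ij = (1/n) * sum_mu xi_i^mu * xi_j^mu
J = patterns.T @ patterns / n
np.fill_diagonal(J, 0.0)

connectivity = 0.6                          # fraction of couplings surviving dilution
f_anti = 0.1                                # fraction of survivors flipped to anti-Hebbian
keep = rng.random((n, n)) < connectivity
sign = np.where(rng.random((n, n)) < f_anti, -1.0, 1.0)
J_diluted = J * keep * sign / connectivity  # rescale so the surviving signal is O(J)

# The abstract's claim, checked numerically: the diluted rule looks like the original
# rule times a scaling factor (1 - 2*f_anti) plus zero-mean additive noise.
scale = 1 - 2 * f_anti
noise = J_diluted - scale * J
print("noise mean (expected ~ 0):", noise.mean())
print("noise standard deviation :", noise.std())
```

    The factor 1 - 2*f_anti is the "scaling factor that essentially measures the average amount of anti-Hebbian couplings" in this toy parameterization.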

    Synchronous versus sequential updating in the three-state Ising neural network with variable dilution

    The three-state Ising neural network with synchronous updating and variable dilution is discussed starting from the appropriate Hamiltonians. The thermodynamic and retrieval properties are examined using replica mean-field theory. Capacity-temperature phase diagrams are derived for several values of the pattern activity and different gradations of dilution, and the information content is calculated. The results are compared with those for sequential updating. The effect of self-coupling is established. The dynamics is also studied using the generating-function technique for both synchronous and sequential updating. Typical flow diagrams for the overlap order parameter are presented. The differences with the signal-to-noise approach are outlined.
    Comment: 21 pages LaTeX, 12 EPS figures, and 1 PS figure
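
    A minimal dynamics sketch contrasting the two updating modes on a three-state network. The specific local rule (a quiet 0 state for fields below a threshold theta) and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, theta = 400, 10, 0.5                  # neurons, patterns, threshold for the 0 state

patterns = rng.choice([-1, 0, 1], size=(p, n))
J = patterns.T @ patterns / n
np.fill_diagonal(J, 0.0)

def three_state(h):
    """Zero-temperature rule: S = sign(h) if |h| > theta, else the quiet state 0."""
    return np.where(np.abs(h) > theta, np.sign(h), 0).astype(int)

def overlap(s, xi):
    """Retrieval overlap, normalized by the pattern activity."""
    return (s * xi).mean() / (xi * xi).mean()

s_sync = patterns[0].copy()                 # synchronous copy
s_seq = patterns[0].copy()                  # sequential copy
for t in range(10):
    s_sync = three_state(J @ s_sync)        # update all neurons at once
    for i in rng.permutation(n):            # update one random neuron at a time
        s_seq[i] = three_state(J[i] @ s_seq)
    print(f"t={t}  sync overlap={overlap(s_sync, patterns[0]):.3f}  "
          f"seq overlap={overlap(s_seq, patterns[0]):.3f}")
```

    Synchronous updating can additionally sustain period-two cycles that sequential updating forbids, which is one reason the two modes are compared in the paper.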

    An associative network with spatially organized connectivity

    We investigate the properties of an autoassociative network of threshold-linear units whose synaptic connectivity is spatially structured and asymmetric. Since the methods of equilibrium statistical mechanics cannot be applied to such a network due to the lack of a Hamiltonian, we approach the problem through a signal-to-noise analysis, which we adapt to spatially organized networks. The conditions are analyzed for the appearance of stable, spatially non-uniform profiles of activity with large overlaps with one of the stored patterns. It is also shown, with simulations and analytic results, that the storage capacity does not decrease much when the connectivity of the network becomes short-range. In addition, the method used here enables us to calculate exactly the storage capacity of a randomly connected network with an arbitrary degree of dilution.
    Comment: 27 pages, 6 figures; accepted for publication in JSTAT
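
    A small simulation sketch under stated assumptions: units on a ring, connection probability decaying exponentially with distance (an asymmetric random mask), a covariance Hebbian rule, and threshold-linear updates with the mean activity pinned for stability. None of these specific choices is claimed to be the paper's:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, sigma = 300, 5, 20.0                  # units on a ring, patterns, spatial range

x = np.arange(n)
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, n - d)                    # periodic (ring) distance
conn = rng.random((n, n)) < np.exp(-d / sigma)   # short-range, asymmetric mask
np.fill_diagonal(conn, False)

patterns = rng.exponential(1.0, size=(p, n))     # nonnegative firing-rate patterns
centered = patterns - patterns.mean()
J = (centered.T @ centered / n) * conn           # covariance Hebbian rule on the mask

s = np.maximum(patterns[0] + 0.3 * rng.standard_normal(n), 0.0)  # noisy cue of pattern 0
for _ in range(50):
    s = np.maximum(J @ s, 0.0)                   # threshold-linear update
    s *= patterns[0].mean() / (s.mean() + 1e-12) # pin mean activity for stability
print("correlation with the cued pattern:", np.corrcoef(s, patterns[0])[0, 1])
```

    Shrinking `sigma` makes the connectivity increasingly short-range, which is the regime where the abstract reports the capacity degrades only mildly.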

    Period-two cycles in a feed-forward layered neural network model with symmetric sequence processing

    The effects of dominant sequential interactions are investigated in an exactly solvable feed-forward layered neural-network model of binary units and patterns near saturation, in which the interaction consists of a Hebbian part and a symmetric sequential term. Phase diagrams of stationary states are obtained, and a new phase of cyclic correlated states of period two is found for a weak Hebbian term, independently of the number of condensed patterns c.
    Comment: 8 pages and 5 figures
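
    A sketch of this model class (parameter names and values are mine): the couplings combine a Hebbian term with a symmetric sequential term of strength b coupling each pattern to its cyclic neighbors mu+1 and mu-1, and the overlaps with all stored patterns are printed layer by layer so a period-two alternation can be spotted when the sequential term dominates:

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, b = 1000, 5, 1.5                      # units per layer, patterns, sequential strength

xi = rng.choice([-1, 1], size=(p, n))
hebb = xi.T @ xi / n                        # Hebbian part
seq = (np.roll(xi, -1, axis=0).T @ xi + np.roll(xi, 1, axis=0).T @ xi) / n
J = hebb + b * seq                          # symmetric sequential term (mu+1 and mu-1)

s = xi[0].copy()                            # feed pattern 0 into the first layer
for layer in range(1, 13):
    s = np.where(J @ s >= 0, 1, -1)         # states of the next layer
    print("layer", layer, "overlaps:", np.round(xi @ s / n, 2))
```

    In a feed-forward layered architecture, each update corresponds to propagation to the next layer rather than to a time step, so a period-two cycle appears as an alternation of the overlap pattern between consecutive layers.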

    Finite Connectivity Attractor Neural Networks

    We study a family of diluted attractor neural networks with a finite average number of (symmetric) connections per neuron. As in finite-connectivity spin glasses, their equilibrium properties are described by order-parameter functions, for which we derive an integral equation in the replica-symmetric (RS) approximation. A bifurcation analysis of this equation reveals the locations of the paramagnetic-to-recall and paramagnetic-to-spin-glass transition lines in the phase diagram. The line separating the retrieval phase from the spin-glass phase is calculated at zero temperature. All phase transitions are found to be continuous.
    Comment: 17 pages, 4 figures
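
    Not the replica calculation itself, but a Monte Carlo companion sketch one could use to probe the same family numerically: a symmetric Hebbian network with a finite mean number c of connections per neuron, run with Glauber dynamics at several temperatures to locate the loss of retrieval:

```python
import numpy as np

rng = np.random.default_rng(6)
n, p, c = 2000, 3, 10                       # neurons, patterns, mean connections per neuron

xi = rng.choice([-1, 1], size=(p, n))
mask = np.triu(rng.random((n, n)) < c / n, 1)
mask = mask | mask.T                        # symmetric graph, ~c links per neuron
J = (xi.T @ xi) / c * mask                  # Hebbian couplings on the sparse graph
np.fill_diagonal(J, 0.0)

def glauber_sweep(s, T):
    """One sequential sweep of Glauber dynamics at temperature T."""
    for i in rng.permutation(n):
        h = J[i] @ s
        s[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * h / T)) else -1
    return s

for T in (0.1, 0.5, 1.0, 2.0):
    s = xi[0].copy()                        # cue the first stored pattern
    for _ in range(20):
        s = glauber_sweep(s, T)
    print(f"T={T}: overlap with pattern 0 = {xi[0] @ s / n:.2f}")
```

    The temperature at which the cued overlap collapses gives a rough numerical counterpart to the paramagnetic-to-recall line that the paper obtains from the bifurcation analysis of the RS integral equation.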