
    Transient dynamics for sequence processing neural networks

    An exact solution of the transient dynamics for a sequential associative memory model is discussed through both the path-integral method and statistical neurodynamics. Although the path-integral method can give an exact solution of the transient dynamics, only stationary properties have previously been discussed for the sequential associative memory. We have succeeded in deriving an exact macroscopic description of the transient dynamics by analyzing the correlations of the crosstalk noise. Surprisingly, the order-parameter equations of this exact solution are completely equivalent to those of statistical neurodynamics, an approximation theory that assumes the crosstalk noise to obey a Gaussian distribution. To examine our theoretical findings, we numerically obtain cumulants of the crosstalk noise. We verify that the third- and fourth-order cumulants are zero, and that the crosstalk noise is normally distributed even in the non-retrieval case. We show that the results obtained by our theory agree with those obtained by computer simulations. We have also found that the macroscopic unstable state completely coincides with the separatrix. Comment: 21 pages, 4 figures
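
    As an illustrative companion to this abstract, here is a minimal sketch of the kind of numerical check described: a sequential associative memory with random binary patterns and cyclic couplings, whose crosstalk-noise skewness and excess kurtosis are estimated across neurons at each retrieval step. The network size, loading level, and diagnostics are assumptions for illustration, not the paper's exact setup.

    import numpy as np
    from scipy.stats import skew, kurtosis

    rng = np.random.default_rng(0)
    N, p, T = 2000, 200, 12                # neurons, sequence length, steps (illustrative)

    # random binary patterns and cyclic sequence couplings
    # J_ij = (1/N) * sum_mu xi^{mu+1}_i xi^mu_j
    xi = rng.choice([-1.0, 1.0], size=(p, N))
    J = xi[(np.arange(p) + 1) % p].T @ xi / N

    s = xi[0].copy()                       # initialize the network on the first pattern
    for t in range(T):
        h = J @ s                          # local fields under synchronous dynamics
        m = xi[t % p] @ s / N              # overlap with the currently retrieved pattern
        z = h - xi[(t + 1) % p] * m        # crosstalk noise: field minus the signal term
        print(f"t={t:2d}  m={m:+.3f}  skew={skew(z):+.3f}  ex.kurt={kurtosis(z):+.3f}")
        s = np.where(h >= 0, 1.0, -1.0)    # next network state

    If the Gaussian picture holds, both cumulant estimates should stay near zero while the overlap tracks the sequence.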

    Attractor Metadynamics in Adapting Neural Networks

    Slow adaptation processes, like synaptic and intrinsic plasticity, abound in the brain and shape the landscape for the neural dynamics occurring on substantially faster timescales. At any given time the network is characterized by a set of internal parameters, which are adapting continuously, albeit slowly. This set of parameters defines the number and the location of the respective adiabatic attractors. The slow evolution of network parameters hence induces an evolving attractor landscape, a process which we term attractor metadynamics. We study the nature of the metadynamics of the attractor landscape for several continuous-time autonomous model networks. We find both first- and second-order changes in the location of adiabatic attractors and argue that the study of the continuously evolving attractor landscape constitutes a powerful tool for understanding the overall development of the neural dynamics.
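
    As a toy illustration of attractor metadynamics, the sketch below couples a fast rate unit to a slowly adapting threshold and periodically re-locates the stable fixed points of the frozen fast dynamics, i.e. the adiabatic attractors; the specific equations, gain, and adaptation rate are assumptions chosen for illustration, not the model networks studied in the paper.

    import numpy as np

    g, eps, dt, steps = 4.0, 0.01, 0.05, 8000   # gain, adaptation rate, step size, steps

    def fast_rhs(x, b):
        # fast rate dynamics; the slow threshold b is frozen on this timescale
        return -x + np.tanh(g * (x - b))

    def adiabatic_attractors(b):
        # stable fixed points of the frozen fast dynamics, located by sign changes
        grid = np.linspace(-1.5, 1.5, 1201)
        f = fast_rhs(grid, b)
        idx = np.nonzero(np.sign(f[:-1]) != np.sign(f[1:]))[0]
        # stable where f crosses zero from positive to negative
        return [round(grid[i], 3) for i in idx if f[i] > 0 > f[i + 1]]

    x, b = -0.2, 0.0
    for k in range(steps):
        x += dt * fast_rhs(x, b)        # fast neural dynamics
        b += dt * eps * (x - b)         # slow intrinsic adaptation of the threshold
        if k % 800 == 0:
            print(f"t={k*dt:6.1f}  x={x:+.3f}  b={b:+.3f}  attractors={adiabatic_attractors(b)}")

    In this toy system the adapting threshold slowly chases the state until one attractor annihilates in a saddle-node event and the activity jumps to the remaining attractor, a simple first-order change in the attractor landscape.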

    A Survey on Continuous Time Computations

    We provide an overview of theories of continuous-time computation. These theories allow us to understand both the hardness of questions related to continuous-time dynamical systems and the computational power of continuous-time analog models. We survey the existing models, summarize results, and point to relevant references in the literature.

    Sentient Networks

    In this paper we consider the question of whether a distributed network of sensors and data processors can form "perceptions" based on the sensory data. Because sensory data can have exponentially many explanations, the use of a central data processor to analyze the outputs from a large ensemble of sensors will in general introduce unacceptable latencies for responding to dangerous situations. A better idea is to use a distributed "Helmholtz machine" architecture in which the collective state of the network as a whole provides an explanation for the sensory data. Comment: PostScript, 14 pages
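
    For context, a minimal, centralized, single-layer sketch of the wake-sleep updates that underlie a Helmholtz machine is given below (the paper's contribution is a distributed architecture, which this sketch does not capture); the toy percepts, layer sizes, and learning rate are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    nv, nh, lr = 8, 2, 0.05                                 # visible units, latents, learning rate
    sigmoid = lambda u: 1.0 / (1.0 + np.exp(-u))
    sample = lambda prob: (rng.random(prob.shape) < prob).astype(float)

    R = rng.normal(0.0, 0.1, (nh, nv)); r0 = np.zeros(nh)   # recognition: data -> latent
    G = rng.normal(0.0, 0.1, (nv, nh)); g0 = np.zeros(nv)   # generative: latent -> data
    gb = np.zeros(nh)                                       # generative prior on latents

    # toy "sensory" data: two prototype percepts corrupted by bit-flip noise
    protos = np.array([[1, 1, 1, 1, 0, 0, 0, 0],
                       [0, 0, 0, 0, 1, 1, 1, 1]], dtype=float)

    for step in range(20000):
        v = protos[rng.integers(2)].copy()
        flips = rng.random(nv) < 0.05
        v[flips] = 1.0 - v[flips]
        # wake phase: recognize the datum, then fit the generative pathway to it
        h = sample(sigmoid(R @ v + r0))
        pv = sigmoid(G @ h + g0)
        G += lr * np.outer(v - pv, h); g0 += lr * (v - pv)
        gb += lr * (h - sigmoid(gb))
        # sleep phase: dream from the generative model, fit recognition to the dream
        hd = sample(sigmoid(gb))
        vd = sample(sigmoid(G @ hd + g0))
        ph = sigmoid(R @ vd + r0)
        R += lr * np.outer(hd - ph, vd); r0 += lr * (hd - ph)

    for j in range(nh):
        onehot = np.zeros(nh); onehot[j] = 1.0
        print(f"latent {j}: p(v) =", np.round(sigmoid(G @ onehot + g0), 2))

    After training, each latent unit typically comes to generate one of the two prototype percepts, so the latent state serves as an "explanation" of the sensory input.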

    Short term synaptic depression improves information transfer in perceptual multistability

    Competitive neural networks are often used to model the dynamics of perceptual bistability. Switching between percepts can occur through fluctuations and/or a slow adaptive process. Here, we analyze switching statistics in competitive networks with short-term synaptic depression and noise. We start by analyzing a ring model that yields spatially structured solutions and complement this with a study of a space-free network whose populations are coupled with mutual inhibition. Dominance times arising from depression-driven switching can be approximated using a separation of timescales in both the ring and the space-free model. For purely noise-driven switching, we use energy arguments to justify how dominance times are exponentially related to input strength. We also show that a combination of depression and noise generates realistic distributions of dominance times. Unimodal distributions of dominance times are more easily differentiated from one another using Bayesian sampling, suggesting that depression-induced switching transfers more information about stimuli than noise-driven switching. Finally, we analyze a competitive network model of perceptual tristability, showing that depression generates a memory of previous percepts based on the ordering of percepts. Comment: 26 pages, 15 figures
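
    A minimal sketch of a space-free model of this kind appears below: two mutually inhibiting populations with short-term synaptic depression and additive noise, from which dominance times are collected at each perceptual switch. The firing-rate function, parameter values, and noise model are illustrative assumptions rather than the paper's exact formulation.

    import numpy as np

    rng = np.random.default_rng(2)
    dt, T = 0.5, 1.0e5                          # time step and total duration (ms)
    tau, tau_d = 10.0, 500.0                    # rate and depression timescales (ms)
    beta, I, phi, sigma = 1.1, 1.2, 0.8, 0.06   # inhibition, input, depression rate, noise

    f = lambda x: 1.0 / (1.0 + np.exp(-20.0 * (x - 0.5)))   # steep firing-rate function

    u = np.array([1.0, 0.0])                    # population rates
    q = np.array([1.0, 1.0])                    # depression variables (synaptic efficacy)
    dom, last, winner = [], 0.0, 0
    for k in range(int(T / dt)):
        inh = beta * q[::-1] * u[::-1]          # cross-inhibition scaled by depressed efficacy
        noise = sigma * np.sqrt(dt / tau) * rng.standard_normal(2)
        u += dt / tau * (-u + f(I - inh)) + noise
        q += dt / tau_d * (1.0 - q - phi * q * u)
        w = int(u[1] > u[0])
        if w != winner:                         # a perceptual switch: record dominance time
            dom.append(k * dt - last)
            last, winner = k * dt, w

    dom = np.array(dom[1:])                     # drop the first, transient interval
    print(f"switches={dom.size}  mean={dom.mean():.0f} ms  CV={dom.std()/dom.mean():.2f}")

    With depression alone the switches are nearly periodic; adding noise broadens the dominance-time distribution, which is the regime the abstract argues produces realistic statistics.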