
    Statistical physics of neural systems with non-additive dendritic coupling

    How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be transmitted linearly or sublinearly by the dendritic compartments. Yet single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially close-by inputs. Here, we provide a statistical physics approach to study the impact of such non-additive dendritic processing on single-neuron responses and on the performance of associative memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites; this approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model of associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.
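    The network class described above is easy to illustrate in code. The following is a minimal sketch, not the paper's actual model: a Hopfield-style network with Hebbian couplings in which each neuron's input is split across dendritic branches and passed through an assumed supralinear branch nonlinearity before somatic summation. The network size, branch count, threshold, and nonlinearity form are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # neurons
P = 10    # stored patterns
B = 10    # dendritic branches per neuron (illustrative)

patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings, zero self-coupling
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0.0)

def branch_nonlinearity(x, theta=0.3):
    """Stand-in for supralinear dendritic summation: branch inputs
    exceeding a threshold are amplified. Not the paper's exact form."""
    return np.where(np.abs(x) > theta, 2.0 * x, x)

branches = np.array_split(np.arange(N), B)

def update(state):
    # Each neuron collects its input in B dendritic branches, applies the
    # nonlinearity per branch, then sums the branch outputs at the soma.
    soma = np.zeros(N)
    for b in branches:
        soma += branch_nonlinearity(J[:, b] @ state[b])
    return np.where(soma >= 0, 1, -1)

# Recall from a cue with 15% of the bits of pattern 0 flipped
state = patterns[0] * rng.choice([1, -1], size=N, p=[0.85, 0.15])
for _ in range(20):
    state = update(state)
print("overlap with stored pattern:", state @ patterns[0] / N)
```

    Sweeping B in such a sketch, from one branch (fully additive) up to N branches, is one way to probe the claim that an intermediate number of dendritic branches is optimal.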

    Storage Capacity Diverges with Synaptic Efficiency in an Associative Memory Model with Synaptic Delay and Pruning

    It is known that the storage capacity per synapse increases with synaptic pruning in a correlation-type associative memory model. However, the storage capacity of the entire network then decreases. To overcome this difficulty, we propose decreasing the connecting rate while keeping the total number of synapses constant by introducing delayed synapses. In this paper, a discrete synchronous-type model with both delayed synapses and their pruning is discussed as a concrete example of this proposal. First, we explain the Yanai-Kim theory using statistical neurodynamics; this theory provides macrodynamical equations for the dynamics of a network with serial delay elements. Next, exploiting the translational symmetry of these equations, we re-derive the macroscopic steady-state equations of the model using the discrete Fourier transform. The storage capacities are analyzed quantitatively. Furthermore, two types of synaptic pruning are treated analytically: random pruning and systematic pruning. It becomes clear that under both types of pruning, the storage capacity increases as the length of delay increases and the connecting rate of the synapses decreases, when the total number of synapses is held constant. Moreover, an interesting fact emerges: under random pruning, the storage capacity asymptotically approaches 2/π, whereas under systematic pruning it diverges in proportion to the logarithm of the length of delay, with proportionality constant 4/π. These results theoretically support the significance of pruning following an overgrowth of synapses in the brain and strongly suggest that the brain prefers to store dynamic attractors, such as sequences and limit cycles, rather than equilibrium states. Comment: 27 pages, 14 figures
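    To make the delayed-synapse construction concrete, here is a minimal simulation sketch under stated assumptions: a correlation-type sequence memory with L serial delay elements, each coupling matrix randomly pruned at connecting rate c and rescaled. N, P, L, and c are illustrative values, and the Yanai-Kim macrodynamical equations themselves are not reproduced; this is only a direct simulation of the microscopic dynamics.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 300   # neurons
P = 15    # length of the stored cyclic pattern sequence
L = 4     # number of serial delay elements (illustrative)
c = 0.5   # connecting rate after random pruning

xi = rng.choice([-1, 1], size=(P, N))

# One correlation-type coupling matrix per delay l: it maps the pattern
# shown l+1 steps in the past onto the next pattern in the sequence.
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(P):
        J[l] += np.outer(xi[(mu + l + 1) % P], xi[mu]) / N
    mask = rng.random((N, N)) < c   # random pruning at rate c
    J[l] *= mask / c                # rescale to preserve mean efficacy
    np.fill_diagonal(J[l], 0.0)

# Synchronous retrieval: the local field sums over the delayed past states.
history = [xi[(P - l) % P] for l in range(L)]  # seed with the true recent past
for t in range(1, 2 * P + 1):
    h = sum(J[l] @ history[l] for l in range(L))
    s = np.where(h >= 0, 1, -1)
    history = [s] + history[:-1]
    print(t, "overlap with expected pattern:", s @ xi[t % P] / N)
```

    The design choice worth noting is that every delay line pushes the network toward the same next pattern, so lengthening the delay adds signal while pruning thins each matrix, which is the trade-off the paper analyzes.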

    Adaptive optical networks using photorefractive crystals

    The capabilities of photorefractive crystals as media for holographic interconnections in neural networks are examined. Limitations on the density of interconnections and on the number of holographic associations that can be stored in photorefractive crystals are derived. Optical architectures for implementing various neural schemes are described, and experimental results are presented for one of these architectures.
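    Mathematically, a hologram that superposes M input/output associations acts, to first order, like an outer-product interconnection matrix, which is one way to see where the storage limits come from. A toy numerical sketch follows; the pattern sizes and count are arbitrary choices, and real photorefractive readout is of course analog and noisy rather than a clean matrix product.

```python
import numpy as np

rng = np.random.default_rng(2)

# M bipolar input/output associations, encoded optically in practice as
# intensity or phase patterns; here reduced to +/-1 vectors.
n_in, n_out, M = 64, 64, 5
X = rng.choice([-1.0, 1.0], size=(M, n_in))
Y = rng.choice([-1.0, 1.0], size=(M, n_out))

# Superposing M holographic gratings acts like a sum of outer products.
W = sum(np.outer(Y[m], X[m]) for m in range(M))

# Readout: illuminating with a stored input reconstructs its partner;
# crosstalk from the other associations grows with M and bounds capacity.
recall = np.sign(W @ X[0])
print("fraction of output bits recovered:", np.mean(recall == Y[0]))
```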

    Synapse efficiency diverges due to synaptic pruning following over-growth

    In the development of the brain, it is known that synapses are pruned following over-growth. This pruning following over-growth seems to be a universal phenomenon that occurs in almost all areas -- visual cortex, motor area, association area, and so on. It has been shown numerically that synapse efficiency is increased by systematic deletion. We discuss synapse efficiency in order to evaluate the effect of pruning following over-growth, and analytically show that synapse efficiency diverges as O(log c) in the limit where the connecting rate c becomes extremely small. Under a fixed synapse number criterion, there exists an optimal connecting rate that maximizes memory performance. Comment: 15 pages, 16 figures
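    The contrast between the two deletion schemes can be probed numerically. Below is a minimal sketch, not the paper's analysis: a Hopfield network whose couplings are pruned either at random or "systematically", implemented here as keeping the largest-magnitude couplings in the spirit of minimal-value deletion. N, P, the cue noise, and the retained fractions c are illustrative; one would expect the systematic scheme to preserve recall down to smaller c.

```python
import numpy as np

rng = np.random.default_rng(3)

N, P = 500, 20
xi = rng.choice([-1, 1], size=(P, N))
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def prune(J, c, systematic):
    """Keep a fraction c of the synapses: either the largest-|J_ij| ones
    (a stand-in for systematic deletion) or a random subset."""
    if systematic:
        thresh = np.quantile(np.abs(J), 1.0 - c)
        mask = np.abs(J) >= thresh
    else:
        mask = rng.random(J.shape) < c
    return J * mask

def recall_overlap(J, cue, target, steps=20):
    # Synchronous retrieval dynamics; overlap 1.0 means perfect recall.
    s = cue.copy()
    for _ in range(steps):
        s = np.where(J @ s >= 0, 1, -1)
    return s @ target / N

cue = xi[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
for c in (0.4, 0.2, 0.1):
    m_sys = recall_overlap(prune(J, c, True), cue, xi[0])
    m_rnd = recall_overlap(prune(J, c, False), cue, xi[0])
    print(f"c={c:.1f}  systematic: {m_sys:+.2f}   random: {m_rnd:+.2f}")
```

    Note that the paper's quantity of interest, synapse efficiency, is capacity per remaining synapse; a sketch like this only shows recall quality at a fixed load, not the O(log c) divergence itself.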