
    Neural Distributed Autoassociative Memories: A Survey

    Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors) where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of a sublinear time search (in the number of stored items) for approximate nearest neighbors among vectors of high dimension. The purpose of this paper is to review models of autoassociative, distributed memory that can be naturally implemented by neural networks (mainly with local learning rules and iterative dynamics based on information locally available to neurons). Scope. The survey is focused mainly on the networks of Hopfield, Willshaw and Potts, which have connections between pairs of neurons and operate on sparse binary vectors. We discuss not only autoassociative memory, but also the generalization properties of these networks. We also consider neural networks with higher-order connections and networks with a bipartite graph structure for non-binary data with linear constraints. Conclusions. In conclusion we discuss the relations to similarity search, advantages and drawbacks of these techniques, and topics for further research. An interesting and still not completely resolved question is whether neural autoassociative memories can search for approximate nearest neighbors faster than other index structures for similarity search, in particular for the case of very high-dimensional vectors. Comment: 31 pages
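    The clipped-Hebbian storage and one-shot retrieval typical of the Willshaw networks the survey covers can be sketched as follows. This is a minimal illustration only; the network size, number of patterns, sparsity, and the winner-take-all retrieval threshold are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 50, 8          # neurons, stored patterns, active bits per pattern

# Sparse binary patterns: exactly k of n neurons active in each.
patterns = np.zeros((m, n), dtype=np.uint8)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = 1

# Willshaw learning: clipped Hebbian outer products give binary weights,
# a purely local rule (each synapse sees only its two neurons).
W = np.clip(patterns.T @ patterns, 0, 1)
np.fill_diagonal(W, 0)

def retrieve(cue, k_active=k):
    """One retrieval step: every neuron sums its locally available input
    and the k_active most strongly driven neurons fire (winner-take-all)."""
    h = W @ cue
    out = np.zeros_like(cue)
    out[np.argsort(h)[-k_active:]] = 1
    return out

# Delete two active bits of a stored pattern and recover it from the cue.
target = patterns[0].copy()
cue = target.copy()
cue[np.flatnonzero(cue)[:2]] = 0
recalled = retrieve(cue)
```

    Note that the number of stored patterns (50) here is well below the Willshaw capacity for these dimensions; closer to capacity, retrieval from a partial cue degrades.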

    Statistical physics of neural systems with non-additive dendritic coupling

    How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet, single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially close-by inputs. Here, we provide a statistical physics approach to study the impact of such non-additive dendritic processing on single neuron responses and the performance of associative memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites. This approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model for associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.
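    The non-additive integration described above can be sketched as a neuron whose synapses are grouped onto dendritic branches, each branch summing linearly and then applying a supralinear nonlinearity before the somatic sum. The particular thresholded-amplification nonlinearity, branch count, and parameters below are illustrative assumptions, not the paper's model.

```python
import numpy as np

def dendritic_sum(x, w, n_branches=4, theta=1.0, gain=2.0):
    """Non-additive integration: synapses are partitioned onto dendritic
    branches; each branch sums its inputs linearly, then inputs exceeding
    the branch threshold theta are supralinearly amplified by gain before
    the somatic sum.  (Illustrative nonlinearity, not the paper's.)"""
    contrib = (w * x).reshape(n_branches, -1).sum(axis=1)  # per-branch input
    boosted = np.where(contrib > theta, gain * contrib, contrib)
    return boosted.sum()

rng = np.random.default_rng(1)
w = rng.normal(size=16)                    # synaptic weights
x = rng.integers(0, 2, size=16).astype(float)  # binary input spikes

linear = float(w @ x)            # purely additive integration, for contrast
nonadd = dendritic_sum(x, w)     # branch-wise supralinear integration
```

    With an infinite branch threshold no branch is ever amplified, so the model reduces exactly to the additive case; lowering theta makes synchronous, co-located input count supralinearly.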

    Dreaming neural networks: forgetting spurious memories and reinforcing pure ones

    The standard Hopfield model for associative neural networks accounts for biological Hebbian learning and acts as the harmonic oscillator for pattern recognition; however, its maximal storage capacity is α ∼ 0.14, far from the theoretical bound for symmetric networks, i.e. α = 1. Inspired by sleeping and dreaming mechanisms in mammal brains, we propose an extension of this model displaying the standard on-line (awake) learning mechanism (that allows the storage of external information in terms of patterns) and an off-line (sleep) unlearning & consolidating mechanism (that allows spurious-pattern removal and pure-pattern reinforcement): the resulting daily prescription is able to saturate the theoretical bound α = 1, remaining also extremely robust against thermal noise. Neural and synaptic features are analyzed both analytically and numerically. In particular, beyond obtaining a phase diagram for neural dynamics, we focus on synaptic plasticity and we give explicit prescriptions on the temporal evolution of the synaptic matrix. We analytically prove that our algorithm makes the Hebbian kernel converge with high probability to the projection matrix built over the pure stored patterns. Furthermore, we obtain a sharp and explicit estimate for the "sleep rate" in order to ensure such a convergence. Finally, we run extensive numerical simulations (mainly Monte Carlo sampling) to check the approximations underlying the analytical investigations (e.g., we developed the whole theory at the so-called replica-symmetric level, as standard in the Amit-Gutfreund-Sompolinsky reference framework) and possible finite-size effects, finding overall full agreement with the theory. Comment: 31 pages, 12 figures
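    The awake/asleep alternation can be sketched with the classic Hopfield-Feinstein-Palmer unlearning rule: build the Hebbian kernel, relax from random states and weaken whichever attractor (possibly spurious) is reached, then reinforce the pure patterns. This is only an illustration of the mechanism; the sleep rate, iteration counts, and the rule itself are assumptions, not the paper's exact prescription.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 64, 6                            # neurons, stored (pure) patterns
xi = rng.choice([-1.0, 1.0], size=(m, n))

# Awake (Hebbian) learning: the standard Hopfield kernel.
J = (xi.T @ xi) / n
np.fill_diagonal(J, 0)

def relax(s, J, steps=20):
    """Synchronous sign dynamics until a fixed point (or a step cap)."""
    for _ in range(steps):
        s_new = np.sign(J @ s)
        s_new[s_new == 0] = 1.0
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Sleep: relax from random states and unlearn the reached attractor,
# then consolidate the pure patterns.
eps = 0.01                              # illustrative "sleep rate"
for _ in range(50):
    s = relax(rng.choice([-1.0, 1.0], size=n), J)
    J -= eps * np.outer(s, s) / n       # forget the (possibly spurious) state
    np.fill_diagonal(J, 0)
for p in xi:                            # reinforce the pure patterns
    J += eps * np.outer(p, p) / n
    np.fill_diagonal(J, 0)

recalled = relax(xi[0].copy(), J)
overlap = abs(recalled @ xi[0]) / n     # 1.0 means perfect recall
```

    At this low load (α ≈ 0.09) the pure patterns remain stable attractors after sleeping; the paper's point is that a suitable unlearning-and-consolidation schedule preserves them all the way up to α = 1.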

    Associative memory on a small-world neural network

    We study a model of associative memory based on a neural network with small-world structure. The efficacy with which the network retrieves one of the stored patterns exhibits a phase transition at a finite value of the disorder. The more ordered networks are unable to recover the patterns, and are always attracted to mixture states. Moreover, for a range of the number of stored patterns, the efficacy has a maximum at an intermediate value of the disorder. We also give a statistical characterization of the attractors for all values of the disorder of the network. Comment: 5 pages, 4 figures (eps)
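    A small-world coupling structure of the kind studied here can be sketched with a Watts-Strogatz-style construction: a ring lattice whose edges are rewired with probability p (the disorder parameter), used as a mask on the Hebbian couplings. The sizes and the value of p below are illustrative assumptions.

```python
import numpy as np

def small_world_mask(n, k, p, rng):
    """Watts-Strogatz-style adjacency: a ring lattice with k neighbours on
    each side, every edge rewired with probability p (the 'disorder')."""
    A = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for d in range(1, k + 1):
            j = (i + d) % n
            if rng.random() < p:            # rewire this lattice edge
                j = int(rng.integers(n))
                while j == i or A[i, j]:    # no self-loops or duplicates
                    j = int(rng.integers(n))
            A[i, j] = A[j, i] = True
    return A

rng = np.random.default_rng(3)
n, k, m = 100, 5, 3                         # neurons, neighbours per side, patterns
A = small_world_mask(n, k, p=0.3, rng=rng)

# Hebbian couplings restricted to the small-world graph.
xi = rng.choice([-1.0, 1.0], size=(m, n))
J = (xi.T @ xi) / (2 * k) * A
```

    At p = 0 this is the ordered ring lattice (which, per the abstract, fails to retrieve and falls into mixture states); at p = 1 it approaches a random graph; the interesting regime lies at intermediate disorder.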

    Spin-Mediated Consciousness Theory: An Approach Based On Pan-Protopsychism

    As an alternative to our original dualistic approach, we present here our spin-mediated consciousness theory based on pan-protopsychism. We postulate that consciousness is intrinsically connected to quantum mechanical spin since said spin is embedded in the microscopic structure of spacetime and may be more fundamental than spacetime itself. Thus, we theorize that consciousness emerges quantum mechanically from the collective dynamics of "protopsychic" spins under the influence of spacetime dynamics. That is, spin is the "pixel" of mind. The unity of mind is achieved by quantum entanglement of the mind-pixels. Applying these ideas to the particular structures and dynamics of the brain, we postulate that the human mind works as follows: The nuclear spin ensembles ("NSE") in both neural membranes and proteins quantum mechanically process consciousness-related information such that conscious experience emerges from the collapses of entangled quantum states of NSE under the influence of the underlying spacetime dynamics. Said information is communicated to NSE through strong spin-spin couplings by biologically available unpaired electronic spins such as those carried by rapidly diffusing oxygen molecules and neural transmitter nitric oxides that extract information from their diffusing pathways in the brain. In turn, the dynamics of NSE has effects through spin chemistry on the classical neural activities such as action potentials and receptor functions, thus influencing the classical neural networks of said brain. We also present supporting evidence and make important predictions. We stress that our theory is experimentally verifiable with present technologies.