
    The Hopfield model and its role in the development of synthetic biology

    Neural network models make extensive use of concepts from physics and engineering. How do scientists justify the use of these concepts in the representation of biological systems? How is evidence for or against their use produced in the application and manipulation of the models? This article shows that neural network models are evaluated differently depending on the scientific context and its modeling practice. In the case of the Hopfield model, the different modeling practices of theoretical physics and neurobiology played a central role in how the model was received and used in the two scientific communities. In theoretical physics, where the Hopfield model has its roots, mathematical modeling is far more common and established than in neurobiology, which is strongly experiment-driven. These differences in modeling practice contributed to the development of the new field of synthetic biology, which introduced a third type of model that combines mathematical modeling with experimentation on biological systems and thereby mediates between the different modeling practices.

    High capacity associative memory with bipolar and binary, biased patterns

    The high capacity associative memory model is interesting because of its significantly higher capacity compared with the standard Hopfield model. These networks can use either bipolar or binary patterns, which may also be biased. This paper investigates the performance of a high capacity associative memory model trained with biased patterns, using either bipolar or binary representations. Our results indicate that the binary network performs worse than the bipolar network under low bias, but better in other situations.
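The bias handling discussed above can be illustrated with a minimal sketch. This is not the paper's high capacity model (which trains each unit separately); it uses the standard bias-corrected (covariance) Hebbian rule on bipolar patterns, and all names and parameter values (`bias = 0.2`, 200 units, 10 corrupted bits) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def biased_bipolar_patterns(n_patterns, n_units, bias):
    # Each unit is +1 with probability (1 + bias) / 2, else -1,
    # so the mean activity of a pattern is approximately `bias`.
    return np.where(rng.random((n_patterns, n_units)) < (1 + bias) / 2, 1, -1)

def train_covariance(patterns, bias):
    # Covariance rule: subtract the mean activity so the common bias
    # does not swamp the weights (the usual remedy for biased patterns).
    centred = patterns - bias
    W = centred.T @ centred / patterns.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=20):
    # Synchronous sign updates until (approximate) convergence.
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state

n_units, bias = 200, 0.2
patterns = biased_bipolar_patterns(5, n_units, bias)
W = train_covariance(patterns, bias)

probe = patterns[0].copy()
flip = rng.choice(n_units, size=10, replace=False)
probe[flip] *= -1                        # corrupt 10 of 200 units
restored = recall(W, probe)
print(np.mean(restored == patterns[0]))  # fraction of units recovered
```

A binary version of the same network follows from the standard mapping `s_binary = (s_bipolar + 1) / 2`; the abstract's comparison concerns how these two representations degrade differently as the bias grows.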

    Adiabatic Quantum Optimization for Associative Memory Recall

    Hopfield networks are a variant of associative memory that recall information stored in the couplings of an Ising model. Stored memories are fixed points of the network dynamics that correspond to energetic minima of the spin state. We formulate the recall of memories stored in a Hopfield network as energy minimization by adiabatic quantum optimization (AQO). Numerical simulations of the quantum dynamics allow us to quantify the AQO recall accuracy with respect to the number of stored memories and the noise in the input key. We also investigate AQO performance with respect to how memories are stored in the Ising model under different learning rules. Our results indicate that AQO performance varies strongly with the learning rule due to the resulting changes in the energy landscape. Consequently, learning rules offer indirect methods for investigating changes to the computational complexity of the recall task and the computational efficiency of AQO.
    Comment: 22 pages, 11 figures. Updated for clarity and figures, to appear in Frontiers of Physics.
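The energy landscape that AQO would search can be sketched classically. The snippet below (a sketch, not the paper's quantum simulation) builds Ising couplings with the Hebbian learning rule and checks that a stored memory sits far below typical random spin states in the Hopfield/Ising energy; the sizes (100 spins, 3 memories) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def hebb_couplings(patterns):
    # Hebbian rule: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, zero diagonal.
    J = patterns.T @ patterns / patterns.shape[1]
    np.fill_diagonal(J, 0.0)
    return J

def ising_energy(J, s):
    # Hopfield/Ising energy E(s) = -1/2 s^T J s; memories are local minima.
    return -0.5 * s @ J @ s

N = 100
memories = rng.choice([-1, 1], size=(3, N))
J = hebb_couplings(memories)

E_mem = ising_energy(J, memories[0])
E_rand = np.mean([ising_energy(J, rng.choice([-1, 1], size=N))
                  for _ in range(50)])
print(E_mem < E_rand)  # → True: the stored memory lies far below random states
```

Swapping `hebb_couplings` for a different learning rule reshapes this landscape, which is exactly the lever the abstract says governs AQO recall performance.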

    Statistical physics of neural systems with non-additive dendritic coupling

    How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet, single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially close-by inputs. Here, we provide a statistical physics approach to study the impact of such non-additive dendritic processing on single-neuron responses and the performance of associative memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites. This approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model for associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.
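The idea of supralinear dendritic summation can be sketched in a Hopfield-style network: each neuron's synapses are partitioned into branches, each branch sums its inputs, and sums above a threshold are amplified before the soma combines them. This is a toy illustration under assumed choices (4 branches, threshold `theta = 0.1`, gain `boost = 2.0`, Hebbian weights), not the paper's statistical-physics model.

```python
import numpy as np

rng = np.random.default_rng(2)

def dendritic_field(W, state, n_branches, theta=0.1, boost=2.0):
    # Partition each neuron's synapses into branches; each branch sums
    # its inputs and amplifies supra-threshold sums (supralinear summation).
    N = len(state)
    contributions = W * state                      # per-synapse input, (N, N)
    branch_sums = contributions.reshape(N, n_branches, -1).sum(axis=2)
    boosted = np.where(branch_sums > theta, boost * branch_sums, branch_sums)
    return boosted.sum(axis=1)                     # somatic field per neuron

def update(W, state, n_branches, steps=20):
    for _ in range(steps):
        state = np.where(dendritic_field(W, state, n_branches) >= 0, 1, -1)
    return state

N, B = 120, 4                                      # 120 neurons, 4 branches each
patterns = rng.choice([-1, 1], size=(3, N))
W = patterns.T @ patterns / N                      # Hebbian weights
np.fill_diagonal(W, 0.0)

probe = patterns[0].copy()
probe[rng.choice(N, 8, replace=False)] *= -1       # corrupt 8 of 120 units
print(np.mean(update(W, probe, B) == patterns[0]))
```

Varying `B` in such a toy model is one way to probe the abstract's observation that an intermediate number of branches is best: one branch recovers the linear neuron, while very many branches leave each branch with too few synapses to carry a reliable signal.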