
    A walk in the statistical mechanical formulation of neural networks

    Neural networks are nowadays both powerful operational tools (e.g., for pattern recognition, data mining, error-correcting codes) and complex theoretical models at the focus of scientific investigation. As a research field, neural networks are studied by psychologists, neurobiologists, engineers, mathematicians and theoretical physicists. In theoretical physics in particular, the key instrument for the quantitative analysis of neural networks is statistical mechanics. From this perspective, we first review attractor networks: starting from ferromagnets and spin-glass models, we discuss the underlying philosophy and retrace the path paved by Hopfield and Amit-Gutfreund-Sompolinsky. One step further, we highlight the structural equivalence between Hopfield networks (modeling retrieval) and Boltzmann machines (modeling learning), thereby establishing a deep bridge between two inseparable aspects of biological and robotic spontaneous cognition. As a sideline of this walk, we derive two alternative ways (with respect to the original Hebb proposal) to recover the Hebbian paradigm, stemming from ferromagnets and from spin-glasses, respectively. Further, as these notes are intended for an engineering audience, we also highlight the mappings between ferromagnets and operational amplifiers and between antiferromagnets and flip-flops (neural networks built from op-amps and flip-flops are particular spin-glasses, and the latter are indeed combinations of ferromagnets and antiferromagnets), hoping that this bridge serves as a concrete prescription for capturing the beauty of robotics from the statistical mechanical perspective.
    Comment: Contribution to the proceedings of the conference NCTA 2014; 12 pages, 7 figures.
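    As a concrete illustration of the retrieval machinery reviewed above, here is a minimal Python sketch (ours, not the authors'; all names are made up) of a Hopfield network with Hebbian couplings J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ, run with noiseless asynchronous dynamics at low load:

        import numpy as np

        rng = np.random.default_rng(0)
        N, P = 200, 5                          # neurons and stored patterns (low load: P/N << 0.14)
        xi = rng.choice([-1, 1], size=(P, N))  # random Boolean patterns

        # Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-interaction
        J = (xi.T @ xi) / N
        np.fill_diagonal(J, 0.0)

        def retrieve(s, sweeps=20):
            """Zero-temperature asynchronous dynamics: s_i <- sign(local field)."""
            for _ in range(sweeps):
                for i in rng.permutation(N):
                    s[i] = 1 if J[i] @ s >= 0 else -1
            return s

        # Corrupt 20% of a stored pattern; the attractor dynamics should restore it.
        s = xi[0].copy()
        flipped = rng.choice(N, size=N // 5, replace=False)
        s[flipped] *= -1
        m = retrieve(s) @ xi[0] / N  # Mattis overlap; m = 1 means perfect retrieval
        print(f"overlap with stored pattern: {m:.2f}")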

    Phase Diagram of Restricted Boltzmann Machines and Generalised Hopfield Networks with Arbitrary Priors

    Restricted Boltzmann Machines are described by the Gibbs measure of a bipartite spin glass, which in turn corresponds to that of a generalised Hopfield network. This equivalence allows us to characterise the state of these systems in terms of retrieval capabilities, both at low and high load. We study the paramagnetic-spin glass and the spin glass-retrieval phase transitions, as the pattern (i.e. weight) distribution and spin (i.e. unit) priors vary smoothly from Gaussian real variables to Boolean discrete variables. Our analysis shows that the presence of a retrieval phase is robust and not peculiar to the standard Hopfield model with Boolean patterns. The retrieval region is larger when the pattern entries and retrieval units become more peaked and, conversely, when the hidden units acquire a broader prior and therefore respond more strongly to high fields. Moreover, at low load, retrieval always exists below some critical temperature, for every pattern distribution ranging from the Boolean to the Gaussian case.
    Comment: 18 pages, 9 figures; typos added.
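    One standard way to see the stated equivalence, sketched here in the simplest case of Gaussian hidden units with unit variance (our notation; conventions may differ from the paper's), is to integrate out the hidden layer of the bipartite Hamiltonian:

        % Bipartite spin glass: N visible spins \sigma_i, P hidden units z_\mu,
        % with the weights \xi_i^\mu playing the role of Hopfield patterns.
        \[
          H(\sigma, z) = -\frac{1}{\sqrt{N}} \sum_{i=1}^{N} \sum_{\mu=1}^{P} \xi_i^{\mu} \sigma_i z_\mu
        \]
        % A Gaussian integral over the hidden units recovers a (generalised)
        % Hopfield energy, quadratic in the Mattis overlaps:
        \[
          \int \prod_{\mu=1}^{P} \frac{dz_\mu \, e^{-z_\mu^2/2}}{\sqrt{2\pi}} \,
          e^{-\beta H(\sigma, z)}
          \propto
          \exp\!\left( \frac{\beta^2}{2N} \sum_{\mu=1}^{P}
          \Big( \sum_{i=1}^{N} \xi_i^{\mu} \sigma_i \Big)^{2} \right)
        \]

    The visible marginal is thus a Hopfield Gibbs measure at an effective coupling proportional to β²; generic priors modify this Gaussian integral but, as the abstract argues, not the existence of a retrieval phase.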

    Phase transitions in Restricted Boltzmann Machines with generic priors

    We study Generalised Restricted Boltzmann Machines with generic priors for units and weights, interpolating between Boolean and Gaussian variables. We present a complete analysis of the replica-symmetric phase diagram of these systems, which can be regarded as Generalised Hopfield models. We underline the role of the retrieval phase for both inference and learning processes, and we show that retrieval is robust for a large class of weight and unit priors, beyond the standard Hopfield scenario. Furthermore, we show how the paramagnetic phase boundary is directly related to the optimal size of the training set necessary for good generalisation in a teacher-student scenario of unsupervised learning.
    Comment: 5 pages, 4 figures; extensive simulations and 2 new figures added; corrected typos; added references.
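    A schematic reading of the last point (our heavily simplified sketch; the paper's actual setting may differ) is that, in the teacher-student scenario, the M training samples themselves take the role of Hopfield patterns in the student's posterior over weights:

        % Teacher-student sketch: M samples \sigma^{(a)} drawn from a teacher
        % with planted weight \xi; the student's log-posterior over a candidate
        % weight w is itself a generalised Hopfield energy in which the data
        % act as patterns:
        \[
          -\log P\big(w \mid \{\sigma^{(a)}\}_{a=1}^{M}\big)
          \simeq -\frac{\beta_{\mathrm{eff}}}{2N} \sum_{a=1}^{M}
          \Big( \sum_{i=1}^{N} \sigma_i^{(a)} w_i \Big)^{2} + \text{const},
        \]
        % so the load \alpha = M/N locates the student in the same phase diagram:
        % crossing the paramagnetic boundary as M grows marks the minimal
        % training-set size at which inference of \xi becomes possible.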

    Psychophysical identity and free energy

    An approach to implementing variational Bayesian inference in biological systems is considered, under which the thermodynamic free energy of a system directly encodes its variational free energy. In the case of the brain, this assumption places constraints on the neuronal encoding of generative and recognition densities, in particular requiring a stochastic population code. The resulting relationship between thermodynamic and variational free energies is prefigured in mind-brain identity theses in philosophy and in the Gestalt hypothesis of psychophysical isomorphism.
    Comment: 22 pages; published as a research article on 8/5/2020 in Journal of the Royal Society Interface.
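    For reference, the variational free energy in question is the standard objective of variational Bayesian inference; the following summary is ours, not the paper's notation:

        % Variational free energy of an approximate posterior q(z) over latent
        % causes z, given observations x and a generative model p(x, z):
        \[
          F[q] = \mathbb{E}_{q(z)}\big[\log q(z) - \log p(x, z)\big]
               = D_{\mathrm{KL}}\big(q(z) \,\|\, p(z \mid x)\big) - \log p(x)
        \]
        % Thermodynamic free energy of a physical system with mean energy
        % \langle E \rangle, temperature T and entropy S:
        \[
          F_{\mathrm{thermo}} = \langle E \rangle - T S
        \]
        % The identity considered in the paper requires a neuronal encoding
        % (a stochastic population code) under which these two quantities
        % coincide rather than merely run in parallel.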