
    Free energies of Boltzmann Machines: self-averaging, annealed and replica symmetric approximations in the thermodynamic limit

    Restricted Boltzmann machines (RBMs) constitute one of the main models for statistical inference in machine learning, and they are widely employed in Artificial Intelligence as powerful tools for (deep) learning. However, in contrast with countless remarkable practical successes, their mathematical formalization has been largely elusive: from a statistical-mechanics perspective these systems display the same (random) Gibbs measure as bi-partite spin-glasses, whose rigorous treatment is notoriously difficult. In this work, beyond providing a brief review of RBMs from both the learning and the retrieval perspectives, we aim to contribute to their analytical investigation by considering two distinct realizations of their weights (i.e., Boolean and Gaussian) and studying the properties of the related free energies. More precisely, focusing on an RBM characterized by digital couplings, we first extend the Pastur-Shcherbina-Tirozzi method (originally developed for the Hopfield model) to prove the self-averaging property of the free energy, over its quenched expectation, in the infinite-volume limit; then we explicitly calculate its simplest approximation, namely its annealed bound. Next, focusing on an RBM characterized by analogical weights, we extend Guerra's interpolating scheme to obtain a control of the quenched free energy under the assumption of replica symmetry: we get self-consistencies for the order parameters (in full agreement with the existing literature) as well as the critical line for ergodicity breaking, which turns out to be the same as that obtained in AGS theory. As we discuss, this analogy stems from slow-noise universality. Finally, glancing beyond replica symmetry, we analyze the fluctuations of the overlaps for an estimate of the (slow) noise affecting the retrieval of the signal, and by a stability analysis we recover the Aizenman-Contucci identities typical of glassy systems.
    Comment: 21 pages, 1 figure
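    The bi-partite structure the abstract refers to can be made concrete with a minimal sketch: visible and hidden binary units interact only through a coupling matrix, so each layer is conditionally independent given the other and can be sampled in one block-Gibbs step. All sizes, the Gaussian weight scale, and variable names below are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    # Minimal binary RBM as a bi-partite spin system with Gibbs measure
    # P(v, h) ∝ exp(-E(v, h)).  Sizes and weight scale are illustrative.
    rng = np.random.default_rng(0)
    n_visible, n_hidden = 6, 4
    W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # Gaussian weights
    a = np.zeros(n_visible)  # visible biases
    b = np.zeros(n_hidden)   # hidden biases

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def energy(v, h):
        # E(v, h) = -(a·v + b·h + v·W·h)
        return -(a @ v + b @ h + v @ W @ h)

    def gibbs_step(v):
        # Block-Gibbs: sample hidden layer given visible, then visible given hidden.
        p_h = sigmoid(b + v @ W)
        h = (rng.random(n_hidden) < p_h).astype(float)
        p_v = sigmoid(a + W @ h)
        v_new = (rng.random(n_visible) < p_v).astype(float)
        return v_new, h

    v = rng.integers(0, 2, n_visible).astype(float)
    for _ in range(100):
        v, h = gibbs_step(v)
    ```

    The conditional independence within each layer is exactly what makes the bi-partite spin-glass analogy (and contrastive training) tractable in practice.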

    Statistical Physics and Representations in Real and Artificial Neural Networks

    This document presents the material of two lectures on statistical physics and neural representations, delivered by one of us (R.M.) at the Fundamental Problems in Statistical Physics XIV summer school in July 2017. In the first part, we consider the neural representations of space (maps) in the hippocampus. We introduce an extension of the Hopfield model, able to store multiple spatial maps as continuous, finite-dimensional attractors. The phase diagram and dynamical properties of the model are analyzed. We then show how spatial representations can be dynamically decoded using an effective Ising model capturing the correlation structure in the neural data, and compare applications to data obtained from hippocampal multi-electrode recordings and by (sub)sampling our attractor model. In the second part, we focus on the problem of learning data representations in machine learning, in particular with artificial neural networks. We start by introducing data representations through some illustrations. We then analyze two important algorithms, Principal Component Analysis and Restricted Boltzmann Machines, with tools from statistical physics.
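    The Hopfield model underlying the attractor construction above can be sketched in a few lines: patterns are stored with the Hebbian rule J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ, and retrieval drives a corrupted state back toward a stored pattern. The network size, number of patterns, and corruption level below are illustrative assumptions, not values from the lectures.

    ```python
    import numpy as np

    # Toy Hopfield retrieval with the Hebbian rule; sizes are illustrative.
    rng = np.random.default_rng(1)
    N, P = 200, 3                                # low load: P/N well below capacity
    xi = rng.choice([-1, 1], size=(P, N))        # stored binary patterns
    J = (xi.T @ xi) / N                          # Hebbian couplings
    np.fill_diagonal(J, 0.0)                     # no self-coupling

    def retrieve(s, steps=10):
        # Zero-temperature (deterministic) synchronous updates.
        for _ in range(steps):
            s = np.sign(J @ s)
            s[s == 0] = 1
        return s

    # Start from a corrupted copy of pattern 0 (flip 15% of spins).
    s = xi[0].copy()
    flip = rng.random(N) < 0.15
    s[flip] *= -1
    s = retrieve(s)
    overlap = (s @ xi[0]) / N  # order parameter m; close to 1 on successful retrieval
    ```

    The overlap with the stored pattern is the order parameter whose self-consistent equations the statistical-physics analysis computes; the continuous-attractor extension in the lectures replaces discrete patterns with spatial maps.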