6 research outputs found

    Stochastic thermodynamics of learning

    Unravelling the physical limits of information processing is an important goal of non-equilibrium statistical physics. It is motivated by the search for fundamental limits of computation, such as Landauer's bound on the minimal work required to erase one bit of information. Further inspiration comes from biology, where we would like to understand what makes single cells or the human brain so (energy-)efficient at processing information. In this thesis, we analyse the thermodynamic efficiency of learning in neural networks. We first discuss the interplay of information processing and dissipation from the perspective of stochastic thermodynamics, a powerful framework for analysing the thermodynamics of strongly fluctuating systems far from equilibrium. We then show that the dissipation of any physical system, in particular a neural network, bounds the information that the network can infer from data or learn from a teacher. Along the way, we illustrate our thermodynamic bounds with a number of examples, and we outline directions for future research.
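    For reference, the Landauer bound mentioned in the abstract is conventionally stated as a lower limit on the average work $W$ required to erase one bit of information in contact with a heat bath at temperature $T$:

    $W \geq k_{\mathrm{B}} T \ln 2$,

    where $k_{\mathrm{B}}$ is the Boltzmann constant. This is the textbook form of the bound, given here only as background to the abstract above, not as a result of the thesis itself.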

    Role of biases in neural network models
