Exploring the Temperature-Dependent Phase Transition in Modern Hopfield Networks

Abstract

The recent discovery of a connection between Transformers and Modern Hopfield Networks (MHNs) has reignited the study of neural networks from a physical, energy-based perspective. This paper focuses on the pivotal effect of the inverse temperature hyperparameter β on the distribution of energy minima of the MHN. To this end, the distribution of energy minima is tracked in a simplified MHN in which equidistant normalised patterns are stored. This network demonstrates a phase transition at a critical inverse temperature β_c, from a single global attractor towards highly pattern-specific minima as β is increased. Importantly, the dynamics are not solely governed by the hyperparameter β but are instead determined by an effective inverse temperature β_eff, which also depends on the distribution and size of the stored patterns. Recognising the role of hyperparameters in the MHN could, in the future, help researchers working with Transformers optimise their initial choices, potentially reducing the need for time- and energy-expensive hyperparameter fine-tuning.

Comment: Accepted as a poster for the Associative Memory and Hopfield Networks workshop at NeurIPS2
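
The sketch below is a minimal illustration, not the paper's code: it applies the standard MHN retrieval update ξ ← X softmax(β Xᵀξ) (Ramsauer et al.) to a handful of normalised random patterns (the paper's setup uses equidistant patterns) to show how β shifts the fixed point from a single global attractor towards a pattern-specific minimum. The dimensions, random patterns, and β values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 64, 8                            # pattern dimension, number of stored patterns
X = rng.normal(size=(d, n))
X /= np.linalg.norm(X, axis=0)          # normalise stored patterns (columns of X)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def retrieve(xi, beta, steps=50):
    """Iterate the MHN update xi <- X softmax(beta * X^T xi)."""
    for _ in range(steps):
        xi = X @ softmax(beta * (X.T @ xi))
    return xi

query = X[:, 0] + 0.1 * rng.normal(size=d)   # noisy version of stored pattern 0

for beta in (0.5, 5.0, 50.0):
    fixed_point = retrieve(query, beta)
    overlaps = X.T @ fixed_point / np.linalg.norm(fixed_point)
    print(f"beta={beta:5.1f}  max overlap={overlaps.max():.3f}  "
          f"overlap spread={overlaps.std():.3f}")

# At small beta the fixed point stays close to the mean of all stored
# patterns (one global minimum); above a critical beta it locks onto
# pattern 0, i.e. a pattern-specific minimum.
```

Because the update depends on β only through the products β Xᵀξ, rescaling the stored patterns acts like rescaling β, which is one way to read the paper's point that the behaviour is governed by an effective inverse temperature rather than by β alone.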
