Towards Information Theory-Based Discovery of Equivariances

Abstract

© 2023 H. Charvin, N. Catenacci Volpi & D. Polani.

The presence of symmetries imposes a stringent set of constraints on a system. This constrained structure allows intelligent agents interacting with such a system to drastically improve the efficiency of learning and generalization, through the internalisation of the system’s symmetries into their information-processing. In parallel, principled models of complexity-constrained learning and behaviour make increasing use of information-theoretic methods. Here, we wish to marry these two perspectives and understand whether and in which form the information-theoretic lens can “see” the effect of symmetries of a system. For this purpose, we propose a novel variant of the Information Bottleneck principle, which has served as a productive basis for many principled studies of learning and information-constrained adaptive behaviour. We show (in the discrete case) that our approach formalises a certain duality between symmetry and information parsimony: namely, channel equivariances can be characterised by the optimal mutual information-preserving joint compression of the channel’s input and output. This information-theoretic treatment furthermore suggests a principled notion of “soft” equivariance, whose “coarseness” is measured by the amount of input-output mutual information preserved by the corresponding optimal compression. This new notion offers a bridge between the field of bounded rationality and the study of symmetries in neural representations. The framework may also allow (exact and soft) equivariances to be automatically discovered.
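For orientation only, the standard (discrete) Information Bottleneck objective on which such variants build can be stated as follows; this is the textbook formulation, not the paper’s modified functional, whose exact form is not given in the abstract. The goal is to compress an input $X$ into a representation $T$ that retains information about a relevance variable $Y$:

\[
  \min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y),
\]

where $\beta \ge 0$ trades off compression against preserved relevant information, and $T$ satisfies the Markov chain $T - X - Y$. The variant described above instead concerns a joint compression of a channel’s input and output that preserves their mutual information.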
