Languages are powerful solutions to coordination problems: they provide
stable, shared expectations about how the words we say correspond to the
beliefs and intentions in our heads. Yet language use in a variable and
non-stationary social environment requires linguistic representations to be
flexible: old words acquire new ad hoc or partner-specific meanings on the fly.
In this paper, we introduce CHAI (Continual Hierarchical Adaptation through
Inference), a hierarchical Bayesian theory of coordination and convention
formation that aims to reconcile the long-standing tension between these two
basic observations. We argue that the central computational problem of
communication is not simply transmission, as in classical formulations, but
continual learning and adaptation over multiple timescales. Partner-specific
common ground quickly emerges from social inferences within dyadic
interactions, while community-wide social conventions are stable priors that
have been abstracted away from interactions with multiple partners. We present
new empirical data alongside simulations showing how our model provides a
computational foundation for several phenomena that have posed a challenge for
previous accounts: (1) the convergence to increasingly efficient referring
expressions over repeated interactions with the same partner, (2) the gradual transfer of
partner-specific common ground to strangers, and (3) the influence of
communicative context on which conventions eventually form.
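
To make the two-timescale structure of the account concrete, the following is a
minimal sketch in Python, assuming a toy Beta-Bernoulli parameterization of a
single word-referent mapping; the class, method names, and consolidation rule
are illustrative assumptions, not the paper's actual model. Partner-specific
posteriors sharpen quickly within a dyad, while the community-level prior,
which governs expectations about strangers, shifts only gradually as evidence
is pooled across partners.

    from collections import defaultdict

    class HierarchicalLexiconLearner:
        """Toy two-timescale learner: fast partner-specific posteriors over a
        word-referent mapping, plus a slowly updated community-level prior.
        (Illustrative sketch only; not the CHAI model itself.)"""

        def __init__(self, alpha=1.0, beta=1.0):
            # Community-level Beta prior pseudo-counts (the "convention").
            self.alpha = alpha
            self.beta = beta
            # Partner-specific evidence: [successes, failures] per partner.
            self.evidence = defaultdict(lambda: [0, 0])

        def observe(self, partner, success):
            # Fast timescale: accumulate evidence within one dyadic interaction.
            self.evidence[partner][0 if success else 1] += 1

        def partner_belief(self, partner):
            # Posterior mean for this partner: community prior + dyadic evidence.
            s, f = self.evidence[partner]
            return (self.alpha + s) / (self.alpha + s + self.beta + f)

        def stranger_belief(self):
            # Expectation for a brand-new partner: the community prior alone.
            return self.alpha / (self.alpha + self.beta)

        def consolidate(self, rate=0.1):
            # Slow timescale: abstract a fraction of the pooled evidence across
            # partners into the shared prior, then reset dyadic evidence.
            for s, f in self.evidence.values():
                self.alpha += rate * s
                self.beta += rate * f
            self.evidence.clear()

    learner = HierarchicalLexiconLearner()
    for _ in range(5):
        learner.observe("partner_A", success=True)
    print(learner.partner_belief("partner_A"))  # ~0.86: a partner-specific pact
    print(learner.stranger_belief())            # 0.50: strangers get the prior
    learner.consolidate()
    print(learner.stranger_belief())            # ~0.60: gradual transfer

The printed values mirror the phenomena above in miniature: rapid coordination
within a dyad, prior-driven expectations toward strangers, and gradual transfer
as dyadic evidence is abstracted into the shared prior.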