38 research outputs found

    Towards the Emergence of Non-trivial Compositionality

    All natural languages exhibit a distinction between content words (nouns, verbs, etc.) and function words (determiners, auxiliaries, tenses, etc.). Yet surprisingly little has been said about the emergence of this universal architectural feature of human language. This paper argues that the existence of this distinction requires the presence of non-trivial compositionality and identifies assumptions previously made in the literature that provably guarantee only trivial composition. It then presents a signaling game with variable contexts and shows how the distinction can emerge via reinforcement learning.
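    The learning dynamic described above can be illustrated with a minimal sketch: a Lewis signaling game trained by Roth–Erev reinforcement. This is a generic toy version, not the paper's exact model (the state/signal counts, urn weights, and update rule here are assumptions for illustration).

```python
import random

random.seed(0)
N_STATES = N_SIGNALS = N_ACTS = 3

# Sender and receiver each keep "urns" of weights; a successful round
# reinforces the signal and act that were actually used.
sender = [[1.0] * N_SIGNALS for _ in range(N_STATES)]    # state -> signal weights
receiver = [[1.0] * N_ACTS for _ in range(N_SIGNALS)]    # signal -> act weights

def draw(weights):
    """Sample an index in proportion to its weight."""
    return random.choices(range(len(weights)), weights=weights)[0]

ROUNDS = 20000
successes = 0
for _ in range(ROUNDS):
    state = random.randrange(N_STATES)
    signal = draw(sender[state])
    act = draw(receiver[signal])
    if act == state:                  # communication succeeded
        sender[state][signal] += 1.0  # reinforce both choices
        receiver[signal][act] += 1.0
        successes += 1
```

    Cumulative success typically climbs well above the 1/3 chance baseline as sender and receiver coordinate on a (possibly partial) signaling system.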

    Function Words and Context Variability

    Natural language expressions fall into two categories: content and function words. While function words are essential to compositional semantics, surprisingly little has been said about their emergence. In this paper, I will show that most extant approaches to the emergence of compositional signaling fail to account for the emergence of functional vocabulary. After providing a result that explains why this is so, I will present a model and simulation results exhibiting conditions under which such vocabulary can emerge from simple learning dynamics. This model captures the intuition that function words aid communication with a limited vocabulary in the presence of contextual variability.

    Paying Attention to Function Words

    All natural languages exhibit a distinction between content words (like nouns and adjectives) and function words (like determiners, auxiliaries, prepositions). Yet surprisingly little has been said about the emergence of this universal architectural feature of natural languages. Why have human languages evolved to exhibit this division of labor between content and function words? How could such a distinction have emerged in the first place? This paper takes steps towards answering these questions by showing how the distinction can emerge through reinforcement learning in agents playing a signaling game across contexts which contain multiple objects that possess multiple perceptually salient gradable properties. Comment: Emergent Communication Workshop @ NeurIPS 201

    Evaluating Transformer's Ability to Learn Mildly Context-Sensitive Languages

    Although Transformers perform well on NLP tasks, recent studies suggest that self-attention is theoretically limited in learning even some regular and context-free languages. These findings motivated us to think about their implications for modeling natural language, which is hypothesized to be mildly context-sensitive. We test Transformers' ability to learn a variety of mildly context-sensitive languages of varying complexities, and find that they generalize well to unseen in-distribution data, but their ability to extrapolate to longer strings is worse than that of LSTMs. Our analyses show that the learned self-attention patterns and representations modeled dependency relations and demonstrated counting behavior, which may have helped the models solve the languages.
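    The in-distribution vs. extrapolation split described above can be sketched concretely. A classic example of a (mildly) context-sensitive pattern is a^n b^n c^n; the language choice and the length cutoffs here are illustrative assumptions, not necessarily those used in the paper.

```python
def anbncn(n):
    """Generate the string a^n b^n c^n."""
    return "a" * n + "b" * n + "c" * n

def is_anbncn(s):
    """Check membership in {a^n b^n c^n : n >= 0}."""
    n = len(s) // 3
    return len(s) % 3 == 0 and s == anbncn(n)

# Train on short strings; evaluate extrapolation on strictly longer ones.
train = [anbncn(n) for n in range(1, 11)]
extrapolate = [anbncn(n) for n in range(11, 21)]
```

    A model that truly learned the counting dependency should classify the longer `extrapolate` strings correctly; the paper's finding is that Transformers do this less reliably than LSTMs.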

    Modal semantic universals optimize the simplicity/informativeness trade-off

    The meanings expressed by the world’s languages have been argued to support efficient communication. Evidence for this hypothesis has drawn on cross-linguistic analyses of vocabulary in semantic domains of both content words (e.g. kinship terms (Kemp & Regier 2012); color terms (Regier, Kay & Khetarpal 2007; Zaslavsky, Kemp, Regier & Tishby 2018)) and function words (e.g. quantifiers (Steinert-Threlkeld 2021); indefinite pronouns (Denić, Steinert-Threlkeld & Szymanik 2022)), approaching the hypothesis concretely in terms of a trade-off between simplicity and informativeness. We apply the analysis to modals (e.g. can, ought, might). Two proposed universals in this domain, from Nauze (2008) and Vander Klok (2013), are used to generate many artificial languages with varying degrees of quasi-naturalness as a proxy for natural data. A computational experiment shows that most of the optimal solutions to the trade-off problem are predicted by Vander Klok; meanwhile, as languages more robustly satisfy Nauze’s universal, they also become more optimal. This suggests that efficient communication is a leading explanation for constraints on modal semantic variation.
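    "Optimal solutions to the trade-off problem" here means Pareto-optimal languages: those not beaten on both simplicity and informativeness by any other language. A minimal sketch with made-up scores (the language names and numbers below are hypothetical, not data from the experiment):

```python
# Each language gets a (complexity cost, informativeness) score.
langs = {"L1": (2, 0.60), "L2": (3, 0.90), "L3": (5, 0.90), "L4": (4, 0.95)}

def dominated(a, b):
    """True if score b dominates score a: no worse on both dimensions,
    strictly better on at least one (lower cost, higher informativeness)."""
    (ca, ia), (cb, ib) = a, b
    return cb <= ca and ib >= ia and (cb < ca or ib > ia)

# The Pareto frontier: languages dominated by no other language.
pareto = {name for name, score in langs.items()
          if not any(dominated(score, other)
                     for other_name, other in langs.items()
                     if other_name != name)}
```

    Here L3 falls off the frontier: L2 matches its informativeness at lower cost. The paper's claim is that attested (quasi-natural) modal vocabularies cluster near this frontier.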
