14 research outputs found

    Noun-Noun Compound Comprehension As Self-Organization: The Representation And Processing Dynamics Of Noun-Noun Compounds

    By noun-noun compound, we mean any combination of two nouns that native speakers of a language can understand. Native speakers can easily generate and understand novel, transparent compounds (e.g., mountain magazine), suggesting compositional processing. Relation-based theories of compound meaning (e.g., Levi, 1978) explain this apparent productivity by positing a set of semantic/thematic relations (e.g., ABOUT) that can bind two component nouns. Inspired by relation-based theories, we propose a self-organizing network model according to which (1) the meaning of a noun-noun compound M H (e.g., mountain magazine) corresponds to a tree-like constituent structure whose daughter nodes represent constituent meanings and whose mother node represents a relation R, as in [R M H]; (2) the mother node and the daughter nodes are each represented as a vector in a similarity space (in which similar relations are placed close together) consisting of multiple units (not necessarily individually interpretable) with associated activation levels; (3) connection strengths between units are learned from linguistic experience; and (4) a particular structure (e.g., [ABOUT mountain magazine]) is realized through interactive activation between the two groups of constituent units via the relational units. The model integrates prior efforts to model processing difficulty (GagnĂ© & Shoben, 1997) and relational similarity (Devereux & Costello, 2006) into a comprehensive account of compound representation and processing dynamics. Furthermore, the model extends to opaque compounds (e.g., seahorse), which have idiosyncratic meanings, by assuming that they are represented in the same space as transparent compounds. We describe a free card sorting study suggesting that the relation space is a hierarchically organized similarity space.
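The interactive-activation mechanism described in point (4) can be sketched as follows. This is a minimal illustration under stated assumptions: the unit counts, the three-relation inventory, the random weights, and the leaky-integration update are all hypothetical placeholders, not the model's actual architecture or learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

n_constituent = 8   # units per constituent vector (illustrative size)
n_relations = 3     # e.g., ABOUT, FOR, MADE-OF (hypothetical inventory)

# Hypothetical "learned" weights linking each constituent space to relations.
W_mod = rng.normal(size=(n_relations, n_constituent))   # modifier -> relation
W_head = rng.normal(size=(n_relations, n_constituent))  # head -> relation

def settle(modifier, head, steps=50, rate=0.1):
    """Iteratively activate relation units from both constituents and feed
    activation back to the constituents until the state stabilizes."""
    relation = np.zeros(n_relations)
    for _ in range(steps):
        bottom_up = W_mod @ modifier + W_head @ head
        relation += rate * (bottom_up - relation)        # leaky integration
        # Feedback: relation activation reshapes constituent activations.
        modifier = np.tanh(modifier + rate * (W_mod.T @ relation))
        head = np.tanh(head + rate * (W_head.T @ relation))
    return relation

mountain = rng.normal(size=n_constituent)
magazine = rng.normal(size=n_constituent)
winner = int(np.argmax(settle(mountain, magazine)))  # index of dominant relation
```

The feedback step is the distinguishing property the abstract emphasizes: relation units do not merely read out the constituents but actively reshape them as the network settles.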
In Experiment 1, we demonstrate that two kinds of interpretation revision phenomena known from the sentence processing literature, garden path effects and local coherence effects, both occur in compound processing, suggesting an interactive constraint-satisfaction process. In Experiment 2, we observed positive priming between two transparent compounds that instantiate similar relations. In Experiment 3, we observed negative priming between an opaque and a transparent compound (and vice versa), regardless of relational similarity. Together, the results of Experiments 2 and 3 suggest that processing difficulty is not simply correlated with distance in the similarity space: processing dynamics on the space must be considered. We report two simulation studies that explain the experimental data and discuss why complex dynamics arise in the relation similarity space. We contrast our model with symbolic treatments of the relation between regular and exceptional morphology (Pinker, 1999) and with other similarity-space models (Devereux & Costello, 2006), arguing that the critical distinguishing property of dynamical models is the inclusion of feedback dynamics.

    Parallel parsing in a Gradient Symbolic Computation parser

    No full text
    In many cognitive domains, comprehenders construct structured, discrete representations of the environment. Because information is distributed over time, and partial information may not unambiguously identify a single representation, multiple possible structures must be maintained during incremental comprehension. How can the continuous-time, continuous-state neural cognitive system address these challenges? We propose a neural network approach, building on previous research in the Gradient Symbolic Computation framework in the domain of sentence processing. We introduce brick roles, a neurally plausible, scalable distributed representation encoding binary tree structures. The appropriate structure is computed via an optimization process implementing a probabilistic context-free grammar. In the face of structural uncertainty encountered during incremental parsing, optimization yields conjunctive blends: states in which multiple possible structures are simultaneously present (as opposed to disjunctive representations such as probabilistic mixtures). The degree of blending is controlled via a commitment parameter that drives local parsing decisions. We introduce a novel training algorithm for learning optimization parameters, and an improved policy for controlling commitment across a range of grammars. This provides a computational foundation for developing proposals that integrate continuous and discrete aspects of sentence processing.
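The contrast between a conjunctive blend and a disjunctive mixture can be illustrated with a toy tensor-product encoding. The brick roles of the paper are a specific scalable scheme; the filler and role vectors below are hypothetical stand-ins used only to show how two candidate structures can be co-present in one state.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 6

# Random vectors for fillers (categories) and roles (tree positions);
# these names and dimensions are illustrative assumptions.
fillers = {name: rng.normal(size=dim) for name in ["NP", "VP", "PP"]}
roles = {name: rng.normal(size=dim) for name in ["left", "right"]}

def bind(filler, role):
    # Tensor-product binding of a filler vector to a structural role vector.
    return np.outer(fillers[filler], roles[role])

# Two candidate partial parses compatible with the input so far.
parse_a = bind("NP", "left") + bind("VP", "right")
parse_b = bind("NP", "left") + bind("PP", "right")

def blend(commitment):
    """Conjunctive blend: both candidate structures are co-present in one
    state. commitment = 0.5 is a maximal blend; commitment -> 1 commits
    fully to parse_a."""
    return commitment * parse_a + (1 - commitment) * parse_b

state = blend(0.5)
# The binding shared by both parses (NP in the left role) is present at
# full strength, while the disputed right-role binding is split between
# VP and PP; a disjunctive mixture would instead sample one whole parse.
```

Raising the commitment parameter moves the state continuously from this superposition toward a single discrete structure, which is the sense in which the framework integrates continuous and discrete aspects of parsing.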

    PIPS: Parallelism in Planning Syntax

    No full text

    Converging evidence: Network structure effects on conventionalization of gestural referring expressions

    No full text
    New languages emerge through interactions among people, yet the role of social network structure in language emergence remains unclear, despite research from experimental semiotics, observational fieldwork, and computational modeling. To better understand the effects of social network structure on the formation of conventional referring expressions, we use a silent gesture paradigm that combines the methodological control of experimental semiotics and computational simulations with the naturalistic affordances of the human body, physical environment, and interpersonal communication. We elicited gestural referring expressions from hearing participants randomly assigned to either a richly- or sparsely-connected communicative network. Results demonstrated greater conventionalization among participants in the richly-connected condition, although this effect disappears after accounting for between-condition differences in the overall number of communicative interactions. These results provide the first experimental demonstration that communicative network structure causally impacts the conventionalization of referring expressions in human participants, using a communicative modality in which human language naturally arises.
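The network manipulation can be sketched with a toy naming-game simulation. The network sizes, pairing rule, and adopt-the-speaker's-label update below are illustrative assumptions, not the paper's elicitation procedure; the sketch shows only how conventionalization can be scored against network density.

```python
import random

random.seed(0)

def ring_edges(n):
    # Sparse network: each agent communicates only with two neighbors.
    return [(i, (i + 1) % n) for i in range(n)]

def complete_edges(n):
    # Rich network: every pair of agents can communicate.
    return [(i, j) for i in range(n) for j in range(i + 1, n)]

def simulate(edges, n, rounds=300):
    labels = list(range(n))          # each agent starts with its own variant
    for _ in range(rounds):
        a, b = random.choice(edges)
        if random.random() < 0.5:    # random speaker/hearer assignment
            a, b = b, a
        labels[b] = labels[a]        # hearer adopts the speaker's variant
    # Conventionalization score: share of agents using the dominant variant.
    return max(labels.count(v) for v in set(labels)) / n

n = 12
sparse_conv = simulate(ring_edges(n), n)
rich_conv = simulate(complete_edges(n), n)
```

Note that holding `rounds` fixed across conditions is exactly the kind of control the abstract's caveat points to: differences in the overall number of interactions, rather than network topology itself, can drive apparent conventionalization effects.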