    Modeling Substitution Errors in Spanish Morphology Learning


    A Computational Model of the Acquisition of Mental State Verbs

    Children first use verbs to refer to inner mental states, such as thoughts and beliefs, only around their fourth birthday. The emergence of this ability serves as important evidence of a child's stage of cognitive development, since it entails the capacity to conceptualize mental states. This developmental milestone is necessary for children to engage in social interaction, since appropriate interaction relies on guiding one's behavior and anticipating the actions of others given their mental states. Thus far, psycholinguistic models have largely focused on the development of the cognitive and linguistic skills required for children to express mental states. However, while computational models have proved a useful tool for studying the developmental trajectory of similar language acquisition phenomena, to our knowledge no computational model has been used to analyze the linguistic development required for the expression of mental states. In this thesis, I present a computational model for analyzing the role that linguistic development plays in learning to express mental state verbs. The model offers an integrated framework that simultaneously captures several of the cognitive and linguistic factors in the acquisition of mental state verbs. The cognitive factor simulates the difficulty of attending to mental states, while the linguistic properties represent the typical semantic and syntactic properties of mental state verb usage, e.g., reference to mental states using a sentential-complement syntactic structure. The experimental work in this thesis replicates psycholinguistic observations from children's acquisition of mental state verbs. The results of these experiments shed light on the facilitating role of certain linguistic properties of mental state verbs in learning to verbally refer to mental meanings.
Importantly, I achieve these results in the context of naturalistic language input that mimics the complexity and diversity of use of mental state verbs, and of verbs from additional semantic classes, in child-directed speech. In addition, I present a novel extension to an existing computational model of verb argument structure learning that enables the simultaneous and incremental learning of verb classes. The novel model offers a probabilistic framework for analyzing monotonically growing verb classes, in contrast to previously proposed batch models that limit the capabilities of the computational simulation. Moreover, this model supports the importance of an additional linguistic property of mental state verbs, namely their use with syntactic structures other than the well-studied co-occurrence with sentential-complement syntax in child-directed speech. Finally, I show the contribution of this novel computational component, in comparison with previous computational models, in the wider context of argument structure learning.

    Text Categorization from Category Name via Lexical Reference

    Requiring only category names as user input is a highly attractive, yet hardly explored, setting for text categorization. Earlier bootstrapping results relied on similarity in LSA space, which captures rather coarse contextual similarity. We suggest improving this scheme by identifying concrete references to the category name’s meaning, obtaining a special variant of lexical expansion.
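The setting described above can be sketched with a toy categorizer. This is not the paper's method (which acquires its lexical expansions automatically); the reference lexicon, category names, and terms below are invented purely for illustration.

```python
# Hand-crafted reference lexicon standing in for automatically acquired
# lexical references to each category name; categories and terms are
# invented for illustration.
REFERENCES = {
    "sports": {"sports", "game", "team", "score", "player"},
    "finance": {"finance", "market", "stock", "bank", "investor"},
}

def categorize(document: str) -> str:
    # Score each category by how many of its reference terms the document mentions.
    tokens = set(document.lower().split())
    scores = {cat: len(tokens & refs) for cat, refs in REFERENCES.items()}
    return max(scores, key=scores.get)

print(categorize("the team won the game with a late score"))
```

The user supplies only the category names; everything else is bootstrapped from the (here, hand-written) reference terms.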

    Gradual Acquisition of Mental State Meaning: A Computational Investigation

    The acquisition of Mental State Verbs (MSVs) has been extensively studied with respect to their common occurrence with sentential complement syntax. However, MSVs also occur in a variety of other syntactic structures. Moreover, other verb classes frequently occur with sentential complements, e.g., Communication and Perception verbs. The similarity in the distribution of the various verb classes over syntactic patterns may affect the acquisition of the meaning of MSVs by association. In this study we present a novel computational model for learning verb classes, which allows us to analyze the association of mental state verbs with their meanings over a variety of syntactic patterns. Our results point to an important role for the full syntactic preferences of MSVs, on top of their occurrences with sentential complements.

    Extracting lexical reference rules from Wikipedia

    This paper describes the extraction from Wikipedia of lexical reference rules, which identify references to term meanings triggered by other terms. We present extraction methods geared to cover the broad range of the lexical reference relation and analyze them extensively. Most extraction methods yield high precision, and our rule base is shown to outperform other automatically constructed baselines in two lexical expansion and matching tasks. Our rule base yields performance comparable to WordNet while providing largely complementary information.
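One family of extraction methods of this kind can be sketched with a single definition pattern over a Wikipedia-style first sentence. The pattern and example below are illustrative stand-ins, not the paper's actual rules.

```python
import re

def extract_rule(title: str, first_sentence: str):
    """Extract a rule (title -> referenced term) from an "is a/an/the Y" definition."""
    m = re.search(r"\bis (?:an?|the) ([a-z]+(?: [a-z]+)?)", first_sentence.lower())
    return (title, m.group(1)) if m else None

print(extract_rule("Jaguar", "The jaguar is a large cat native to the Americas."))
```

A real rule base would combine many such patterns (definitions, links, redirects) and filter them for precision.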

    Learning Verb Classes in an Incremental Model

    The ability of children to generalize over the linguistic input they receive is key to acquiring productive knowledge of verbs. Such generalizations help children extend their learned knowledge of constructions to a novel verb, and use it appropriately in syntactic patterns previously unobserved for that verb, a key factor in language productivity. Computational models can help shed light on the gradual development of more abstract knowledge during verb acquisition. We present an incremental Bayesian model that simultaneously and incrementally learns argument structure constructions and verb classes given naturalistic language input. We show how the distributional properties of the input language influence the formation of generalizations over the constructions and classes.
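The incremental class formation this abstract describes can be illustrated with a much simpler sketch: process one (verb, syntactic frame) usage at a time and assign the verb's accumulated frame profile to the most similar existing class, opening a new class when nothing is similar enough. The similarity measure, threshold, and frame labels below are illustrative choices, not the paper's Bayesian model.

```python
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two frame-count vectors.
    dot = sum(v * b[k] for k, v in a.items())
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def learn_classes(usages, threshold=0.5):
    """Process (verb, frame) pairs one at a time; grow verb classes incrementally."""
    profiles = {}   # verb -> Counter of syntactic frames seen so far
    classes = []    # each class: {"profile": Counter, "verbs": set}
    for verb, frame in usages:
        profile = profiles.setdefault(verb, Counter())
        profile[frame] += 1
        # Assign the usage to the most similar existing class, or open a new one.
        best = max(classes, key=lambda c: cosine(profile, c["profile"]), default=None)
        if best is None or cosine(profile, best["profile"]) < threshold:
            best = {"profile": Counter(), "verbs": set()}
            classes.append(best)
        best["profile"][frame] += 1
        best["verbs"].add(verb)
    return classes

classes = learn_classes([("think", "SC"), ("know", "SC"),
                         ("hit", "NP"), ("push", "NP"), ("think", "SC")])
print([sorted(c["verbs"]) for c in classes])
```

With this toy input, verbs that favor sentential complements ("SC") end up in one class and transitive ("NP") verbs in another, mirroring how distributional properties drive class formation.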

    Modeling the Emergence of an Exemplar Verb in Construction Learning

    Using a computational model of verb argument structure learning, we study a key assumption of usage-based theory: that the acquisition of a construction relies heavily on the existence of a high-frequency exemplar verb that accounts for a large proportion of usages of that construction in the input. Importantly, unlike psycholinguistic experiments that focus on the learning of an artificial novel construction using novel verbs, here we examine the acquisition of the English sentential complement construction from naturalistic input. Our results provide new insights into exemplar-based learning in the context of naturalistic input with multiple semantic classes and a diverse set of constructions for the verbs.
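The exemplar assumption under study is easy to state quantitatively: what fraction of a construction's usages does its single most frequent verb account for? The usage counts below are invented for illustration, not drawn from the paper's corpus.

```python
from collections import Counter

# Invented usage counts for the sentential-complement construction,
# purely to illustrate measuring an exemplar verb's share of the input.
sc_usages = Counter({"think": 620, "say": 210, "know": 90, "want": 40, "see": 40})
exemplar, count = sc_usages.most_common(1)[0]
share = count / sum(sc_usages.values())
print(exemplar, round(share, 2))
```

Under the usage-based assumption, a high value of `share` for one verb is what bootstraps learning of the construction.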

    Acquisition of Desires before Beliefs: A Computational Investigation

    The acquisition of Belief verbs lags behind the acquisition of Desire verbs in children. Some psycholinguistic theories attribute this lag to conceptual differences between the two classes, while others suggest that syntactic differences are responsible. Through computational experiments, we show that a probabilistic verb learning model exhibits the pattern of acquisition, even though there is no difference in the model in the difficulty of the semantic or syntactic properties of Belief vs. Desire verbs. Our results point to the distributional properties of various verb classes as a potentially important, and heretofore unexplored, factor in the observed developmental lag of Belief verbs.
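A purely distributional account of this lag can be illustrated with a toy calculation: if Desire and Belief verbs are equally hard to learn per exposure, but Desire verbs are more frequent in child-directed speech, a simple exposure-count learner still acquires Desire verbs first. The frequencies and learning rule below are hypothetical, not the paper's model or data.

```python
def exposures_to_learn(freq_per_1000: float, needed: int = 20) -> int:
    """Tokens of input heard before `needed` exposures to the verb class accumulate."""
    return int(needed * 1000 / freq_per_1000)

# Hypothetical input frequencies (tokens per 1000 words of child-directed speech).
desire_freq, belief_freq = 8.0, 3.0
print("Desire verbs reach threshold after", exposures_to_learn(desire_freq), "tokens")
print("Belief verbs reach threshold after", exposures_to_learn(belief_freq), "tokens")
```

Even with identical per-exposure difficulty, the less frequent class crosses the learning threshold later, which is the kind of distributional effect the abstract points to.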