    Linking language and emotion: how emotion is understood in language comprehension, production and prediction using psycholinguistic methods

    Emotions are an integral part of why and how we use language in everyday life. We communicate our concerns, express our woes, and share our joy through non-verbal and verbal language. Yet there is a limited understanding of when and how emotional language is processed differently from neutral language, or of how emotional information facilitates or inhibits language processing. Indeed, various efforts have been made over the last decade to bring emotion back into the discipline of psycholinguistics. This can be seen in many interdisciplinary models focusing on the role played by emotion in each aspect of linguistic experience. In this thesis, I answer this call and pursue questions that remain unanswered in psycholinguistics regarding its interaction with emotion. My general approach to bringing emotion into psycholinguistic research is straightforward: where applicable and relevant, I use well-established tasks or paradigms to investigate the effects of emotional content on language processing. I focus on three main areas of language processing: comprehension, production and prediction. The first experimental chapter includes a series of experiments utilising the Modality Switching Paradigm to investigate whether sentences describing emotional states are processed differently from sentences describing cognitive states. No switching effects were found consistently across my three experiments. My results suggest that these distinct classes of interoceptive concepts, such as ‘thinking’ or ‘being happy’, are not processed differently from each other, indicating that people do not switch attention between different interoceptive systems when comprehending emotional or cognitive sentences. I discuss the implications for grounded cognition theory in the embodiment literature.
    In my second experimental chapter, I used the Cumulative Semantic Interference Paradigm to investigate two questions: (1) whether emotion concepts interfere with one another when repeatedly retrieved (emotion label objects), and (2) whether similar interference occurs for concrete objects that share similar valence associations (emotion-laden objects). Such interference would indicate that people use information such as valence and arousal to group objects in semantic memory. I found that interference occurs when people retrieve direct emotion labels repeatedly (e.g., “happy” and “sad”) but not when they retrieve the names of concrete objects that have similar emotional connotations (e.g., “puppy” and “rainbow”). I discuss these findings in terms of the different types of information that support the representation of abstract vs. concrete concepts. In my final experimental chapter, I used the Visual World Paradigm to investigate whether the emotional state of an agent is used to inform predictions during sentence processing. I found that people do use the description of an agent's emotional state (e.g., “The boy is happy”) to predict the cause of that affective state during sentence processing (e.g., “because he was given an ice-cream”). A key result here is that people were more likely to fixate on emotionally congruent objects (e.g., ice-cream) than on incongruent objects (e.g., broccoli). This suggests that people rapidly and automatically generate predictions about upcoming sentence content based on the emotional state of the agent. I discuss these findings as a novel contribution to the Visual World literature. Overall, I conducted a diverse set of experiments using a range of established psycholinguistic methods to investigate the roles of emotional information in language processing. I found clear results in the eye-tracking study but inconsistent effects in both the switching and interference studies.
    I interpret these mixed findings as follows: emotional content does not always affect language processing, and effects are most likely in tasks that explicitly require participants to simulate emotional states in some way. Regardless, not only was I successful in finding some novel results by extending previous tasks, but I was also able to show that this avenue can be explored further to advance the field of affective psycholinguistics.

    Embodied Conceptual Combination

    Conceptual combination research investigates the processes involved in creating new meaning from old referents. It is therefore essential that embodied theories of cognition be able to explain this constructive ability and predict the resultant behavior. However, by failing to take an embodied or grounded view of the conceptual system, existing theories of conceptual combination cannot account for the role of perceptual, motor, and affective information in conceptual combination. In the present paper, we propose the embodied conceptual combination (ECCo) model to address this oversight. In ECCo, conceptual combination is the result of the interaction of the linguistic and simulation systems, such that linguistic distributional information guides or facilitates the combination process, but the new concept is fundamentally a situated, simulated entity. So, for example, a cactus beetle is represented as a multimodal simulation that includes visual (e.g., the shiny appearance of a beetle) and haptic (e.g., the prickliness of the cactus) information, all situated in the broader location of a desert environment under a hot sun, and with (at least for some people) an element of creepy-crawly revulsion. The ECCo theory differentiates interpretations according to whether the constituent concepts are destructively or non-destructively combined in the situated simulation. We compare ECCo to other theories of conceptual combination, and discuss how it accounts for classic effects in the literature.