    Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa

    To assess how audible speech and facial expressions influence one another in the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-year-olds), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar categorization task, but with faces acting as visual primes and emotion words acting as auditory targets. The results of Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words versus positive words, and that 6-year-old children were particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Meanwhile, the results of Experiment 2 did not show any congruency effects for priming by facial expressions. Thus, audible emotion words seem to exert an influence on the emotional categorization of faces, while faces do not seem to influence the categorization of emotion words in a significant way.

    Constructing Emotion Categorization: Insights From Developmental Psychology Applied to a Young Adult Sample

    Previous research has found that the categorization of emotional facial expressions is influenced by a variety of factors, such as processing time, facial mimicry, emotion labels, and perceptual cues. However, past research has frequently confounded these factors, making it impossible to ascertain how adults use this varied information to categorize emotions. The current study is the first to explore the magnitude of impact of each of these factors on emotion categorization within the same paradigm. Participants (N = 102) categorized anger and disgust facial expressions in a novel computerized task, modeled on similar tasks in the developmental literature with preverbal infants. Experimental conditions manipulated (a) whether the task was time-restricted, and (b) whether the labels "anger" and "disgust" were used in the instructions. Participants were significantly more accurate when provided with unlimited response time and emotion labels. Participants who were given restricted sorting time (2 s) and no emotion labels tended to focus on perceptual features of the faces when categorizing the emotions, which led to low sorting accuracy. In addition, facial mimicry was related to greater sorting accuracy. These results suggest that when high-level (labeling) categorization strategies are unavailable, adults use low-level (perceptual) strategies to categorize facial expressions. Methodological implications for the study of emotion are discussed.

    CATEGORIZATION OF EMOTION VERBS IN BAHASA INDONESIA

    This article discusses the categorization of emotion verbs in bahasa Indonesia. Semantic primes from the theory of Natural Semantic Metalanguage were used as the analytical tools. The research data were collected through observation and interview methods, analyzed using identity and distribution methods, and the results of the analysis were presented both formally and informally. The results showed that emotion verbs can be divided into stative emotion verbs (SEV), characterized by [-controlled, -volition], and active emotion verbs (AEV), characterized by [+controlled, +volition]. The SEV scenario is ‘X felt something, NOT BECAUSE X WANTED IT’ and the AEV scenario is ‘X felt something BECAUSE X WAS SAYING TO HIM/HERSELF THINGS WHICH COULD CAUSE ONE TO FEEL IT’. Furthermore, SEV is divided into the subcategories (1) ‘something bad happened’ (SEDIH, SAD), (2) ‘something bad can/will happen’ (TAKUT, FEAR), (3) ‘people can think something bad about me’ (MALU, SHAME), and (4) ‘I do not think that things like this can/will happen’ (HERAN, AMAZED). AEV is divided into the subcategories (1) ‘something good happened’ (SENANG, HAPPY), (2) ‘I think about something’ (SANGSI, DOUBT), (3) ‘I did something bad’ (MENYESAL, REMORSE), (4) ‘I think about someone else’ (CINTA, LOVE), and (5) ‘I do not want things like this to happen’ (MARAH, ANGRY).

    Thermal facial reactivity patterns predict social categorization bias triggered by unconscious and conscious emotional stimuli

    Members of highly social species decode, interpret, and react to the emotion of a conspecific depending on whether the other belongs to the same (ingroup) or a different (outgroup) social group. While studies indicate that consciously perceived emotional stimuli drive social categorization, information about how implicit emotional stimuli and specific physiological signatures affect social categorization is lacking. We addressed this issue by exploring whether subliminal and supraliminal affective priming can influence the categorization of neutral faces as ingroup versus outgroup. Functional infrared thermal imaging was used to investigate whether the effect of affective priming on the categorization decision was moderated by the activation of the sympathetic nervous system (SNS). In the subliminal condition, we found that stronger SNS activation after positive or negative affective primes induced ingroup and outgroup face categorization, respectively. The exact opposite pattern (i.e., outgroup after positive and ingroup after negative primes) was observed in the supraliminal condition. We also found that misattribution effects were stronger in people with low emotional awareness, suggesting that this trait moderates how one recognizes SNS signals and employs them for unrelated decisions. Our results support the remarkable implication that low-level affective reactions coupled with sympathetic activation may bias social categorization.

    Space-by-time manifold representation of dynamic facial expressions for emotion categorization

    Visual categorization is the brain computation that reduces high-dimensional information in the visual environment into a smaller set of meaningful categories. An important problem in visual neuroscience is to identify the visual information that the brain must represent and then use to categorize visual inputs. Here we introduce a new mathematical formalism—termed space-by-time manifold decomposition—that describes this information as a low-dimensional manifold separable in space and time. We use this decomposition to characterize the representations used by observers to categorize the six classic facial expressions of emotion (happy, surprise, fear, disgust, anger, and sad). By means of a Generative Face Grammar, we presented random dynamic facial movements on each experimental trial and used subjective human perception to identify the facial movements that correlate with each emotion category. When the random movements projected onto the categorization manifold region corresponding to one of the emotion categories, observers categorized the stimulus accordingly; otherwise they selected “other.” Using this information, we determined both the Action Unit and temporal components whose linear combinations lead to reliable categorization of each emotion. In a validation experiment, we confirmed the psychological validity of the resulting space-by-time manifold representation. Finally, we demonstrated the importance of temporal sequencing for accurate emotion categorization and identified the temporal dynamics of Action Unit components that cause typical confusions between specific emotions (e.g., fear and surprise) as well as those resolving these confusions.
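    The abstract stops short of writing the decomposition down. As a minimal sketch (the symbols, dimensions, and factorization form below are assumptions for illustration, not taken from the paper), a separable space-by-time representation of a single trial's facial-movement data, a matrix X_k of time points by Action Unit activations, might be written as

\[
X_k \;\approx\; W^{\mathrm{tem}} \, H_k \, W^{\mathrm{spa}},
\]

    where the columns of W^tem hold temporal components shared across trials, the rows of W^spa hold spatial (Action Unit) components, and H_k is a small trial-specific coefficient matrix. Categorization then operates on the low-dimensional coefficients in H_k rather than on the full stimulus, which is what would make the representation "separable in space and time."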

    Situating emotional experience

    Psychological construction approaches to emotion suggest that emotional experience is situated and dynamic. Fear, for example, is typically studied in a physical danger context (e.g., threatening snake), but in the real world, it often occurs in social contexts, especially those involving social evaluation (e.g., public speaking). Understanding situated emotional experience is critical because adaptive responding is guided by situational context (e.g., inferring the intention of another in a social evaluation situation vs. monitoring the environment in a physical danger situation). In an fMRI study, we assessed situated emotional experience using a newly developed paradigm in which participants vividly imagine different scenarios from a first-person perspective, in this case scenarios involving either social evaluation or physical danger. We hypothesized that distributed neural patterns would underlie immersion in social evaluation and physical danger situations, with shared activity patterns across both situations in multiple sensory modalities and in circuitry involved in integrating salient sensory information, and with unique activity patterns for each situation type in coordinated large-scale networks that reflect situated responding. More specifically, we predicted that networks underlying the social inference and mentalizing involved in responding to a social threat (in regions that make up the “default mode” network) would be reliably more active during social evaluation situations. In contrast, networks underlying the visuospatial attention and action planning involved in responding to a physical threat would be reliably more active during physical danger situations. The results supported these hypotheses. In line with emerging psychological construction approaches, the findings suggest that coordinated brain networks offer a systematic way to interpret the distributed patterns that underlie the diverse situational contexts characterizing emotional life.

    Emotion regulation and emotion work: two sides of the same coin?

    This contribution links psychological models of emotion regulation to sociological accounts of emotion work to demonstrate the extent to which emotion regulation is systematically shaped by culture and society. I first discuss a well-established two-factor process model of emotion regulation and argue that a substantial proportion of emotion regulatory goals are derived from emotion norms. In contrast to universal emotion values and hedonic preferences, emotion norms are highly specific to social situations and institutional contexts. This specificity is determined by social cognitive processes of categorization and guided by framing rules. Second, I argue that the possibilities for antecedent-focused regulation, in particular situation selection and modification, are not arbitrarily available to individuals. Instead, they depend on economic, cultural, and social resources. I suggest that the systematic and unequal distribution of these resources in society leads to discernible patterns of emotion and emotion regulation across groups of individuals.

    Effects in the affect misattribution procedure are modulated by feature-specific attention allocation

    We examined whether automatic stimulus evaluation, as measured by the Affect Misattribution Procedure (AMP), is moderated by the degree to which attention is assigned to the evaluative stimulus dimension (i.e., feature-specific attention allocation, FSAA). In two experiments, one group of participants completed a standard AMP while attending to evaluative stimulus information. A second group of participants completed the AMP while attending to non-evaluative stimulus information. In line with earlier work, larger AMP effects were observed when participants were encouraged to attend to evaluative stimulus information than when they were not. These observations support the idea that the impact of FSAA on measures of automatic stimulus evaluation results from a genuine change in the degree of automatic stimulus evaluation rather than a change in the degree to which automatic stimulus evaluation is picked up by these measures.

    Appraisal Theories for Emotion Classification in Text

    Automatic emotion categorization has been predominantly formulated as text classification, in which textual units are assigned an emotion from a predefined inventory, for instance following the fundamental emotion classes proposed by Paul Ekman (fear, joy, anger, disgust, sadness, surprise) or Robert Plutchik (adding trust and anticipation). This approach largely ignores existing psychological theories, which provide explanations regarding the perception of events. For instance, the description that somebody discovers a snake is associated with fear, based on the appraisal of the situation as unpleasant and non-controllable. This emotion reconstruction is possible even without access to explicit reports of a subjective feeling (for instance, expressing it with the words "I am afraid."). Automatic classification approaches therefore need to learn properties of events as latent variables (for instance, that the uncertainty and the mental or physical effort associated with the encounter of a snake lead to fear). With this paper, we propose to make such interpretations of events explicit, following theories of cognitive appraisal of events, and show their potential for emotion classification when encoded in classification models. Our results show that high-quality appraisal dimension assignments in event descriptions lead to an improvement in the classification of discrete emotion categories. We make our corpus of appraisal-annotated, emotion-associated event descriptions publicly available. (Accepted at COLING 2020.)
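    As a rough illustration of the pipeline the abstract describes (the appraisal dimensions, feature values, and classifier below are hypothetical assumptions, not the paper's corpus or models), appraisal assignments for event descriptions can feed a standard classifier over discrete emotion labels:

```python
# Hypothetical sketch: discrete emotion classification from appraisal
# dimensions assigned to event descriptions. Feature names, values, and
# the model choice are illustrative assumptions, not the paper's setup.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Appraisal features per event: [pleasantness, control, certainty, effort]
X_appraisals = np.array([
    [0.1, 0.1, 0.3, 0.8],  # "I discovered a snake": unpleasant, uncontrollable
    [0.9, 0.7, 0.9, 0.2],  # "I passed the exam": pleasant, certain
    [0.2, 0.6, 0.8, 0.5],  # "Someone broke my phone": unpleasant, other-caused
])
y_emotions = ["fear", "joy", "anger"]

# With high-quality appraisal assignments, discrete emotions become close
# to linearly separable, so even a linear model suffices on this toy data.
clf = LogisticRegression(max_iter=1000).fit(X_appraisals, y_emotions)
print(clf.predict(np.array([[0.15, 0.2, 0.35, 0.7]])))  # likely "fear"
```

    In the paper's actual setting, the appraisal values would themselves be predicted from text (or taken from annotation) before the emotion classifier runs; the toy example only shows why good appraisal assignments make the final classification step easier.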