
    Happy Mouth and Sad Eyes: Scanning Emotional Facial Expressions

    There is evidence that specific regions of the face, such as the eyes, are particularly relevant for decoding emotional expressions, but it has not been examined whether observers' scan paths vary for facial expressions with different emotional content. In this study, eye-tracking was used to monitor the scanning behavior of healthy participants while they looked at different facial expressions. Locations of fixations and their durations were recorded, and a dominance ratio (i.e., eyes and mouth relative to the rest of the face) was calculated. Across all emotional expressions, initial fixations were most frequently directed to either the eyes or the mouth. For sad facial expressions in particular, participants directed the initial fixation to the eyes more frequently than for all other expressions. For happy facial expressions, participants fixated the mouth region longer across all trials. For fearful and neutral facial expressions, the dominance ratio indicated that the eyes and mouth are equally important, whereas for sad and angry facial expressions the eyes received more attention than the mouth. These results confirm the relevance of the eyes and mouth for emotional decoding, but they also demonstrate that facial expressions with different emotional content are not decoded equally. Our data suggest that people look at the regions that are most characteristic of each emotion.
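    For illustration, here is a minimal sketch of how such a dominance ratio could be computed from region-labelled fixation data. The region labels, units, and the exact operationalisation are assumptions for this sketch and are not taken from the paper.

```python
# Hypothetical sketch: a feature-dominance ratio from eye-tracking fixations.
# Assumes each fixation carries the facial region it landed on ("eyes",
# "mouth", or "rest") and its duration in milliseconds; the study's exact
# definition of the ratio may differ.

from dataclasses import dataclass


@dataclass
class Fixation:
    region: str        # "eyes", "mouth", or "rest"
    duration_ms: float


def dominance_ratio(fixations: list[Fixation]) -> float:
    """Total gaze time on eyes and mouth relative to the rest of the face."""
    feature_time = sum(f.duration_ms for f in fixations if f.region in ("eyes", "mouth"))
    rest_time = sum(f.duration_ms for f in fixations if f.region == "rest")
    return feature_time / rest_time if rest_time > 0 else float("inf")


# Toy trial: mostly eyes and mouth, little time elsewhere -> ratio well above 1
trial = [Fixation("eyes", 420), Fixation("mouth", 310), Fixation("rest", 150)]
print(round(dominance_ratio(trial), 2))  # 4.87
```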

    Detecting Panic Potential in Social Media Tweets

    A high degree of real-time interconnectedness can aid information transmission, particularly in disaster situations. However, it can have substantial negative consequences when information is emotionally laden and transmits these emotions, particularly panic, to individuals across social media in an already grave situation. Prior research has shown that emotionally laden information spreads through social networks faster than information that is not. Hence, we highlight the need to understand and curtail potentially panic-causing information without preventing good-quality information from being available for effective crisis communication and management. With this research, we present the necessity of detecting the panic potential of social media messages and aim to address two research questions: What features are necessary to compute the panic potential of a social media message, and what metrics are necessary to evaluate it? Our planned analysis takes the case of the 2016 Munich shooting incident, based on user tweets posted immediately after the incident. Different features and evaluation metrics are proposed and discussed. The work aims to detect the panic potential of messages in social media networks during disasters.
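    As a purely illustrative sketch of what scoring a tweet's panic potential might look like, the toy features below (a small fear-word lexicon, exclamation density, all-caps ratio) are hypothetical placeholders and do not represent the features or metrics proposed in the paper.

```python
# Illustrative only: a toy feature vector for a tweet's "panic potential".
# The specific features and the tiny lexicon are invented for this sketch,
# not taken from the paper, which proposes its own feature set and metrics.

import re

FEAR_WORDS = {"panic", "shooting", "run", "help", "terror", "attack"}  # toy lexicon


def panic_features(tweet: str) -> dict[str, float]:
    tokens = re.findall(r"[A-Za-z']+", tweet.lower())
    words = tweet.split()
    n_tokens = max(len(tokens), 1)
    return {
        # share of tokens that appear in the fear-word lexicon
        "fear_word_ratio": sum(t in FEAR_WORDS for t in tokens) / n_tokens,
        # exclamation marks per character as a crude intensity signal
        "exclamation_density": tweet.count("!") / max(len(tweet), 1),
        # share of fully capitalised words (shouting)
        "all_caps_ratio": sum(w.isupper() and len(w) > 1 for w in words) / max(len(words), 1),
    }


print(panic_features("RUN!! Shooting at the mall, everyone PANIC, please help!"))
```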

    How are you getting by? Coping in developmental coordination disorder versus attention-deficit/hyperactivity disorder

    Objective. Developmental Coordination Disorder (DCD) and Attention-Deficit/Hyperactivity Disorder (ADHD) commonly persist into adulthood; however, little research describes how adults with DCD and/or ADHD cope with their symptoms. Therefore, the purpose of this study was to investigate coping mechanisms reported by adults with DCD, ADHD, and both conditions. We expected there would be strategies specific to each condition and a broader scope of mechanisms reported by those with co-occurring DCD+ADHD. Method. N=161 participants completed the online survey, including n=31 with DCD only, n=116 with ADHD only, and n=14 with DCD+ADHD. Results. Most participants reported adaptive strategies. Of these, behavioral adaptations were most relevant to ADHD, while environmental modifications were common in DCD. Cognitive reframing and social support were similarly relevant to those with DCD and DCD+ADHD. Coping strategy categories were most uniform for the DCD+ADHD group. Conclusions. Coping profiles highlight several noteworthy differences between DCD and ADHD that are relevant for treatment.

    Motor-Incompatibility of Facial Reactions: The influence of valence and stimulus content on voluntary facial reactions

    Emotional cues facilitate motor responses that are associated with approach or avoidance. Previous research has shown that evaluative processing of positive and negative facial expression stimuli is also linked to motor schemata of facial muscles. To further investigate the influence of different types of emotional stimuli on facial reactions, we conducted a study with pictures of emotional facial expressions (KDEF) and scenes (IAPS). Healthy participants were asked to respond to positive or negative stimuli with specific facial muscles in a valence-congruent condition (stimulus valence matches muscle-related valence) or a valence-incongruent condition (stimulus valence is contrary to muscle-related valence). Additionally, they were asked to rate the pictures in terms of valence and arousal. Muscular response latencies were recorded with electromyography. Overall, response latencies were shorter in response to facial expressions than to complex pictures of scenes. For both stimulus categories, response latencies were shorter for valence-compatible muscles than for incompatible muscles. Moreover, correlations between picture ratings and facial muscle reactions for happy facial expressions as well as positive scenes reflect a direct relationship between the perceived intensity of the subjective emotional experience and physiological responding. These results replicate and extend previous research, indicating that incompatibility effects are reliable across different stimulus types and are not limited to facial mimicry.
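    A minimal sketch of how such a valence-compatibility effect could be quantified, assuming each trial has been reduced to a condition label and an EMG onset latency in milliseconds; this is an assumption-laden illustration, not the authors' analysis pipeline.

```python
# Minimal sketch (not the authors' pipeline): the compatibility effect as the
# difference between mean incongruent and mean congruent response latencies.

from statistics import mean


def compatibility_effect(trials: list[tuple[str, float]]) -> float:
    """Mean incongruent latency minus mean congruent latency, in ms.

    A positive value indicates slower responses when stimulus valence and
    muscle-related valence conflict, i.e. a compatibility effect.
    """
    congruent = [rt for cond, rt in trials if cond == "congruent"]
    incongruent = [rt for cond, rt in trials if cond == "incongruent"]
    return mean(incongruent) - mean(congruent)


# Toy data: incongruent responses are slower on average
trials = [("congruent", 512.0), ("congruent", 498.0),
          ("incongruent", 571.0), ("incongruent", 589.0)]
print(compatibility_effect(trials))  # 75.0
```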

    How do individuals with Developmental Coordination Disorder and Attention Deficit Hyperactivity Disorder cope with their symptoms?

    Data from the paper: Valence and Arousal: A comparison of two sets of Emotional Facial Expressions

    Supplementary material to our publication: Adolph, D. & Alpers, G.W. (2010). Valence and Arousal: A comparison of two sets of Emotional Facial Expressions. Amer J Psychol, 123, 209-219. The supplementary material in MADATA contains: Table 1: Valence and arousal ratings in response to NimStim pictures (Tottenham, N., Tanaka, J. W., Leon, A. C., McCarry, T., Nurse, M., Hare, T. A., Marcus, D. J., Westerlund, A., Casey, B. J., & Nelson, C. (2009). The NimStim set of facial expressions: Judgments from untrained research participants. Psychiatry Research, 168, 242-249); means (+ SD) of valence and arousal ratings. Table 2: Valence and arousal ratings in response to KDEF pictures (Lundqvist, D., Flykt, A., & Öhman, A. (1998). Karolinska Directed Emotional Faces. Stockholm, Sweden: Department of Neurosciences, Karolinska Hospital); means (+ SD) of valence and arousal ratings.

    Simulated patients in a university practical course on clinical interviewing (Schauspielpatienten im universitären Gesprächsführungspraktikum)

    Missing out: Depressed patients avoid functional risk-taking in the Balloon Analogue Risk Task
