
    Effect of parasympathetic stimulation on brain activity during appraisal of fearful expressions

    Autonomic nervous system activity is an important component of human emotion. Mental processes influence bodily physiology, which in turn feeds back to influence thoughts and feelings. Afferent cardiovascular signals from arterial baroreceptors in the carotid sinuses are processed within the brain and contribute to this two-way communication with the body. These carotid baroreceptors can be stimulated non-invasively by externally applying focal negative pressure bilaterally to the neck. In an experiment combining functional neuroimaging (fMRI) with carotid stimulation in healthy participants, we tested the hypothesis that manipulating afferent cardiovascular signals alters the central processing of emotional information (fearful and neutral facial expressions). Carotid stimulation, compared with sham stimulation, broadly attenuated activity across cortical and brainstem regions. Modulation of emotional processing was apparent as a significant expression-by-stimulation interaction within the left amygdala, where responses during appraisal of fearful faces were selectively reduced by carotid stimulation. Moreover, activity reductions within the insula, amygdala, and hippocampus correlated with the degree of stimulation-evoked change in explicit emotional ratings of fearful faces. Across participants, individual differences in autonomic state (heart rate variability, a proxy measure of autonomic balance toward parasympathetic activity) predicted the extent to which carotid stimulation influenced neural (amygdala) responses during appraisal and subjective rating of fearful faces. Together, our results provide mechanistic insight into the visceral component of emotion by identifying the neural substrates mediating cardiovascular influences on the processing of fear signals, potentially implicating central baroreflex mechanisms as anxiolytic treatment targets.
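
    The abstract uses heart rate variability as a proxy for parasympathetic tone but does not name the index; a common time-domain choice is RMSSD, the root mean square of successive differences between inter-beat (RR) intervals. The sketch below assumes RMSSD and illustrative RR values, not the authors' actual analysis.

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals (ms).

    Higher RMSSD is conventionally read as greater parasympathetic
    (vagal) influence on heart rate.
    """
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diffs = np.diff(rr)  # beat-to-beat differences
    return np.sqrt(np.mean(diffs ** 2))

# Hypothetical RR intervals (ms) from a short resting recording
print(rmssd([812, 790, 845, 830, 801, 822]))
```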

    User experience evaluation of human representation in collaborative virtual environments

    Human embodiment/representation in virtual environments (VEs), like the human body in real life, is endowed with multimodal input/output capabilities that convey multiform messages, enabling communication, interaction, and collaboration in VEs. This paper assesses how effectively different types of virtual human (VH) artefacts enable smooth communication and interaction in VEs. With special focus on the REal and Virtual Engagement In Realistic Immersive Environments (REVERIE) multi-modal immersive system prototype, a research project funded by the European Commission Seventh Framework Programme (FP7/2007-2013), the paper evaluates the effectiveness of REVERIE VH representation on the foregoing issues, based on two specifically designed use cases and through the lens of a set of design guidelines generated by previous extensive empirical user-centred research. The impact of REVERIE VH representations on the quality of user experience (UX) is evaluated through field trials. The output of the current study proposes directions for improving human representation in collaborative virtual environments (CVEs), extrapolating from lessons learned in the evaluation of REVERIE VH representation.

    Reading faces: differential lateral gaze bias in processing canine and human facial expressions in dogs and 4-year-old children

    Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that existing between dogs and humans, there are additional challenges, since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs when presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in the laterality of eye movements in dogs towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purposes, a similar experiment was also run with 4-year-old children, and it was observed that they showed differential processing of facial expressions compared to dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions.
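
    Lateral gaze bias in studies like this is typically quantified with a laterality index contrasting left- and right-directed fixations. The abstract does not give the formula, so the (L - R)/(L + R) form and the counts below are assumptions for illustration.

```python
def laterality_index(left_fixations: int, right_fixations: int) -> float:
    """Return a laterality index in [-1, 1].

    Positive values indicate a left-gaze bias (conventionally linked to
    right-hemisphere processing); negative values a right-gaze bias.
    """
    total = left_fixations + right_fixations
    if total == 0:
        raise ValueError("no fixations recorded")
    return (left_fixations - right_fixations) / total

# e.g. a dog making 14 left and 6 right first fixations toward one face type
print(laterality_index(14, 6))  # 0.4 -> left-gaze bias
```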

    Darwin's Duchenne: Eye constriction during infant joy and distress

    Darwin proposed that smiles with eye constriction (Duchenne smiles) index strong positive emotion in infants, while cry-faces with eye constriction index strong negative emotion. Research has supported Darwin's proposal with respect to smiling, but there has been little parallel research on cry-faces (open-mouth expressions with lateral lip stretching). To investigate the possibility that eye constriction indexes the affective intensity of positive and negative emotions, we first conducted the Face-to-Face/Still-Face (FFSF) procedure at 6 months. In the FFSF, three minutes of naturalistic infant-parent play interaction (which elicits more smiles than cry-faces) are followed by two minutes in which the parent holds an unresponsive still-face (which elicits more cry-faces than smiles). Consistent with Darwin's proposal, eye constriction was associated with stronger smiling and with stronger cry-faces. In addition, the proportion of smiles with eye constriction was higher during the positive-emotion-eliciting play episode than during the still-face. In parallel, the proportion of cry-faces with eye constriction was higher during the negative-emotion-eliciting still-face than during play. These results are consonant with the hypothesis that eye constriction indexes the affective intensity of both positive and negative facial configurations. A preponderance of eye constriction during cry-faces was observed with a second elicitor of intense negative emotion, vaccination injections, at both 6 and 12 months of age. The results support the existence of a Duchenne distress expression that parallels the better-known Duchenne smile. This suggests that eye constriction, the Duchenne marker, has a systematic association with early facial expressions of intense negative and positive emotion. © 2013 Mattson et al.
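
    The episode comparison rests on simple proportions (expressions with eye constriction out of all expressions of that type per episode); a two-proportion z-test is one standard way to test such a difference. The sketch below assumes independent counts and hypothetical numbers, not the authors' stated analysis.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: smiles with eye constriction out of all smiles,
# in the play episode vs. the still-face episode.
duchenne = [42, 11]   # smiles with eye constriction
smiles = [60, 30]     # total smiles per episode

z, p = proportions_ztest(duchenne, smiles)
print(f"z = {z:.2f}, p = {p:.3f}")
```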

    “Avoiding or approaching eyes”? Introversion/extraversion affects the gaze-cueing effect

    We investigated whether the extra-/introversion personality dimension can influence the processing of others’ eye gaze direction and emotional facial expression during a target detection task. On the basis of previous evidence showing that self-reported trait anxiety can affect gaze-cueing with emotional faces, we also verified whether trait anxiety can modulate the influence of intro-/extraversion on behavioral performance. Fearful, happy, angry, or neutral faces, with either direct or averted gaze, were presented before the target appeared in spatial locations congruent or incongruent with the stimuli’s eye gaze direction. Results showed a significant influence of the intro-/extraversion dimension on the gaze-cueing effect for angry, happy, and neutral faces with averted gaze. Introverts did not show the gaze congruency effect when viewing angry expressions, but did so with happy and neutral faces; extraverts showed the opposite pattern. Importantly, the influence of intro-/extraversion on gaze-cueing was not mediated by trait anxiety. These findings demonstrate that personality differences can shape the processing of interactions between relevant social signals.
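
    The gaze-cueing (congruency) effect in this paradigm is conventionally computed as the reaction-time cost for targets at gaze-incongruent versus gaze-congruent locations. A minimal sketch, assuming that conventional definition and illustrative reaction times:

```python
import numpy as np

def gaze_cueing_effect(rt_congruent_ms, rt_incongruent_ms):
    """Gaze-cueing effect: mean RT(incongruent) - mean RT(congruent).

    Positive values indicate attention was shifted toward the gazed-at
    location; values near zero indicate no congruency effect.
    """
    return np.mean(rt_incongruent_ms) - np.mean(rt_congruent_ms)

# Hypothetical RTs (ms) for one participant viewing angry averted-gaze faces
print(gaze_cueing_effect([402, 395, 410], [431, 428, 440]))
```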

    Gender and the Communication of Emotion Via Touch

    We reanalyzed a data set consisting of a U.S. undergraduate sample (N = 212) from a previous study (Hertenstein et al. 2006a) showing that touch communicates distinct emotions between humans. In the current reanalysis, we found that anger was communicated at greater-than-chance levels only when at least one member of the communicating dyad was male. Sympathy was communicated at greater-than-chance levels only when at least one member of the dyad was female. Finally, happiness was communicated only when both members of the dyad were female. The current analysis demonstrates gender asymmetries in the accuracy of communicating distinct emotions via touch between humans.
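
    "Greater-than-chance" communication in this kind of decoding study is commonly tested with a binomial test against the guessing rate. The study's actual chance level and counts are not given in the abstract, so the 1/8 rate (eight response options) and the numbers below are assumptions for illustration.

```python
from scipy.stats import binomtest

# Hypothetical: 34 of 100 receivers correctly decoded "anger" from touch,
# against an assumed chance rate of 1/8.
result = binomtest(k=34, n=100, p=1/8, alternative="greater")
print(f"p = {result.pvalue:.4f}")
```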

    Associating Facial Expressions and Upper-Body Gestures with Learning Tasks for Enhancing Intelligent Tutoring Systems

    Learning involves a substantial amount of cognitive, social, and emotional states. Recognizing and understanding these states in the context of learning is therefore key to designing informed interventions and addressing the needs of individual students to provide personalized education. In this paper, we explore the automatic detection of learners’ nonverbal behaviors, involving hand-over-face gestures, head and eye movements, and emotions via facial expressions during learning. The proposed computer-vision-based behavior monitoring method uses a low-cost webcam and can easily be integrated with modern tutoring technologies. We investigate these behaviors in depth over time in a 40-minute classroom session involving reading and problem-solving exercises. The exercises in the session are divided into three categories: easy, medium, and difficult topics within the context of undergraduate computer science. We found a significant increase in head and eye movements as time progresses, as well as with increasing difficulty level. We demonstrated a considerable occurrence of hand-over-face gestures (on average 21.35%) during the 40-minute session, a behavior that remains unexplored in the education domain. We propose a novel deep learning approach for the automatic detection of hand-over-face gestures in images, with a classification accuracy of 86.87%. Hand-over-face gestures increase prominently as the difficulty level of the given exercise increases, and they occur more frequently during problem-solving exercises (easy 23.79%, medium 19.84%, difficult 30.46%) than during reading (easy 16.20%, medium 20.06%, difficult 20.18%).
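
    The detector is described only as "a novel deep learning approach"; a common baseline for this kind of binary image classification is transfer learning on a pretrained backbone. The PyTorch sketch below is illustrative only: the ResNet-18 backbone, hyperparameters, and training loop are assumptions, not the authors' model.

```python
import torch
import torch.nn as nn
from torchvision import models

# Binary classifier: hand-over-face vs. no occlusion.
# ResNet-18 is an assumed backbone; the paper's architecture is not
# specified in the abstract.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # two output classes

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimization step on a batch of webcam frames."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```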