
    Neural correlates of emotional valence for faces and words

    Stimuli with negative emotional valence are especially apt to influence perception and action because of their crucial role in survival, a property that may not be precisely mirrored by positive emotional stimuli of equal intensity. The aim of this study was to identify the neural circuits that differentially code for positive and negative valence during the implicit processing of facial expressions and words, two of the main means by which human beings express emotions. Thirty-six healthy subjects took part in an event-related fMRI experiment. We used an implicit emotional processing task with the visual presentation of negative, positive, and neutral faces and words as stimuli. Dynamic Causal Modeling (DCM) of the fMRI data was used to test effective brain connectivity within two anatomo-functional models, one for the processing of words and one for faces. In our models, the only areas showing a significant differential response to negative and positive valence across both face and word stimuli were early visual cortices, with faces eliciting stronger activations. For faces, DCM revealed that this effect was mediated by a facilitation of activity in the amygdala by positive faces and in the fusiform face area by negative faces; for words, the effect was mainly imputable to a facilitation of activity in the primary visual cortex by positive words. These findings support a role of early sensory cortices in discriminating the emotional valence of both faces and words, an effect that may be mediated chiefly by the subcortical/limbic visual route for faces and rely more on the direct thalamic pathway to primary visual cortex for words.

    A Role for the Motor System in Binding Abstract Emotional Meaning

    Sensorimotor areas activate to action- and object-related words, but their role in abstract meaning processing is still debated. Abstract emotion words denoting body-internal states are a critical test case because they lack referential links to objects. If actions expressing emotion are crucial for learning correspondences between word forms and emotions, emotion word–evoked activity should emerge in the motor brain systems controlling the face and arms, which typically express emotions. To test this hypothesis, we recruited 18 native speakers and used event-related functional magnetic resonance imaging to compare the brain activation evoked by abstract emotion words to that evoked by face- and arm-related action words. In addition to limbic regions, emotion words indeed activated precentral cortex, including body-part-specific areas activated somatotopically by face words or arm words. Control items, including hash mark strings and animal words, failed to activate precentral areas. We conclude that, similar to their role in action word processing, activation of frontocentral motor systems in the dorsal stream reflects the semantic binding of sign and meaning for abstract words denoting emotions and possibly other body-internal states.

    The development of cross-cultural recognition of vocal emotion during childhood and adolescence

    Humans have an innate set of emotions that are recognised universally. However, emotion recognition also depends on socio-cultural rules. Although adults recognise vocal emotions universally, they identify emotions more accurately in their native language. We examined developmental trajectories of universal vocal emotion recognition in children. Eighty native English speakers completed a vocal emotion recognition task in their native language (English) and in foreign languages (Spanish, Chinese, and Arabic), with vocalisations expressing anger, happiness, sadness, fear, and neutrality. Emotion recognition was compared across 8-to-10-year-olds, 11-to-13-year-olds, and adults. Measures of behavioural and emotional problems were also taken. Results showed that although emotion recognition was above chance for all languages, native English-speaking children were more accurate at recognising vocal emotions in their native language. There was a larger improvement during adolescence in recognising vocal emotion from the native language, and vocal anger recognition did not improve with age for the non-native languages. This is the first study to demonstrate the universality of vocal emotion recognition in children whilst supporting an “in-group advantage” for more accurate recognition in the native language. The findings highlight the role of experience in emotion recognition, have implications for child development in modern multicultural societies, and address important theoretical questions about the nature of emotions.

    A unified view of lateralized vision

    Left for the trees, right for the forest. The fact that humans have two brain halves, each with its own specialization, speaks to the imagination. The left ‘language’ brain is a commonly known example of brain specialization. Sanne Brederoo, researcher at the RUG, shows that the two brain halves are also strongly specialized for vision. During the past 50 years, many studies have investigated the specialization of the two brain halves. With her dissertation, Brederoo shows that a number of these so-called specializations are in fact myths. Does this mean that the two halves of the brain perform exactly the same tasks? Not quite, as that would be a waste of space. Brederoo convincingly shows that both halves, each with its own specializations, are involved in vision. The left half is an expert in processing detail and reading words (not unexpected, given that words consist of letters: many small details). The right half is specialized in seeing the bigger picture and viewing faces (again quite understandable, given that we usually view faces as a whole rather than looking at the nose, lips, or eyes individually). In sum, both brain halves are active during everyday vision, each with its own specialization. In addition, Brederoo reports a remarkable finding: when viewing faces, a number of left-handed people use more of both brain halves instead of just the right one. So when you’re unable to see the forest for the trees, your left brain is working too hard, and you’d better address your right brain in order to see the bigger picture again.

    An investigation into the emotion-cognition interaction and sub-clinical anxiety

    This thesis combines behavioural and electrophysiological approaches in the study of the emotion-cognition interaction and sub-clinical anxiety. The research questions addressed in this thesis concern, specifically: the impact of emotion on attention; the interplay between attention and emotion in anxiety; and the cognitive construct of affect. Chapter 1 introduces emotion research and cognitive models of anxiety, and motivates the thesis.

    Chapter 2 investigates whether affective processing is automatic; more specifically, whether the facilitated processing of threat in anxiety, evidenced by emotion-related ERP modulations, requires attentional resources. It was previously reported that emotional expression effects on ERP waveforms were completely eliminated when attention was directed away from emotional faces to other task-relevant locations (Eimer et al., 2003). However, Bishop et al. (2004) reported that threat-related stimuli can evoke amygdala activity without attentional engagement or conscious awareness in high-anxious but not low-anxious participants. Spatial attention was manipulated using a paradigm similar to those of Vuilleumier et al. (2001) and Holmes et al. (2003), to investigate the mechanism underlying the threat-related processing bias in anxiety by examining the influence of spatial attention and trait anxiety levels on established ERP modulations by emotional stimuli. Participants were instructed to match two peripheral faces or two peripheral Landolt squares. The Landolt squares task was selected because it is attentionally demanding and would likely consume most, if not all, attentional resources. The ERP data did not support the claim that affective stimuli are processed under unattended conditions in high-anxious but not low-anxious participants. Rather, they question whether a preattentive processing bias for emotional faces is specific to heightened anxiety. This is based on the finding of an enhanced LPP response for threat/happy versus neutral faces and an enhanced slow wave for threat versus neutral faces, neither modulated by the focus of attention, in both high and low anxiety groups.

    Chapter 3 investigated the delayed disengagement hypothesis proposed by Fox and colleagues (2001) as the mechanism underlying the threat-related attentional bias in anxiety. This was done by measuring N2pc and LRP latencies while participants performed an adapted version of the spatial cueing task. Stimuli consisted of a central affective image (either a face or an IAPS picture, depending on condition) flanked to the left and right by a letter/number pair. Participants had to direct their attention to the left or right of the central affective image to make an orientation judgement on the letter stimulus. It was hypothesised that if threat-related stimuli are able to prolong attentional processing, N2pc onset should be delayed relative to the neutral condition. However, N2pc latency was not modulated by the emotional valence of the central image, for either the high or the low anxiety group. Thus, this finding does not support locating the threat-related bias in the disengage component of attention.

    Chapter 4 further investigated the pattern of attentional deployment in the threat-related bias in anxiety, by measuring task-switching ability between neutral and emotional tasks using an adapted version of Johnson’s (in press) attentional control capacity for emotional representations (ACCE) task. Participants performed either an emotional judgement or a neutral judgement task on a compound stimulus that consisted of an affective image (happy versus fearful faces in the faces condition, or positive versus negative IAPS pictures in the IAPS condition) with a word located centrally across the image (real word versus pseudo-word). Participants scoring higher in trait anxiety were faster to switch from a neutral to a threatening mental set. This improved ability to switch attention to the emotional judgement task when threatening faces are presented is in accordance with a hypervigilance theory of anxiety. However, this processing bias for threat in anxiety was only apparent for emotional faces and not for affective scenes, despite the fact that pictures depicting aversive threat scenes were used (e.g., violence, mutilation). This is discussed in more detail with respect to the social significance of salient stimuli.

    Chapter 5, in a pair of experiments, sought to investigate how affect is mentally represented, specifically asking whether affect is represented on the basis of a conceptual metaphor linking direction and affect. The data suggest that the vertical position metaphor underlies our understanding of the relatively abstract concept of affect and is implicitly active, where positive equates with ‘upwards’ and negative with ‘downwards’. Metaphor-compatible directional movements were demonstrated to facilitate response latencies, such that participants were relatively faster to make upward responses to positively-evaluated words and downward responses to negatively-evaluated words than to metaphor-incompatible stimulus-response mappings. This finding suggests that the popular use of linguistic metaphors depicting a spatial representation of affect may reflect our underlying cognitive construct of the abstract concept of valence.

    Chapter 6 summarises the research in the thesis and discusses the implications of the present results, in particular in relation to cognitive models of anxiety. Areas of possible future research are provided.

    Final Report to NSF of the Standards for Facial Animation Workshop

    The human face is an important and complex communication channel. It is a very familiar and sensitive object of human perception. The facial animation field has grown greatly in the past few years, as fast computer graphics workstations have made the modeling and real-time animation of hundreds of thousands of polygons affordable and almost commonplace. Many applications have been developed, such as teleconferencing, surgery, information assistance systems, games, and entertainment. To address these different problems, different approaches to both animation control and modeling have been developed.

    Similarities and differences in the functional architecture of mother-infant communication in rhesus macaque and British mother-infant dyads

    Like humans, rhesus macaques engage in mother-infant face-to-face interactions. However, no previous studies have described the naturally occurring structure and development of mother-infant interactions in this population, or used a comparative-developmental perspective to compare them directly with those reported in humans. Here, we investigate the development of infant communication and maternal responsiveness in the two groups. We video-recorded mother-infant interactions in both groups in naturalistic settings and analysed them with the same micro-analytic coding scheme. Results show that infant social expressiveness and maternal responsiveness are similarly structured in humans and macaques. Both human and macaque mothers use specific mirroring responses to specific infant social behaviours (modified mirroring to communicative signals, enriched mirroring to affiliative gestures). However, important differences were identified in the development of infant social expressiveness and in the forms of maternal responsiveness, with vocal responses and marking behaviours being predominantly human. The results indicate a common functional architecture of mother-infant communication in humans and monkeys, and contribute to theories concerning the evolution of specific traits of human behaviour.

    An investigation into vocal expressions of emotions: the roles of valence, culture, and acoustic factors.

    This PhD is an investigation of vocal expressions of emotions, focusing mainly on non-verbal sounds such as laughter, cries, and sighs. The research examines the roles of categorical and dimensional factors, the contributions of a number of acoustic cues, and the influence of culture. A series of studies established that naive listeners can reliably identify non-verbal vocalisations of positive and negative emotions in forced-choice and rating tasks. Some evidence for underlying dimensions of arousal and valence is found, although each emotion had a discrete expression. The role of the acoustic characteristics of the sounds is investigated experimentally and analytically. This work shows that the cues used to identify different emotions vary, although pitch and pitch variation play a central role, and that the cues used to identify emotions in non-verbal vocalisations differ from the cues used when comprehending speech. An additional set of studies using stimuli consisting of emotional speech demonstrates that these sounds can also be reliably identified, and rely on similar acoustic cues. A series of studies with a pre-literate Namibian tribe shows that non-verbal vocalisations can be recognised across cultures. An fMRI study carried out to investigate the neural processing of non-verbal vocalisations of emotions is also presented. The results show activation in pre-motor regions arising from passive listening to non-verbal emotional vocalisations, suggesting neural auditory-motor interactions in the perception of these sounds. In sum, this thesis demonstrates that non-verbal vocalisations of emotions are reliably identifiable tokens of information that belong to discrete categories. These vocalisations are recognisable across vastly different cultures and thus, like facial expressions of emotions, seem to comprise human universals. Listeners rely mainly on pitch and pitch variation to identify emotions in non-verbal vocalisations, which differs from the cues used to comprehend speech. When listening to others’ emotional vocalisations, a neural system of preparatory motor activation is engaged.