14 research outputs found

    Temporal processing of emotional stimuli: The capture and release of attention by angry faces

    Neuroimaging data suggest that emotional information, especially threatening faces, automatically captures attention and receives rapid processing. While this is consistent with the majority of behavioral data, behavioral studies of the attentional blink (AB) additionally reveal that aversive emotional first target (T1) stimuli are associated with prolonged attentional engagement or “dwell” time. One explanation for this difference is that few AB studies have utilized manipulations of facial emotion as the T1. To address this, schematic faces varying in expression (neutral, angry, happy) served as the T1 in the current research. Results revealed that the blink associated with an angry T1 face was, primarily, of greater magnitude than that associated with either a neutral or happy T1 face, and also that initial recovery from this processing bias was faster following angry, compared with happy, T1 faces. The current data therefore provide important information regarding the time-course of attentional capture by angry faces: Angry faces are associated with both the rapid capture and rapid release of attention.
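The attentional-blink design described above can be sketched as a trial generator in which T2 follows an emotional T1 face at a fixed serial lag. A minimal sketch: the stream length, stimulus onset asynchrony (SOA), and T1 position range below are illustrative assumptions, not parameters taken from the study.

```python
import random

def build_ab_trial(n_items=20, lag=3, soa_ms=100, t1_emotion='angry', rng=None):
    """Build one attentional-blink RSVP trial as a list of item labels.

    T1 is a schematic face with expression `t1_emotion`; T2 is a probe
    placed `lag` serial positions later. All other items are distractors.
    Stream length, SOA, and the T1 position range are hypothetical.
    """
    rng = rng or random.Random()
    t1_pos = rng.randint(5, n_items - lag - 2)   # keep T2 inside the stream
    t2_pos = t1_pos + lag
    stream = ['distractor'] * n_items
    stream[t1_pos] = f'T1:{t1_emotion}'
    stream[t2_pos] = 'T2:probe'
    # T1-T2 onset interval in ms = serial lag x stimulus onset asynchrony
    return stream, lag * soa_ms
```

At lag 3 with a 100 ms SOA, for example, T2 follows T1 by 300 ms, inside the post-T1 window where AB deficits are typically measured.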

    When is a face a face? Schematic faces, emotion, attention and the N170

    Emotional facial expressions provide important non-verbal cues as to the imminent behavioural intentions of a second party. Hence, within emotion science the processing of faces (emotional or otherwise) has been at the forefront of research. Notably, however, such research has led to a number of debates, including the ecological validity of utilising schematic faces in emotion research and the face-selectivity of the N170. In order to investigate these issues, we explored the extent to which the N170 is modulated by schematic faces, emotional expression and/or selective attention. Eighteen participants completed a three-stimulus oddball paradigm with two scrambled faces as the target and standard stimuli (counter-balanced across participants), and schematic angry, happy and neutral faces as the oddball stimuli. Results revealed that the magnitude of the N170 associated with the target stimulus was: (i) significantly greater than that elicited by the standard stimulus, (ii) comparable with the N170 elicited by the neutral and happy schematic face stimuli, and (iii) significantly reduced compared to the N170 elicited by the angry schematic face stimulus. These findings extend the current literature by demonstrating that the N170 can be modulated by events other than those associated with structural face encoding; here, the act of labelling a stimulus a ‘target’ to attend to modulated the N170 response. Additionally, the observation that schematic faces elicited N170 responses similar to those recorded for real faces and that, akin to real faces, angry schematic faces elicited heightened N170 responses, suggests caution should be taken before disregarding schematic facial stimuli in emotion processing research per se.
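The three-stimulus oddball design above can be sketched as a trial-sequence generator with the scrambled-face target/standard roles counter-balanced across participants. A minimal sketch: the trial counts and stimulus names are illustrative assumptions, not those of the study.

```python
import random

def oddball_sequence(n_trials=300, n_targets=30, n_oddballs_each=15,
                     swap_roles=False, rng=None):
    """Generate a three-stimulus oddball trial sequence.

    Two scrambled faces serve as target and standard, with roles swapped
    across participants via `swap_roles`; angry, happy and neutral
    schematic faces are the rare oddballs. Counts are hypothetical.
    """
    rng = rng or random.Random()
    target, standard = 'scrambled_A', 'scrambled_B'
    if swap_roles:                     # counter-balance across participants
        target, standard = standard, target
    trials = [('target', target)] * n_targets
    for face in ('angry', 'happy', 'neutral'):
        trials += [('oddball', face)] * n_oddballs_each
    trials += [('standard', standard)] * (n_trials - len(trials))
    rng.shuffle(trials)                # randomise presentation order
    return trials
```

Keeping exact per-role counts (rather than sampling probabilistically) guarantees every participant sees the same rare/frequent proportions, which matters when comparing oddball-evoked N170 amplitudes across conditions.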

    Verbal labels selectively bias brain responses to high-energy foods.

    The influence of external factors on food preferences and choices is poorly understood. Knowing which food-external cues impact the sensory processing and cognitive valuation of food, and how, would strongly benefit a more integrative understanding of food intake behavior, and suggest potential means of interfering with deviant eating patterns to avoid detrimental long-run health consequences for individuals. We investigated whether written labels with positive and negative (as opposed to 'neutral') valence differentially modulate the spatio-temporal brain dynamics in response to the subsequent viewing of high- and low-energy food images. Electrical neuroimaging analyses were applied to visual evoked potentials (VEPs) from 20 normal-weight participants. VEPs and source estimations in response to high- and low-energy foods were differentially affected by the valence of preceding word labels over the ~260-300 ms post-stimulus period. These effects were only observed when high-energy foods were preceded by labels with positive valence. Neural sources in occipital as well as posterior, frontal, insular and cingulate regions were down-regulated. These findings favor cognitive-affective influences especially on the visual responses to high-energy food cues, potentially indicating decreases in cognitive control and goal-adaptive behavior. Inverse correlations between insular activity and effectiveness in food classification further indicate that this down-regulation directly impacts food-related behavior.
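Window-limited VEP effects like the ~260-300 ms modulation above are commonly quantified as the mean amplitude over the reported latency range. A minimal sketch, assuming a single-channel averaged waveform sampled at 1 kHz (the names and parameters are illustrative):

```python
import numpy as np

def mean_amplitude(erp, times, window=(0.260, 0.300)):
    """Mean amplitude of an averaged ERP/VEP waveform within a
    post-stimulus time window given in seconds, e.g. the ~260-300 ms
    period in which the label-valence effects above were observed.
    """
    mask = (times >= window[0]) & (times <= window[1])
    if not mask.any():
        raise ValueError('time window lies outside the epoch')
    return float(erp[mask].mean())
```

Condition effects are then simple differences, e.g. `mean_amplitude(vep_positive_label, times) - mean_amplitude(vep_neutral_label, times)` for the positive-vs-neutral label contrast.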

    Models of hemispheric specialization in facial emotion perception - a reevaluation

    A considerable amount of research on functional cerebral asymmetries (FCAs) for facial emotion perception has shown conflicting support for three competing models: (i) the Right Hemisphere Hypothesis, (ii) the Valence-Specific Hypothesis, and (iii) the Approach/Withdrawal model. However, the majority of studies evaluating the Right Hemisphere or the Valence-Specific Hypotheses are rather limited by the small number of emotional expressions used. In addition, it is difficult to evaluate the Approach/Withdrawal Hypothesis due to insufficient data on anger and FCAs. The aim of the present study was (a) to review visual half field (VHF) studies of hemispheric specialization in facial emotion perception and (b) to reevaluate empirical evidence with respect to all three partly conflicting hypotheses. Results from the present study revealed a left visual field (LVF)/right hemisphere advantage for the perception of angry, fearful, and sad facial expressions and a right visual field (RVF)/left hemisphere advantage for the perception of happy expressions. Thus, FCAs for the perception of specific facial emotions do not fully support the Right Hemisphere Hypothesis, the Valence-Specific Hypothesis, or the Approach/Withdrawal model. A systematic literature review, together with the results of the present study, indicates a consistent LVF/right hemisphere advantage only for a subset of negative emotions including anger, fear and sadness, suggesting instead a “negative (only) valence model”.
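The visual half-field logic reviewed above depends on brief, lateralized presentation so that a stimulus initially projects to the contralateral hemisphere. A minimal sketch of the presentation geometry, with viewing distance, pixel density, and eccentricity as illustrative assumptions:

```python
import math

def deg_to_px(deg, viewing_distance_cm=57.0, px_per_cm=38.0):
    """Convert degrees of visual angle to screen pixels.

    At ~57 cm viewing distance, 1 cm on screen subtends roughly 1 degree;
    px_per_cm depends on the monitor and is an assumption here.
    """
    size_cm = 2 * viewing_distance_cm * math.tan(math.radians(deg) / 2)
    return size_cm * px_per_cm

def vhf_x_position(field, eccentricity_deg=4.0, center_x=960):
    """Horizontal pixel position of a face in the left (LVF) or right
    (RVF) visual field at a given eccentricity from central fixation.

    Combined with brief (<~200 ms) exposure, such offsets ensure the
    face initially reaches the contralateral hemisphere.
    """
    offset = deg_to_px(eccentricity_deg)
    return center_x - offset if field == 'LVF' else center_x + offset
```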

    Electrophysiological correlates of social cognition and emotions during a visual mismatch task - a predictive processing view

    The thesis project investigates the electrophysiological correlates of emotions and social cognition in healthy participants, based on the theoretical framework of Predictive Processing. Participants were recruited for HD-EEG acquisition both at rest and during performance of a visual mismatch negativity (vMMN) task, an event-related potential elicited by the appearance of deviant stimuli among standard stimuli. In particular, faces with a neutral expression serve as the standard stimuli, while the deviants consist of faces expressing emotions (happiness or fear) and faces of the opposite gender to the standard stimuli. This allows investigation of how the vMMN is modulated by the different valence of the stimuli. The experiment lasts 35 minutes (5 minutes of resting state and 30 of the vMMN task). The aim is to investigate the role that prediction processes play in social cognition, particularly in emotion processing, examining how the neural signal responds to violations of expected patterns and which networks are recruited to update the violated predictions. The results will also be used to study social cognition in neurological patients, who often show deficits in this ability in everyday life.

    Fixation to features and neural processing of facial expressions in a gender discrimination task

    The final publication is available at Elsevier via http://dx.doi.org/10.1016/j.bandc.2015.05.007. © 2015. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/
    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and whether it can also be seen on other ERP components such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1 that likely reflected general sensitivity to face position. An early effect of emotion (∼120 ms) for happy faces was seen at occipital sites and was sustained until ∼350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect appearing at ∼150 ms until ∼300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times.

    Age Differences in the Impact of Emotional Cues on Subsequent Target Detection

    Emotional cues within the environment capture our attention and influence how we perceive our surroundings. Past research has shown that emotional cues presented before the detection of a perceptual gap can actually impair the perception of elementary visual features (e.g., the lack of detail creating a spatial gap) while simultaneously improving the perception of fast temporal features of vision (e.g., the rapid onset, offset, and re-emergence of a stimulus). This effect has been attributed to amygdalar enhancements of visual inputs conveying emotional features along magnocellular channels. The current study compared participants’ ability to detect spatial and temporal gaps in simple stimuli (a Landolt Circle) after first being exposed to a facial cue in the periphery. The study was an attempt to replicate past research using younger adult samples while also extending these findings to an older adult sample. Unlike younger adults, older adults generally display an attentional bias toward positive instead of negative emotional facial expressions. It is not clear if this positivity bias is strictly driven by cognitive control processes or if there is a change in the human visual system with age that reduces the amplification of negative emotive expressions by the amygdala. The current study used psychophysical data to determine if the rapid presentation of an emotional cue and subsequent perceptual target to older adults leads to the same benefit to temporal vision evinced by younger adults or if amygdalocortical enhancements to perception degrade with age. The current study was only able to partly replicate findings from past research. The negative facial cues that were presented in the periphery did not lead to an enhancement in temporal gap detection for the younger adult sample nor a reduction in spatial gap detection. In fact, the opposite was found. Younger adults’ spatial gap detection benefited from the negative emotional cues. 
The negative and neutral emotional cues had no effect on the older adult sample: older adults’ performance on both gap detection tasks was not impacted by the emotional cues.

    Emotion Processing by ERP Combined with Development and Plasticity


    Functional cerebral asymmetries of emotional processes in the healthy and bipolar brain

    The perception and processing of emotions are of primary importance for social interaction, conferring faculties such as inferring what another person feels. Brain organisation of emotion perception has been shown to primarily involve right hemisphere functioning. However, the brain may be functionally organised according to fundamental aspects of emotion such as valence, rather than processing emotions in general. It should be noted, however, that emotion perception is not merely a perceptual process consisting of the input of emotional information, but also involves one’s emotional response. Therefore, the functional brain organisation of emotional processing may also be influenced by emotional experience. An experimental model for testing functional cerebral asymmetries (FCAs) of valenced emotional experience is uniquely found in bipolar disorder (BD), which involves an impaired ability to regulate emotions, eventually leading to depressive or manic episodes. Previous models have only explained hemispheric asymmetries for manic and depressive mood episodes, but not for BD euthymia. The present thesis sought to investigate FCAs in emotional processing in two major ways. First, FCAs underlying facial emotion perception under normal functioning were examined in healthy controls. Secondly, functional brain organisation in emotional processing was further investigated by assessing FCAs along the bipolarity continuum, used as an experimental model for studying the processing of emotions. In contrast with previous asymmetry models, results suggested a right hemisphere involvement in emotional experience regardless of valence. Atypical FCAs were found in euthymic BD patients, reflecting inherent aspects of BD functional brain organisation that are free of symptomatic influence. 
Also, BD patients exhibited atypical connectivity in a default amygdala network particularly affecting the right hemisphere, suggesting intrinsic mechanisms associated with internal emotional states. Lastly, BD patients were associated with reduced right hemisphere specialisation in visuospatial attention, suggesting that right hemisphere dysfunction can also affect non-emotional processes. Taken together, the findings emphasize a BD continuum model relying on euthymia as a bridging state between usual mood and acute mood phases.