
    Event-related alpha suppression in response to facial motion

    While biological motion refers to both face and body movements, little is known about the visual perception of facial motion. We therefore examined alpha wave suppression, as a reduction in alpha power is thought to reflect visual activity in addition to attentional reorienting and memory processes. Nineteen neurologically healthy adults were tested on their ability to discriminate between successive facial motion captures. These animations exhibited both rigid and non-rigid facial motion, as well as speech expressions. The structural and surface appearance of these facial animations did not differ, so participants' decisions were based solely on differences in facial movements. Upright, orientation-inverted and luminance-inverted facial stimuli were compared. At occipital and parieto-occipital regions, upright facial motion evoked a transient increase in alpha power, which was then followed by a significant reduction. This finding is discussed in terms of neural efficiency, gating mechanisms and neural synchronization. Moreover, there was no difference in the amount of alpha suppression evoked by each facial stimulus at occipital regions, suggesting that early visual processing remains unaffected by these manipulations. However, upright facial motion evoked greater suppression at parieto-occipital sites, and did so at the shortest latency. Increased activity within this region may reflect greater attentional reorienting to natural facial motion but also involvement of areas associated with the visual control of body effectors. © 2014 Girges et al.
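
    The alpha suppression measure described above lends itself to a simple computation: band-limit the signal to the alpha range, estimate instantaneous power, and express post-stimulus power as a change from a pre-stimulus baseline. The sketch below illustrates this with synthetic data; the sampling rate, epoch window and trial count are placeholder assumptions rather than parameters from the study.

```python
# Minimal sketch: quantifying event-related alpha suppression as percent change
# from a pre-stimulus baseline. Assumes epoched EEG as a NumPy array; the
# sampling rate and epoch layout are placeholders, not values from the study.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250                            # sampling rate in Hz (hypothetical)
t = np.arange(-0.5, 1.5, 1 / fs)    # epoch time axis: -500 ms to 1500 ms

def alpha_suppression(epochs, fs, t, band=(8.0, 12.0), baseline=(-0.5, 0.0)):
    """epochs: (n_trials, n_samples) array for one occipital/parieto-occipital channel."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, epochs, axis=-1)            # alpha-band signal
    power = np.abs(hilbert(filtered, axis=-1)) ** 2       # instantaneous alpha power
    base_mask = (t >= baseline[0]) & (t < baseline[1])
    base = power[:, base_mask].mean(axis=-1, keepdims=True)
    # Percent change from baseline: negative values indicate alpha suppression.
    return 100.0 * (power - base) / base

# Example with synthetic data standing in for real epochs.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((19, t.size))                # e.g. 19 trials
erd = alpha_suppression(epochs, fs, t)
print(erd.shape)   # (19, n_samples): per-trial alpha power change over time
```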

    Effects of emotional study context on immediate and delayed recognition memory: Evidence from event-related potentials

    Whilst research has largely focused on the recognition of emotional items, emotion may be a more subtle part of our surroundings, conveyed by context rather than by items. Using ERPs, we investigated what effects an arousing context during encoding may have on item-context binding and on subsequent familiarity-based and recollection-based item memory. It has been suggested that arousal could facilitate item-context binding and thereby enhance the contribution of recollection to subsequent memory judgements. Alternatively, arousal could shift attention onto central features of a scene and thereby foster unitisation during encoding, which could boost the contribution of familiarity to remembering. Participants learnt neutral objects paired with ecologically highly valid emotional faces whose names later served as neutral cues during an immediate and a delayed test phase. Participants identified objects faster when they had originally been studied together with emotional context faces. Items with both neutral and emotional contexts elicited an early frontal ERP old/new difference (200-400 ms). Neither the neurophysiological correlate of familiarity nor that of recollection was specific to emotionality. For the ERP correlate of recollection, we found an interaction between stimulus type and day, suggesting that this measure decreased to a larger extent on Day 2 than on Day 1. However, we did not find direct evidence for delayed forgetting of items encoded in emotional contexts on Day 2. Emotion at encoding might make items studied in emotional contexts more readily accessible at retrieval, but we found no significant evidence that emotional context facilitated either familiarity-based or recollection-based item memory after a delay of 24 h.
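
    The early frontal ERP old/new difference reported above (200-400 ms) is typically quantified as a mean-amplitude difference between conditions in that window, tested within subjects. Below is a minimal sketch of such a comparison on synthetic data; the subject count, channel and window are illustrative assumptions only.

```python
# Minimal sketch: testing an early frontal ERP old/new difference by averaging
# amplitudes in the 200-400 ms window and comparing conditions within subjects.
# Array shapes and the analysis window are illustrative assumptions.
import numpy as np
from scipy.stats import ttest_rel

fs = 250
t = np.arange(-0.2, 1.0, 1 / fs)            # epoch time axis in seconds
win = (t >= 0.2) & (t < 0.4)                # 200-400 ms analysis window

def mean_amplitude(subject_erps):
    """subject_erps: (n_subjects, n_samples) condition-average ERPs at a frontal site."""
    return subject_erps[:, win].mean(axis=-1)

rng = np.random.default_rng(1)
old_items = rng.standard_normal((24, t.size))   # hypothetical: 24 subjects
new_items = rng.standard_normal((24, t.size))

old_amp = mean_amplitude(old_items)
new_amp = mean_amplitude(new_items)
tval, pval = ttest_rel(old_amp, new_amp)        # paired test: old vs. new
print(f"old/new difference: t = {tval:.2f}, p = {pval:.3f}")
```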

    Disgust exposure and explicit emotional appraisal enhance the LPP in response to disgusted facial expressions

    The influence of prior exposure to disgusting imagery and of the conscious appraisal of facial expressions was examined in an event-related potential (ERP) experiment. Participants were exposed to either a disgust or a control manipulation and then presented with emotional and neutral expressions. An assessment of the gender of the face was required during half of the blocks and an affective assessment of the emotion during the other half. The emotion-related early posterior negativity (EPN) and late positive potential (LPP) ERP components were examined for disgusted and neutral stimuli. Results indicated that the EPN was enhanced for disgusted over neutral expressions. Prior disgust exposure modulated the middle phase of the LPP in response to disgusted but not neutral expressions, and only when the emotion of the face was explicitly evaluated. The late LPP was enhanced, independently of stimulus type, when an emotional decision was made. These results demonstrate that exposure to disgusting imagery can affect the subsequent processing of disgusted facial expressions when the emotion is under conscious appraisal.

    Change blindness: eradication of gestalt strategies

    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task in which there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al., 2003, Vision Research, 43, 149–164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this, we changed the spatial position of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4) = 2.565, p = 0.185]. This may suggest two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) it lends further weight to the argument that objects may be stored in, and retrieved from, a pre-attentional store during this task.

    (Micro)saccade-related potentials during face recognition: A study combining EEG, eye-tracking, and deconvolution modeling

    Under natural viewing conditions, complex stimuli such as human faces are typically looked at several times in succession, implying that their recognition may unfold across multiple eye fixations. Although electrophysiological (EEG) experiments on face recognition typically prohibit eye movements, participants still execute frequent (micro)saccades on the face, each of which generates its own visuocortical response. This raises the question of whether the fixation-related potentials (FRPs) evoked by these tiny gaze shifts also contain psychologically valuable information about face processing. Here we investigated this question by co-recording EEG and eye movements in an experiment with emotional faces (happy, angry, neutral). Deconvolution modeling was used to separate the stimulus-ERPs to face onset from the FRPs generated by subsequent microsaccade-induced refixations on the face. As expected, stimulus-ERPs exhibited typical emotion effects, with a larger early posterior negativity (EPN) for happy/angry compared to neutral faces. Eye-tracking confirmed that participants made small saccades within the face in 98% of the trials. However, while each saccade produced a strong response over visual areas, this response was unaffected by the face’s emotional expression, both for the first and for subsequent (micro)saccades. This finding suggests that the face’s affective content is rapidly evaluated after stimulus onset, leading to only a short-lived sensory enhancement by arousing stimuli that does not repeat itself during immediate refixations. Methodologically, our work demonstrates how eye-tracking and deconvolution modeling can be used to extract several brain responses from each EEG trial, providing insights into neural processing at different latencies after stimulus onset.
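
    The deconvolution approach referred to above separates overlapping brain responses by modelling stimulus onsets and subsequent (micro)saccadic refixations jointly in one regression. The sketch below illustrates regression-based overlap correction with a time-expanded (FIR) design matrix and synthetic event times; it shows the general technique under stated assumptions, not the authors' exact pipeline.

```python
# Minimal sketch of regression-based deconvolution ("overlap correction"):
# stimulus onsets and subsequent (micro)saccade-related fixations are modelled
# as time-expanded (FIR) predictors in one design matrix, and least squares
# recovers a separate response waveform for each event type. Event times and
# the EEG signal are synthetic placeholders.
import numpy as np

fs = 250
n_samples = 60 * fs                      # one minute of continuous EEG (synthetic)
window = np.arange(0, int(0.8 * fs))     # model 0-800 ms after each event

def fir_design(onsets_list, n_samples, window):
    """Stack a time-expanded predictor block for each list of event onsets (in samples)."""
    blocks = []
    for onsets in onsets_list:
        X = np.zeros((n_samples, window.size))
        for onset in onsets:
            idx = onset + window
            valid = idx < n_samples
            X[idx[valid], np.arange(window.size)[valid]] = 1.0
        blocks.append(X)
    return np.hstack(blocks)

rng = np.random.default_rng(2)
stim_onsets = np.arange(fs, n_samples - fs, 2 * fs)               # a stimulus every 2 s
sacc_onsets = stim_onsets + rng.integers(int(0.2 * fs), fs, stim_onsets.size)

X = fir_design([stim_onsets, sacc_onsets], n_samples, window)
eeg = rng.standard_normal(n_samples)                              # placeholder signal
beta, *_ = np.linalg.lstsq(X, eeg, rcond=None)

stim_erp = beta[:window.size]          # deconvolved stimulus-locked response
sacc_frp = beta[window.size:]          # deconvolved fixation/saccade-related response
print(stim_erp.shape, sacc_frp.shape)
```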

    Social interactions, emotion and sleep: a systematic review and research agenda

    Sleep and emotion are closely linked; however, the effects of sleep on socio-emotional task performance have only recently been investigated. Sleep loss and insomnia have been found to affect emotional reactivity and social functioning, although results, taken together, are somewhat contradictory. Here we review this advancing literature, aiming to 1) systematically review the relevant literature on sleep and socio-emotional functioning, with reference to the extant literature on emotion and social interactions, 2) summarize the results and outline ways in which emotion, social interactions, and sleep may interact, and 3) suggest key limitations and future directions for this field. From the reviewed literature, sleep deprivation is associated with diminished emotional expressivity and impaired emotion recognition, which has particular relevance for social interactions. Sleep deprivation also increases emotional reactivity, an effect most apparent in neuroimaging studies investigating amygdala activity and its prefrontal regulation. Evidence of emotional dysregulation in insomnia and poor sleep has also been reported. In general, limitations of this literature concern how performance measures relate to self-reports, and how results relate to everyday socio-emotional functioning. We conclude by suggesting possible future directions for this field.

    The role of contexts in face processing: Behavioral and ERP studies


    Exploring the Impact of Affective Processing on Visual Perception of Large-Scale Spatial Environments

    This thesis explores the interaction between emotions and visual perception, using large-scale spatial environments as the medium of this interaction. Emotion has been documented to have an early effect on scene perception (Olofsson, Nordin, Sequeira, & Polich, 2008). Yet the most popularly used scene stimuli, such as the IAPS or GAPED stimulus sets, often depict salient objects embedded in naturalistic backgrounds, or “events” that contain rich social information such as human faces or bodies. Thus, while previous studies are instrumental to our understanding of the role that social emotion plays in visual perception, they do not isolate the effect of emotion from social effects in order to address the specific role that emotion plays in scene recognition, defined here as the recognition of large-scale spatial environments. To address this question, we examined how early emotional valence and arousal impact scene processing by conducting an event-related potential (ERP) study with a well-controlled set of scene stimuli that reduced the social factor by focusing on natural scenes that did not contain human faces or actors. The study comprised two stages. First, we collected affective ratings of 440 natural scene images selected specifically so that they would not contain human faces or bodies. Based on these ratings, we divided our scene stimuli into three distinct categories: pleasant, unpleasant, and neutral. In the second stage, we recorded ERPs from a separate group of participants as they viewed a subset of 270 scenes ranked highest in their respective categories. Scenes were presented for 200 ms and back-masked with white noise while participants performed an orthogonal fixation task. We found that emotional valence had a significant impact on scene perception, with unpleasant scenes eliciting higher P1, N1 and P2 peaks. We then studied the relative contributions of emotional effects and low-level visual features using dominance analysis, which compares the relative importance of predictors in multiple regression. We found that low-level visual features (operationalized by the GIST model; Oliva & Torralba, 2006) had complete dominance over emotional effects (both valence and arousal) for most early peaks and areas under the curve (AUC). We also found that affective ratings were significantly influenced by the GIST intensities of the scenes, such that scenes with high GIST intensities were more likely to be rated as unpleasant. We concluded that the emotional impact in our stimulus set of natural scenes was mostly due to bottom-up effects on scene perception, and that controlling for low-level visual features (particularly GIST intensity) would be an important step in confirming the affective impact on scene perception.
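
    Dominance analysis, used above to weigh low-level visual features against affective ratings, quantifies each predictor group's importance as its average increment in R² across all subsets of the remaining predictors. The sketch below shows that logic on synthetic data; the feature dimensions and group names are assumptions, not the thesis's actual variables.

```python
# Minimal sketch of dominance analysis: the importance of each predictor group
# is its average increment in R^2 across all subsets of the other predictors.
# The data are synthetic stand-ins for GIST features and valence/arousal ratings.
from itertools import combinations
import numpy as np

def r2(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

def general_dominance(groups, y):
    """groups: dict name -> (n_obs, k) predictor block. Returns mean R^2 increment per group."""
    names = list(groups)
    importance = {}
    for name in names:
        others = [n for n in names if n != name]
        increments = []
        for r in range(len(others) + 1):
            for subset in combinations(others, r):
                base_cols = [groups[n] for n in subset]
                base = np.hstack(base_cols) if base_cols else np.empty((len(y), 0))
                full = np.hstack([base, groups[name]])
                base_r2 = r2(base, y) if base.shape[1] else 0.0
                increments.append(r2(full, y) - base_r2)
        importance[name] = float(np.mean(increments))
    return importance

rng = np.random.default_rng(3)
n = 270                                        # e.g. one observation per scene
gist = rng.standard_normal((n, 4))             # low-level visual features (placeholder)
affect = rng.standard_normal((n, 2))           # valence and arousal ratings (placeholder)
y = gist @ np.array([0.6, 0.3, 0.2, 0.1]) + 0.05 * affect[:, 0] + rng.standard_normal(n)

print(general_dominance({"gist": gist, "affect": affect}, y))
```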

    Behavioural and neurophysiological signatures in the retrieval of individual memories of recent and remote real-life routine episodic events

    Autobiographical memory (AM) has been largely investigated as the ability to recollect specific events that belong to an individual's past. However, how we retrieve real-life routine episodes, and how the retrieval of these episodes changes with the passage of time, remains unclear. Here, we asked participants to use a wearable camera that automatically captured pictures to record instances of their routine life during one week, and we implemented a deep neural network-based algorithm to identify picture sequences that represented episodic events. We then asked each participant to return to the lab to retrieve AMs for single episodes cued by the selected pictures 1 week, 2 weeks and 6-14 months after encoding, while scalp electroencephalographic (EEG) activity was recorded. We found that participants were more accurate in recognizing pictured scenes depicting their own past than pictured scenes encoded in the lab, and that memory recollection of personally experienced events decreased rapidly with the passing of time. We also found that retrieval cued by real-life pictures elicited a strong, positive 'ERP old/new effect' over frontal regions and that the magnitude of this ERP effect was similar across memory tests over time. However, we observed that recognition memory induced a frontal theta power decrease, and that this effect was mostly seen when memories were tested after 1 and 2 weeks but not when tested 6-14 months after encoding. Altogether, we discuss the implications for neuroscientific accounts of episodic retrieval and the potential benefits of developing individual-based AM exploration strategies at the clinical level.
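
    One plausible way to turn a wearable-camera photo stream into candidate episodic events, as described above, is to embed each picture with a pretrained network and cut the stream wherever the similarity between consecutive embeddings drops. The sketch below uses random features as a stand-in for such embeddings; the threshold and feature size are illustrative assumptions, not the authors' algorithm.

```python
# Minimal sketch of event segmentation for a wearable-camera photo stream:
# start a new candidate event wherever the cosine similarity between
# consecutive picture embeddings drops below a threshold. The embedding is a
# placeholder for a deep network feature extractor.
import numpy as np

def segment_events(embeddings, threshold=0.8):
    """embeddings: (n_pictures, d) array of picture features, one row per photo."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / np.clip(norms, 1e-12, None)
    sim = np.sum(unit[:-1] * unit[1:], axis=1)         # cosine similarity of neighbours
    boundaries = np.flatnonzero(sim < threshold) + 1    # cut where the scene changes
    return np.split(np.arange(len(embeddings)), boundaries)

rng = np.random.default_rng(4)
features = rng.standard_normal((50, 128))               # stand-in for CNN embeddings
events = segment_events(features, threshold=0.2)
print(f"{len(events)} candidate events from 50 pictures")
```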