95 research outputs found

    Event-related potentials to task-irrelevant changes in facial expressions

    Abstract Background Numerous previous experiments have used the oddball paradigm to study change detection. This paradigm is applied here to study change detection of facial expressions in a context that demands abstraction of the emotion-related facial features from other changing facial features. Methods Event-related potentials (ERPs) were recorded in adult humans engaged in a demanding auditory task. In an oddball paradigm, repeated pictures of faces with a neutral expression ('standard', p = .9) were rarely replaced by pictures with a fearful ('fearful deviant', p = .05) or happy ('happy deviant', p = .05) expression. Importantly, facial identity changed from picture to picture. Thus, change detection required abstraction of facial expression from changes in several low-level visual features. Results ERPs to both types of deviants differed from those to standards. At occipital electrode sites, ERPs to deviants were more negative than ERPs to standards at 150–180 ms and 280–320 ms post-stimulus. A positive shift to deviants at fronto-central electrode sites in the 130–170 ms post-stimulus analysis window was also found. Point-wise comparisons between the amplitudes elicited by standards and deviants revealed that the occipital negativity emerged earlier to happy deviants than to fearful deviants (after 140 ms versus 160 ms post-stimulus, respectively). In turn, the anterior positivity emerged earlier to fearful deviants than to happy deviants (110 ms versus 120 ms post-stimulus, respectively). Conclusion ERP amplitude differences between emotional and neutral expressions indicated pre-attentive change detection of facial expressions among neutral faces. The posterior negative difference at 150–180 ms latency resembled the visual mismatch negativity (vMMN), an index of pre-attentive change detection previously studied only for changes in low-level visual features.
    The positive anterior difference in ERPs at 130–170 ms post-stimulus probably indexed pre-attentive orienting of attention towards emotionally significant changes. The results show that the human brain can abstract emotion-related features of faces while engaged in a demanding task in another sensory modality.
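    As a rough illustration (not part of the study), the trial structure described in this abstract could be sketched in Python; the pool size of ten identities and all names here are hypothetical, only the probabilities and the identity-change constraint come from the text:

```python
import random

def make_oddball_sequence(n_trials=1000, seed=0):
    """Sketch of the oddball design described above: neutral 'standard'
    faces (p = .9) interleaved with rare fearful/happy 'deviants'
    (p = .05 each), with facial identity changing on every trial."""
    rng = random.Random(seed)
    identities = [f"face_{i:02d}" for i in range(10)]  # hypothetical identity pool
    expressions = ["standard", "fearful_deviant", "happy_deviant"]
    weights = [0.90, 0.05, 0.05]
    sequence = []
    prev_identity = None
    for _ in range(n_trials):
        # The identity must differ from the previous picture, so that only
        # the (occasional) expression change is the to-be-detected regularity.
        identity = rng.choice([i for i in identities if i != prev_identity])
        prev_identity = identity
        sequence.append((identity, rng.choices(expressions, weights)[0]))
    return sequence
```

    This makes explicit why the task requires abstraction: no single low-level feature repeats, only the expression category does.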

    Explicit behavioral detection of visual changes develops without their implicit neurophysiological detectability

    Change blindness is a failure to report major changes across consecutive images when the images are separated, e.g., by a brief blank interval. Successful change detection across such interruptions requires focal attention to the changes. However, findings of implicit detection of visual changes during change blindness have raised the question of whether the implicit mode is necessary for the development of the explicit mode. To this end, we recorded the visual mismatch negativity (vMMN) of the event-related potentials (ERPs) of the brain, an index of implicit pre-attentive visual change detection, in adult humans performing an oddball variant of the change blindness flicker task. Images of 500 ms in duration were presented repeatedly in continuous sequences, alternating with a blank interval (either 100 ms or 500 ms in duration throughout a stimulus sequence). Occasionally (P = 0.2), a change (a color change, or an omission or addition of objects or their parts in the image) was present. The participants attempted to detect the occasional change explicitly (via a voluntary button press). With both interval durations, it took 10–15 change presentations on average for the participants to eventually detect the changes explicitly in a sequence, with the 500 ms interval requiring only slightly longer exposure to the series than the 100 ms one. Nevertheless, prior to this point of explicit detectability, implicit detection of the changes, as indexed by the vMMN, was observed only with the 100 ms intervals. These findings of explicit change detection developing both with and without implicit change detection may suggest that the two modes of change detection recruit independent neural mechanisms.
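    The vMMN analysis mentioned in the last two abstracts boils down to a deviant-minus-standard difference wave. A minimal sketch (function names and the microvolt units are illustrative, not from the papers):

```python
import numpy as np

def difference_wave(deviant_epochs, standard_epochs):
    """vMMN-style difference wave: mean ERP to deviants minus mean ERP
    to standards, computed per time sample.
    Each input: array of shape (n_trials, n_samples), e.g. in microvolts."""
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

def most_negative_latency(diff, times_ms):
    """Latency (ms) of the most negative point of the difference wave,
    a crude proxy for where a vMMN-like negativity peaks."""
    return times_ms[np.argmin(diff)]
```

    A negative deflection in this difference wave within the expected latency window is what is read as implicit change detection.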

    Decreased intersubject synchrony in dynamic valence ratings of sad movie contents in dysphoric individuals

    Emotional reactions to movies are typically similar across people. However, depressive symptoms decrease synchrony in brain responses. Less is known about the effect of depressive symptoms on intersubject synchrony in conscious stimulus-related processing. In this study, we presented amusing, sad, and fearful movie clips to dysphoric individuals (those with elevated depressive symptoms) and control participants, who dynamically rated the clips' valence (positive vs. negative). We analysed both the mean values and the intersubject correlation (ISC) of the valence ratings. We also measured electrodermal activity (EDA) in a separate session to complement the ratings. There were no group differences in either the EDA or the mean valence ratings for any movie type. As expected, the ISC of the valence ratings was lower in the dysphoric group than in the control group, specifically for the sad movie clips. In addition, there was a negative relationship between the ISC of the valence ratings and depressive symptoms for the sad movie clips in the full sample. The results are discussed in the context of the negative attentional bias in depression. The findings extend previous brain-activity ISC results by showing that depressive symptoms also increase variance in conscious valence ratings of stimuli in a mood-congruent manner.
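    One common way to quantify ISC of rating time series is the mean pairwise Pearson correlation across subjects; the sketch below assumes that definition (the paper's exact pipeline may differ):

```python
import itertools
import numpy as np

def intersubject_correlation(ratings):
    """Mean pairwise Pearson correlation between subjects' dynamic
    rating time series.
    ratings: array of shape (n_subjects, n_timepoints)."""
    rs = [np.corrcoef(a, b)[0, 1]
          for a, b in itertools.combinations(ratings, 2)]
    return float(np.mean(rs))
```

    Lower values indicate that subjects' moment-to-moment ratings diverge more from one another, which is the group difference reported above for the sad clips.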

    Effects of conversation content on viewing dyadic conversations

    People typically follow conversations closely with their gaze. We asked whether this viewing is influenced by what is actually said in the conversation and by the viewer's psychological condition. We recorded the eye movements of healthy (N = 16) and depressed (N = 25) participants while they viewed video clips. Each video showed two people, each speaking one line of dialogue about a socio-emotionally important (i.e., personal) or unimportant (matter-of-fact) topic. Between the spoken lines, viewers made more saccadic shifts between the discussants, and looked more at the second speaker, in personal than in matter-of-fact conversations. Higher depression scores correlated with less looking at the currently speaking discussant. We conclude that subtle social attention dynamics can be detected from eye movements and that these dynamics are sensitive to the observer's psychological condition, such as depression.

    The effect of sad mood on early sensory event-related potentials to task-irrelevant faces

    It has been shown that the perceiver's mood affects the perception of emotional faces, but it is not known how mood affects preattentive brain responses to emotional facial expressions. To examine this question, we experimentally induced sad or neutral mood in healthy adults before presenting them with task-irrelevant pictures of faces while electroencephalography (EEG) was recorded. Sad, happy, and neutral faces were presented to the participants in an ignore-oddball condition. Differential responses (emotional – neutral) for the P1, N170, and P2 amplitudes were extracted and compared between the neutral and sad mood conditions. Emotional facial expressions modulated all the components, and an interaction effect of expression by mood was found for P1: the emotional modulation to happy faces found in the neutral mood condition disappeared in the sad mood condition. For N170 and P2, we found larger response amplitudes for both emotional faces, regardless of mood. The results add to previous behavioral findings by showing that mood affects even low-level cortical feature encoding of task-irrelevant faces.

    Passive exposure to speech sounds induces long-term memory representations in the auditory cortex of adult rats

    Experience-induced changes in the functioning of the auditory cortex are prominent in early life, especially during a critical period. Although auditory perceptual learning takes place automatically during this critical period, it is thought to require active training in later life. Previous studies demonstrated rapid changes in single-cell responses of anesthetized adult animals while they were exposed to sounds presented in a statistical learning paradigm. However, whether passive exposure to sounds can form long-term memory representations remains to be demonstrated. To investigate this issue, we first exposed adult rats to human speech sounds for 3 consecutive days, 12 h/day. Two groups of rats, exposed to either spectrotemporal or tonal changes in the speech sounds, served as controls for each other. Then, electrophysiological brain responses from the auditory cortex were recorded to the same stimuli. A statistical learning paradigm was applied in both the exposure and test phases. An exposure effect was found for the spectrotemporal sounds, but not for the tonal sounds. Only the animals exposed to spectrotemporal sounds differentiated subtle changes in these stimuli, as indexed by the mismatch negativity response. The results point to the formation of long-term memory traces for speech sounds through passive exposure in adult animals.

    Neural generators of the frequency-following response elicited to stimuli of low and high frequency: a magnetoencephalographic (MEG) study

    The frequency-following response (FFR) to periodic complex sounds has gained recent interest in auditory cognitive neuroscience, as it captures with great fidelity the tracking accuracy of periodic sound features in the ascending auditory system. Seminal studies suggested the FFR as a correlate of subcortical sound encoding, yet recent studies aiming to locate its sources challenged this assumption, demonstrating that the FFR receives some contribution from the auditory cortex. Based on frequency-specific phase-locking capabilities along the auditory hierarchy, we hypothesized that FFRs to higher frequencies would receive less cortical contribution than those to lower frequencies, hence supporting a major subcortical involvement for these high-frequency sounds. Here, we used a magnetoencephalographic (MEG) approach to trace the neural sources of the FFR elicited in healthy adults (N = 19) to low (89 Hz) and high (333 Hz) frequency sounds. FFRs elicited to the high- and low-frequency sounds were clearly observable in MEG and comparable to those obtained in simultaneous electroencephalographic recordings. Distributed source modeling analyses revealed midbrain, thalamic, and cortical contributions to the FFR, arranged in frequency-specific configurations. Our results showed that the main contribution to the high-frequency sound FFR originated in the inferior colliculus and the medial geniculate body of the thalamus, with no significant cortical contribution. In contrast, the low-frequency sound FFR had a major contribution located in the auditory cortices, and also received contributions originating in the midbrain and thalamic structures. These findings support the multiple-generator hypothesis of the FFR and are relevant for our understanding of the neural encoding of sounds along the auditory hierarchy, suggesting a hierarchical organization of periodicity encoding.
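    At its simplest, the FFR shows up as spectral energy at the stimulus periodicity; a toy sketch of locating that peak (the 1000 Hz sampling rate and perfectly phase-locked sinusoidal "response" are illustrative assumptions, only the 89 Hz stimulus frequency comes from the abstract):

```python
import numpy as np

def ffr_peak_frequency(signal, fs):
    """Frequency (Hz) of the largest spectral peak, ignoring the DC bin.
    The FFR appears as energy at the stimulus periodicity."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[1:][np.argmax(spectrum[1:])]

# Toy phase-locked response at the low stimulus frequency used above (89 Hz),
# sampled at an assumed 1000 Hz for 1 s.
fs = 1000
t = np.arange(fs) / fs
response = np.sin(2 * np.pi * 89 * t)
```

    Source modeling, as in the study above, then asks which brain regions contribute to that phase-locked component, which a spectrum alone cannot tell.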
