
    The role of infants’ mother-directed gaze, maternal sensitivity, and emotion recognition in childhood callous unemotional behaviours

    While some children with callous-unemotional (CU) behaviours show difficulty recognizing emotional expressions, the underlying developmental pathways are not well understood. Reduced infant attention to the caregiver's face and a lack of sensitive parenting have previously been associated with emerging CU features. The current study examined whether facial emotion recognition mediates the association between infants' mother-directed gaze, maternal sensitivity, and later CU behaviours. Participants were 206 full-term infants and their families from a prospective longitudinal study, the Durham Child Health and Development Study (DCHDS). Measures of infants' mother-directed gaze and maternal sensitivity were collected at 6 months, facial emotion recognition performance at 6 years, and CU behaviours at 7 years. A path analysis showed a significant effect of emotion recognition predicting CU behaviours (β = -0.275, S.E. = 0.084, p = 0.001). While the main effects of infants' mother-directed gaze and maternal sensitivity were not significant, their interaction significantly predicted CU behaviours (β = 0.194, S.E. = 0.081, p = 0.016), with region-of-significance analysis showing a significant negative relationship between infant gaze and later CU behaviours only for those with low maternal sensitivity. There were no indirect effects of infants' mother-directed gaze, maternal sensitivity, or the gaze-by-sensitivity interaction via emotion recognition. Emotion recognition therefore appears to act as an independent predictor of CU behaviours rather than mediating the relationship of infants' mother-directed gaze and maternal sensitivity with later CU behaviours. This supports the idea of multiple risk factors for CU behaviours.
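The mediation question tested above (whether any effect of infant gaze on CU behaviours runs indirectly through emotion recognition) is conventionally assessed with a product-of-coefficients path model: estimate the X→M path (a), the M→Y path (b), and bootstrap the indirect effect a·b. A minimal illustrative sketch on simulated data follows; all variable names, sample draws, and effect sizes are stand-ins for exposition, not the study's data, and the b path omits the covariate adjustment a full path model would include:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 206  # sample size matching the study, data itself is simulated

# Hypothetical stand-ins: X = infant mother-directed gaze,
# M = emotion recognition at 6 years, Y = CU behaviours at 7 years.
X = rng.normal(size=n)
M = rng.normal(size=n)               # no true X -> M path (null mediation)
Y = -0.275 * M + rng.normal(size=n)  # M -> Y path, sign as reported

def ols_slope(x, y):
    """Slope of y ~ 1 + x via ordinary least squares."""
    A = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta[1]

# Product-of-coefficients estimate of the indirect effect a * b
a = ols_slope(X, M)   # X -> M path
b = ols_slope(M, Y)   # M -> Y path (covariates omitted for brevity)
indirect = a * b

# Percentile bootstrap confidence interval for the indirect effect
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(ols_slope(X[idx], M[idx]) * ols_slope(M[idx], Y[idx]))
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect = {indirect:.3f}, 95% CI [{ci_lo:.3f}, {ci_hi:.3f}]")
```

Because the simulated X→M path is null, the bootstrapped interval for the indirect effect should sit near zero, mirroring the study's finding of no indirect effects via emotion recognition.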

    Preschoolers' attribution of affect to music: a comparison between vocal and instrumental performance

    Research has shown inconsistent results concerning the ability of young children to identify musical emotion. This study explores the influence of the type of musical performance (vocal vs. instrumental) on children's affect identification. Using an independent-group design, novel child-directed music was presented in three conditions: instrumental, vocal-only, and song (instrumental plus vocals), to 3- to 6-year-olds previously screened for language development (n = 76). A forced-choice task was used in which children chose a face expressing the emotion matching each musical track. All performance conditions comprised 'happy' (major mode/fast tempo) and 'sad' (minor mode/slow tempo) tracks. Nonsense syllables rather than words were used in the vocals to avoid the influence of lyrics on children's decisions. The results showed that even the younger children were able to correctly identify the intended emotion in the music, although 'happy' music was more readily recognized, and recognition appeared to be facilitated in the instrumental condition. Performance condition also interacted with gender.

    Deficient auditory emotion processing but intact emotional multisensory integration in alexithymia

    Alexithymia has been associated with emotion recognition deficits in both the auditory and visual domains. Although emotions are inherently multimodal in daily life, little is known about abnormalities of emotional multisensory integration (eMSI) in relation to alexithymia. Here, we employed an emotional Stroop-like audiovisual task while recording event-related potentials (ERPs) in individuals with high alexithymia levels (HA) and low alexithymia levels (LA). During the task, participants had to indicate whether a voice was spoken in a sad or angry prosody while ignoring a simultaneously presented static face that could be either emotionally congruent or incongruent with the voice. We found that HA performed worse and showed higher P2 amplitudes than LA, independently of emotion congruency. Furthermore, difficulties in identifying and describing feelings correlated positively with the P2 component, and P2 correlated negatively with behavioral performance. Bayesian statistics showed no group differences in eMSI or in the classical integration-related ERP components (N1 and N2). Although individuals with alexithymia indeed showed deficits in auditory emotion recognition, as indexed by decreased performance and higher P2 amplitudes, the present findings suggest an intact capacity to integrate emotional information from multiple channels. Our work provides valuable insights into the relationship between alexithymia and the neuropsychological mechanisms of emotional multisensory integration.