The role of infants’ mother-directed gaze, maternal sensitivity, and emotion recognition in childhood callous unemotional behaviours
While some children with callous-unemotional (CU) behaviours show difficulty recognizing emotional expressions, the underlying developmental pathways are not well understood. Reduced infant attention to the caregiver's face and a lack of sensitive parenting have previously been associated with emerging CU features. The current study examined whether facial emotion recognition mediates the association between infants' mother-directed gaze, maternal sensitivity, and later CU behaviours. Participants were 206 full-term infants and their families from a prospective longitudinal study, the Durham Child Health and Development Study (DCHDS). Measures of infants' mother-directed gaze and maternal sensitivity were collected at 6 months, facial emotion recognition performance at 6 years, and CU behaviours at 7 years. A path analysis showed a significant effect of emotion recognition predicting CU behaviours (β = -0.275, S.E. = 0.084, p = 0.001). While the main effects of infants' mother-directed gaze and maternal sensitivity were not significant, their interaction significantly predicted CU behaviours (β = 0.194, S.E. = 0.081, p = 0.016), with region-of-significance analysis showing a significant negative relationship between infant gaze and later CU behaviours only for those with low maternal sensitivity. There were no indirect effects of infants' mother-directed gaze, maternal sensitivity, or their interaction via emotion recognition. Emotion recognition thus appears to act as an independent predictor of CU behaviours rather than mediating the relationship between infants' mother-directed gaze or maternal sensitivity and later CU behaviours. This supports the idea of multiple risk factors for CU behaviours.
Abstract expressions of affect
What form should happiness take? And how is disgust shaped? This research investigates how synthetic affective expressions can be designed with minimal reference to the human body. The authors propose that the recognition and attribution of affect expression can be triggered by appropriately presenting the bare essentials used in the mental processes that mediate the recognition and attribution of affect. The novelty of the proposed approach lies in its grounding in the mental processes involved in the recognition of affect, independent of the configuration of the human body and face. The approach draws on (a) research on the role of abstraction in perception, (b) the elementary processes and features relevant to visual emotion recognition and emotion attribution, and (c) how such features can be used and combined to generate a synthetic emotion expression. To further develop the argument for this approach, the authors present a pilot study that shows the feasibility of combining affective features independently of the human configuration by using abstraction to create consistent emotional attributions. Finally, the authors discuss the potential implications of their approach for the design of affective robots. The proposed design approach maximizes the freedom to integrate intuitively understandable affective expressions with whatever other morphological design factors a technology may require, yielding synthetic affective expressions suited to the inherently artificial and applied nature of affective technology.
Preschoolers' attribution of affect to music: a comparison between vocal and instrumental performance
Research has shown inconsistent results concerning the ability of young children to identify musical emotion. This study explores the influence of the type of musical performance (vocal vs. instrumental) on children’s affect identification. Using an independent-groups design, novel child-directed music was presented in three conditions: instrumental, vocal-only, and song (instrumental plus vocals) to 3- to 6-year-olds previously screened for language development (n = 76). A forced-choice task was used in which children chose a face expressing the emotion matching each musical track. All performance conditions comprised ‘happy’ (major mode/fast tempo) and ‘sad’ (minor mode/slow tempo) tracks. Nonsense syllables rather than words were used in the vocals in order to avoid the influence of lyrics on children's decisions. The results showed that even the younger children were able to correctly identify the intended emotion in the music, although ‘happy’ music was more readily recognized and recognition appeared to be facilitated in the instrumental condition. Performance condition also interacted with gender.
Deficient auditory emotion processing but intact emotional multisensory integration in alexithymia
Alexithymia has been associated with emotion recognition deficits in both auditory and visual domains. Although emotions are inherently multimodal in daily life, little is known regarding abnormalities of emotional multisensory integration (eMSI) in relation to alexithymia. Here, we employed an emotional Stroop-like audiovisual task while recording event-related potentials (ERPs) in individuals with high alexithymia levels (HA) and low alexithymia levels (LA). During the task, participants had to indicate whether a voice was spoken in a sad or angry prosody while ignoring a simultaneously presented static face that was either emotionally congruent or incongruent with the voice. We found that HA performed worse and showed higher P2 amplitudes than LA, independent of emotion congruency. Furthermore, difficulties in identifying and describing feelings were positively correlated with the P2 component, and P2 correlated negatively with behavioral performance. Bayesian statistics showed no group differences in eMSI or in the classical integration-related ERP components (N1 and N2). Although individuals with alexithymia indeed showed deficits in auditory emotion recognition, as indexed by decreased performance and higher P2 amplitudes, the present findings suggest an intact capacity to integrate emotional information from multiple channels in alexithymia. Our work provides valuable insights into the relationship between alexithymia and the neuropsychological mechanisms of emotional multisensory integration.