
    Parental brain: cerebral areas activated by infant cries and faces. A comparison between different populations of parents and not.

    Literature on parenting has traditionally focused on caring behaviors and parental representations. More recently, an innovative line of research has developed that evaluates the neural areas and hormones implicated in nurturing and caregiving responses. The only way for a newborn to survive and grow is for the adults around it to respond to its needs, and to succeed they must first understand what those needs are. Adults' capacity to take care of infants therefore cannot be separated from the biological mechanisms that make them more responsive to their own progeny and to infants in general. Many studies have shown that specific neural substrates are activated in response to evolutionarily salient infant stimuli, such as infant cries and infant emotional facial expressions. Human adults show a sort of innate predisposition to respond to infants' signals, in order to satisfy their needs and allow them to survive and become young adults capable of taking care of themselves. This article focuses on research from the last decade that has investigated the neural circuits underlying parental behavioral responses. Moreover, the paper compares the results of studies that investigated neural responses to infant stimuli under different conditions: familiar versus unknown children, parents versus non-parents, and normative versus clinical samples (depression, addiction, adolescence, and PTSD).

    Audiovisual integration of emotional signals from others' social interactions

    Audiovisual perception of emotions has typically been examined using displays of a solitary character (e.g., the face-voice and/or body-sound of one actor). However, in real life humans often face more complex multisensory social situations involving more than one person. Here we ask whether the audiovisual facilitation of emotion recognition previously found in simpler social situations extends to more complex and ecological situations. Stimuli consisting of the biological motion and voices of two interacting agents were used in two experiments. In Experiment 1, participants were presented with visual, auditory, auditory filtered/noisy, and audiovisual congruent and incongruent clips. We asked participants to judge whether the two agents were interacting happily or angrily. In Experiment 2, another group of participants repeated the same task as in Experiment 1 while trying to ignore either the visual or the auditory information. The findings from both experiments indicate that when the reliability of the auditory cue was decreased, participants weighted the visual cue more heavily in their emotional judgments, which in turn translated into increased emotion recognition accuracy in the multisensory condition. Our findings thus point to a common mechanism of multisensory integration of emotional signals irrespective of social stimulus complexity.

    Infusing Context Into Emotion Perception Impacts Emotion Decoding Accuracy

    The accurate decoding of facial emotion expressions lies at the center of many research traditions in psychology. Much of this research, while paying lip service to the importance of context in emotion perception, has used stimuli that were carefully created to be deprived of contextual information. The participants' task is to associate the expression shown in the face with a correct label, essentially turning a social perception task into a cognitive task. In fact, in many cases the task can be carried out correctly without engaging emotion recognition at all. The present article argues that infusing context into emotion perception not only adds an additional source of information but also changes the way participants approach the task, rendering it a social perception task rather than a cognitive one. Importantly, distinguishing between accuracy (perceiving the intended emotions) and bias (perceiving emotions additional to those intended) leads to a more nuanced understanding of social emotion perception. Results from several studies using the Assessment of Contextual Emotions demonstrate the significance and social functionality of simultaneously considering emotion decoding accuracy and bias for social interaction in different cultures, their key personality and societal correlates, and their function for close relationship processes.

    Sensorimotor representation of observed dyadic actions with varying agent involvement: an EEG mu study

    Observation of others’ actions activates motor representations in sensorimotor cortex. Although action observation in the real world often involves multiple agents displaying varying degrees of involvement, most lab studies of action observation have examined individual actions. We recorded EEG mu suppression over sensorimotor cortex to investigate how the multi-agent nature of observed hand/arm actions is incorporated in sensorimotor action representations. To this end, we manipulated the extent of agent involvement in dyadic interactions presented in videos. In all clips two agents were present: agent-1 always performed the same action, while the involvement of agent-2 varied across three levels: (1) passive and uninvolved, (2) passively involved, and (3) actively involved. Additionally, a no-action condition was presented. The occurrence of these four conditions was made predictable by cues at the start of each trial, which allowed us to study possible mu anticipation effects. Dyadic interactions in which agent-2 was actively involved resulted in greater power suppression of the mu rhythm than dyadic interactions in which agent-2 was passively involved; the latter did not differ from actions in which agent-2 was present but not involved. No anticipation effects were found. The results suggest that the sensorimotor representation of a dyadic interaction takes into account the simultaneously performed bodily articulations of both agents, but no evidence was found for incorporation of their static articulated postures.