
    Similarities in face and voice cerebral processing

    In this short paper I use a few selected examples to illustrate several compelling similarities in the functional organization of face and voice cerebral processing: (1) the presence of cortical areas selective for face or voice stimuli, also observed in non-human primates and causally related to perception; (2) the coding of face or voice identity using a “norm-based” scheme; (3) personality inferences from faces and voices within the same Trustworthiness–Dominance “social space”.
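
    The “norm-based” coding scheme mentioned in point (2) can be made concrete with a small sketch: identity is represented not by absolute feature values but by the direction and magnitude of a stimulus's deviation from an average (prototype) face or voice. The Python/numpy snippet below is a minimal illustration of that idea; the feature space, its dimensionality, and the simulated exemplars are assumptions for demonstration, not the stimuli or models used in the paper.

```python
import numpy as np

# Illustrative (hypothetical) sketch of "norm-based" identity coding:
# each face or voice is a point in a feature space, and identity is coded
# by its deviation from the population average (the "norm" / prototype).
rng = np.random.default_rng(0)
population = rng.normal(size=(200, 50))   # 200 simulated exemplars, 50 arbitrary features
norm = population.mean(axis=0)            # the prototype ("norm") stimulus

def norm_based_code(stimulus, norm):
    """Return the direction and distance of a stimulus from the norm.

    Under norm-based coding, the coded signal (and, in many models, the
    neural response) grows with distance from the prototype."""
    deviation = stimulus - norm
    distance = np.linalg.norm(deviation)
    direction = deviation / distance if distance > 0 else deviation
    return direction, distance

direction, distance = norm_based_code(population[0], norm)
print(f"distance of exemplar 0 from the norm: {distance:.2f}")
```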

    A unified coding strategy for processing faces and voices

    Both faces and voices are rich in socially relevant information, which humans are remarkably adept at extracting, including a person's identity, age, gender, affective state, personality, etc. Here, we review accumulating evidence from behavioral, neuropsychological, electrophysiological, and neuroimaging studies suggesting that the cognitive and neural processing mechanisms engaged by perceiving faces or voices are highly similar, despite the very different nature of their sensory input. The similarity between the two mechanisms likely facilitates the multi-modal integration of facial and vocal information during everyday social interactions. These findings emphasize a parsimonious principle of cerebral organization, in which similar computational problems in different modalities are solved using similar solutions.

    Emotional Cues during Simultaneous Face and Voice Processing: Electrophysiological Insights

    Both facial expression and tone of voice represent key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, embedded within a task requiring recognition of monkey faces and voices. To investigate the temporal unfolding of the processing of affective information from human face–voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 healthy subjects; N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 components were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170, and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between the angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 ms (P200 peak latency) post-stimulus onset, despite the implicit affective processing task demands, and that this effect is mainly distributed over the frontal-central region.
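
    As a rough illustration of the kind of measurement reported above, the mean amplitude of an ERP component such as the P200 is typically computed by averaging the signal over a post-stimulus time window at the relevant electrodes and comparing conditions. The sketch below uses simulated numpy arrays; the epoch layout, the 150–250 ms window, and the channel count are assumptions for illustration, not the study's actual parameters.

```python
import numpy as np

# Hypothetical sketch: mean ERP amplitude in a P200 window (~150-250 ms)
# at frontal-central channels, compared between emotional and neutral
# audiovisual trials. Shapes, window, and data are illustrative only.
sfreq = 500                                   # sampling rate (Hz)
times = np.arange(-0.2, 0.8, 1 / sfreq)       # epoch from -200 to 800 ms
rng = np.random.default_rng(1)

# epochs: (n_trials, n_channels, n_samples); simulated data stand in for EEG
emotional = rng.normal(size=(60, 4, times.size))
neutral = rng.normal(size=(60, 4, times.size))

def mean_amplitude(epochs, times, tmin, tmax):
    """Average amplitude across trials, channels, and a time window."""
    window = (times >= tmin) & (times <= tmax)
    return epochs[:, :, window].mean()

p200_emotional = mean_amplitude(emotional, times, 0.15, 0.25)
p200_neutral = mean_amplitude(neutral, times, 0.15, 0.25)
print(f"P200 mean amplitude, emotional: {p200_emotional:.3f}, neutral: {p200_neutral:.3f}")
```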

    Change blindness: eradication of gestalt strategies

    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task in which there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval, and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al., 2003, Vision Research 43, 149–164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this we changed the spatial positions of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point, as sketched below. There was no significant difference in performance between this and the standard task [F(1,4) = 2.565, p = 0.185]. This suggests two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) the result gives further weight to the argument that objects may be stored in, and retrieved from, a pre-attentional store during this task.
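
    The spatial manipulation described above (shifting each rectangle along an imaginary spoke through fixation) amounts to a radial displacement in the stimulus plane. The following Python sketch shows one way such positions could be computed, assuming coordinates expressed in degrees of visual angle relative to fixation; the eccentricity value and the per-item choice of shift direction are illustrative assumptions, not the study's parameters.

```python
import numpy as np

# Hypothetical sketch of the spatial manipulation: each rectangle is moved
# +-1 degree of visual angle along the spoke joining it to central fixation.
# Coordinates are in degrees of visual angle relative to fixation (0, 0).
rng = np.random.default_rng(2)
n_items = 8
angles = np.linspace(0, 2 * np.pi, n_items, endpoint=False)
eccentricity = 6.0                             # assumed distance from fixation
positions = np.column_stack([eccentricity * np.cos(angles),
                             eccentricity * np.sin(angles)])

def shift_along_spokes(positions, magnitude, rng):
    """Move each item +-`magnitude` deg along its fixation-to-item spoke."""
    radii = np.linalg.norm(positions, axis=1, keepdims=True)
    directions = positions / radii             # unit vectors along each spoke
    signs = rng.choice([-1.0, 1.0], size=(len(positions), 1))
    return positions + signs * magnitude * directions

shifted = shift_along_spokes(positions, 1.0, rng)
print(shifted.round(2))
```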

    Social interactions, emotion and sleep: a systematic review and research agenda

    Sleep and emotion are closely linked; however, the effects of sleep on socio-emotional task performance have only recently been investigated. Sleep loss and insomnia have been found to affect emotional reactivity and social functioning, although results, taken together, are somewhat contradictory. Here we review this advancing literature, aiming to 1) systematically review the relevant literature on sleep and socio-emotional functioning, with reference to the extant literature on emotion and social interactions, 2) summarize results and outline ways in which emotion, social interactions, and sleep may interact, and 3) suggest key limitations and future directions for this field. From the reviewed literature, sleep deprivation is associated with diminished emotional expressivity and impaired emotion recognition, which has particular relevance for social interactions. Sleep deprivation also increases emotional reactivity, an effect most apparent in neuroimaging studies investigating amygdala activity and its prefrontal regulation. Evidence of emotional dysregulation in insomnia and poor sleep has also been reported. In general, limitations of this literature include how performance measures relate to self-reports, and how results relate to socio-emotional functioning. We conclude by suggesting possible future directions for this field.

    Tuning the developing brain to emotional body expressions

    Reading others’ emotional body expressions is an essential social skill. Adults readily recognize emotions from body movements; however, it is unclear when in development infants become sensitive to bodily expressed emotions. We examined event-related brain potentials (ERPs) in 4- and 8-month-old infants in response to point-light displays (PLDs) of happy and fearful body expressions presented in two orientations (upright and inverted). The ERP results revealed that 8-month-olds, but not 4-month-olds, respond sensitively to the orientation and the emotion of the dynamic expressions. Specifically, 8-month-olds showed (i) an early (200–400 ms) orientation-sensitive positivity over frontal and central electrodes, and (ii) a late (700–1100 ms) emotion-sensitive positivity over temporal and parietal electrodes in the right hemisphere. These findings suggest that orientation-sensitive and emotion-sensitive brain processes, distinct in timing and topography, develop between 4 and 8 months of age.

    EEG frequency-tagging demonstrates increased left hemispheric involvement and crossmodal plasticity for face processing in congenitally deaf signers

    In humans, face processing relies on a network of brain regions located predominantly in the right occipito-temporal cortex. We tested congenitally deaf (CD) signers and matched hearing controls (HC) to investigate the experience dependence of the cortical organization of face processing. Specifically, we used EEG frequency-tagging to evaluate (1) Face-Object Categorization, (2) Emotional Facial-Expression Discrimination, and (3) Individual Face Discrimination. The EEG was recorded while visual stimuli were presented at a rate of 6 Hz, with oddball stimuli at a rate of 1.2 Hz. In all three experiments and in both groups, significant face-discriminative responses were found. Face-Object Categorization was associated with a relatively increased involvement of the left hemisphere in CD individuals compared to HC individuals. A similar trend was observed for Emotional Facial-Expression Discrimination but not for Individual Face Discrimination. Source reconstruction suggested greater activation of the auditory cortices in the CD group for Individual Face Discrimination. These findings suggest that the experience dependence of the relative contribution of the two hemispheres, as well as crossmodal plasticity, varies with different aspects of face processing.
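
    The frequency-tagging logic described above can be sketched as follows: because base stimuli appear at 6 Hz and oddballs at 1.2 Hz, a face-discriminative response shows up in the EEG spectrum at 1.2 Hz and its harmonics, excluding harmonics shared with the 6 Hz base rate. The Python snippet below simulates a single channel and reads out amplitudes at these frequencies; the recording duration, amplitudes, and noise level are illustrative assumptions, not the study's analysis pipeline.

```python
import numpy as np

# Hypothetical sketch of a frequency-tagging readout: with base stimuli at
# 6 Hz and oddballs at 1.2 Hz, the category-discrimination response is read
# out at 1.2 Hz and its harmonics (2.4, 3.6, 4.8 Hz, ...), skipping
# harmonics of the 6 Hz base rate.
sfreq = 256
duration = 60.0                                # seconds of steady-state EEG
t = np.arange(0, duration, 1 / sfreq)
rng = np.random.default_rng(3)
eeg = (0.5 * np.sin(2 * np.pi * 6.0 * t)       # simulated base-rate response
       + 0.2 * np.sin(2 * np.pi * 1.2 * t)     # simulated oddball response
       + rng.normal(scale=1.0, size=t.size))   # noise

spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / sfreq)

def amplitude_at(freqs, spectrum, target, tol=0.05):
    """Peak spectral amplitude within +-tol Hz of the target frequency."""
    return spectrum[np.abs(freqs - target) < tol].max()

oddball_harmonics = [1.2, 2.4, 3.6, 4.8]       # 6.0 Hz excluded (base-rate harmonic)
oddball_response = sum(amplitude_at(freqs, spectrum, f) for f in oddball_harmonics)
base_response = amplitude_at(freqs, spectrum, 6.0)
print(f"oddball (discrimination) response: {oddball_response:.3f}")
print(f"base (general visual) response:    {base_response:.3f}")
```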

    Discrimination of fearful and happy body postures in 8-month-old infants: an event-related potential study

    Responding to others’ emotional body expressions is an essential social skill in humans. Adults readily detect emotions from body postures, but it is unclear whether infants are sensitive to emotional body postures. We examined 8-month-old infants’ brain responses to emotional body postures by measuring event-related potentials (ERPs) to happy and fearful bodies. Our results revealed two emotion-sensitive ERP components: body postures evoked an early N290 at occipital electrodes and a later Nc at fronto-central electrodes, both of which were enhanced in response to fearful (relative to happy) expressions. These findings demonstrate that: (a) 8-month-old infants discriminate between static emotional body postures; and (b) similar to infant emotional face perception, the sensitivity to emotional body postures is reflected in early perceptual (N290) and later attentional (Nc) neural processes. This provides evidence for an early developmental emergence of the neural processes involved in the discrimination of emotional body postures.

    The automatic processing of non-verbal emotional vocalizations: an electrophysiological investigation

    Master's dissertation in Psychology (specialization in Clinical and Health Psychology). The human voice is a critical channel for the exchange of information about the emotionality of a speaker. It is therefore important to investigate the neural correlates of non-verbal vocalization processing, even when listeners are not attending to these events. We developed an oddball paradigm in which emotional (happy and angry) and neutral vocalizations were presented both as standard and as deviant stimuli in four conditions: Happy, Angry, Neutral 1 (neutral vocalizations in an angry context), and Neutral 2 (neutral vocalizations in a happy context). To unfold the time course of the auditory change detection mechanisms indexed by the Mismatch Negativity (MMN) component, the event-related potentials (ERP) methodology was used. ERPs were recorded in 17 healthy subjects. The results showed that the Happy and Neutral 2 conditions elicited a more negative MMN amplitude relative to the Angry condition at midline (Fz, Cz) electrodes. Overall, the results suggest that automatic auditory change detection is enhanced for positive and neutral (happy-context) vocalizations relative to negative stimuli.
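
    The MMN measure referred to above is usually obtained as a deviant-minus-standard difference wave, quantified as the mean amplitude over a post-stimulus window at midline electrodes. A minimal numpy sketch of that computation is shown below; the epoch counts, the 100–250 ms window, and the simulated data are assumptions for illustration and do not reproduce the thesis's analysis.

```python
import numpy as np

# Hypothetical sketch of an MMN measure: the deviant-minus-standard ERP
# difference, quantified as the mean amplitude in a ~100-250 ms window at
# midline electrodes (Fz, Cz).
sfreq = 500
times = np.arange(-0.1, 0.5, 1 / sfreq)
rng = np.random.default_rng(4)

# (n_trials, n_channels, n_samples) for two midline channels (Fz, Cz)
standard_epochs = rng.normal(size=(400, 2, times.size))
deviant_epochs = rng.normal(size=(80, 2, times.size))

standard_erp = standard_epochs.mean(axis=0)    # trial-averaged ERPs
deviant_erp = deviant_epochs.mean(axis=0)
difference_wave = deviant_erp - standard_erp   # the MMN lives in this difference

window = (times >= 0.10) & (times <= 0.25)
mmn_amplitude = difference_wave[:, window].mean(axis=1)   # one value per channel
for channel, amplitude in zip(["Fz", "Cz"], mmn_amplitude):
    print(f"MMN mean amplitude at {channel}: {amplitude:.3f}")
```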

    Preferential responses to faces in superior temporal and medial prefrontal cortex in three-year-old children

    Perceiving faces and understanding emotions are key components of human social cognition. Prior research with adults and infants suggests that these social cognitive functions are supported by the superior temporal cortex (STC) and medial prefrontal cortex (MPFC). We used functional near-infrared spectroscopy (fNIRS) to characterize functional responses to faces in these cortical regions in early childhood. Three-year-old children (n = 88, M(SD) = 3.15(.16) years) passively viewed faces that varied in emotional content and valence (happy, angry, fearful, neutral) and, for fearful and angry faces, in intensity (100%, 40%), while undergoing fNIRS. Bilateral STC and MPFC showed greater oxygenated hemoglobin concentration values in response to all faces relative to objects. MPFC additionally responded preferentially to happy faces relative to neutral faces. We did not detect preferential responses to angry or fearful faces, or overall differences in response magnitude by emotional valence (100% happy vs. fearful and angry) or intensity (100% vs. 40% fearful and angry). In exploratory analyses, preferential responses to faces in MPFC were not robustly correlated with performance on tasks of early social cognition. These results link and extend adult and infant research on functional responses to faces in STC and MPFC, and contribute to the characterization of the neural correlates of early social cognition.
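
    The face-versus-object contrast reported above can be sketched as a simple comparison of average oxygenated-hemoglobin (HbO) responses per region across participants. The snippet below is an illustrative Python sketch under assumed array shapes and a paired t-test on simulated values; it is not the study's fNIRS preprocessing or statistical pipeline.

```python
import numpy as np
from scipy import stats

# Hypothetical sketch of a face-versus-object contrast on fNIRS data: the
# average oxygenated-hemoglobin (HbO) response per region is compared
# between conditions with a paired t-test. Participant count matches the
# abstract, but channel count, effect sizes, and data are illustrative.
rng = np.random.default_rng(5)
n_participants, n_channels = 88, 6

# mean HbO concentration change per participant and channel, per condition
hbo_faces = rng.normal(loc=0.05, scale=0.1, size=(n_participants, n_channels))
hbo_objects = rng.normal(loc=0.00, scale=0.1, size=(n_participants, n_channels))

# region-level response: average channels within the region, then contrast
region_faces = hbo_faces.mean(axis=1)
region_objects = hbo_objects.mean(axis=1)
contrast = region_faces - region_objects       # positive = face-preferential

t_value, p_value = stats.ttest_rel(region_faces, region_objects)
print(f"face > object: mean diff {contrast.mean():.3f}, t = {t_value:.2f}, p = {p_value:.4f}")
```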