
    What does the amygdala contribute to social cognition?

    The amygdala has received intense recent attention from neuroscientists investigating its function at the molecular, cellular, systems, cognitive, and clinical levels. It clearly contributes to processing emotionally and socially relevant information, yet a unifying description and computational account have been lacking. The difficulty of tying together the various studies stems in part from the sheer diversity of approaches and species studied, in part from the amygdala's inherent heterogeneity in terms of its component nuclei, and in part because different investigators have simply been interested in different topics. Yet, a synthesis now seems close at hand in combining new results from social neuroscience with data from neuroeconomics and reward learning. The amygdala processes a psychological stimulus dimension related to saliency or relevance; mechanisms have been identified to link it to processing unpredictability; and insights from reward learning have situated it within a network of structures that include the prefrontal cortex and the ventral striatum in processing the current value of stimuli. These aspects help to clarify the amygdala's contributions to recognizing emotion from faces, to social behavior toward conspecifics, and to reward learning and instrumental behavior.

    Decoding face categories in diagnostic subregions of primary visual cortex

    Higher visual areas in the occipitotemporal cortex contain discrete regions for face processing, but it remains unclear if V1 is modulated by top-down influences during face discrimination, and if this is widespread throughout V1 or localized to retinotopic regions processing task-relevant facial features. Employing functional magnetic resonance imaging (fMRI), we mapped the cortical representation of two feature locations that modulate higher visual areas during categorical judgements – the eyes and mouth. Subjects were presented with happy and fearful faces, and we measured the fMRI signal of V1 regions processing the eyes and mouth whilst subjects engaged in gender and expression categorization tasks. In a univariate analysis, we used a region-of-interest-based general linear model approach to reveal changes in activation within these regions as a function of task. We then trained a linear pattern classifier to classify facial expression or gender on the basis of V1 data from ‘eye’ and ‘mouth’ regions, and from the remaining non-diagnostic V1 region. Using multivariate techniques, we show that V1 activity discriminates face categories both in local ‘diagnostic’ and widespread ‘non-diagnostic’ cortical subregions. This indicates that V1 might receive the processed outcome of complex facial feature analysis from other cortical (i.e. fusiform face area, occipital face area) or subcortical areas (amygdala).
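    The multivariate step this abstract describes — training a linear pattern classifier on voxel responses from a V1 region of interest and testing whether it discriminates face categories — can be sketched with synthetic data. Everything below (trial counts, voxel counts, the size of the simulated effect, and a nearest-centroid rule standing in for the paper's linear classifier) is an illustrative assumption, not the study's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 40 trials per expression (fearful vs. happy),
# 50 voxels from an 'eye' ROI in V1. A small mean shift stands in
# for the expression-related signal in the voxel pattern.
n_trials, n_voxels = 40, 50
fearful = rng.normal(0.0, 1.0, (n_trials, n_voxels))
happy = rng.normal(0.5, 1.0, (n_trials, n_voxels))
X = np.vstack([fearful, happy])
y = np.array([0] * n_trials + [1] * n_trials)

def nearest_centroid_cv(X, y, n_folds=5):
    """Cross-validated nearest-centroid decoding: fit class centroids
    on training trials, classify held-out trials by distance."""
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, n_folds)
    correct = 0
    for test in folds:
        train = np.setdiff1d(idx, test)
        c0 = X[train][y[train] == 0].mean(axis=0)
        c1 = X[train][y[train] == 1].mean(axis=0)
        d0 = np.linalg.norm(X[test] - c0, axis=1)
        d1 = np.linalg.norm(X[test] - c1, axis=1)
        correct += np.sum((d1 < d0) == y[test])
    return correct / len(y)

accuracy = nearest_centroid_cv(X, y)
print(f"decoding accuracy: {accuracy:.2f}")
```

    Above-chance held-out accuracy is the evidence that the ROI's activity pattern carries category information; cross-validation is essential, since a classifier evaluated on its own training trials would discriminate even pure noise.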

    Does gaze direction modulate facial expression processing in children with autism spectrum disorder?

    Two experiments investigated whether children with autism spectrum disorder (ASD) integrate relevant communicative signals, such as gaze direction, when decoding a facial expression. In Experiment 1, typically developing children (9–14 years old; n = 14) were faster at detecting a facial expression accompanying a gaze direction with a congruent motivational tendency (i.e., an avoidant facial expression with averted eye gaze) than those with an incongruent motivational tendency. Children with ASD (9–14 years old; n = 14) were not affected by the gaze direction of facial stimuli. This finding was replicated in Experiment 2, which presented only the eye region of the face to typically developing children (n = 10) and children with ASD (n = 10). These results demonstrated that children with ASD do not encode and/or integrate multiple communicative signals based on their affective or motivational tendency.

    The eye contact effect: mechanisms and development

    The ‘eye contact effect’ is the phenomenon that perceived eye contact with another human face modulates certain aspects of the concurrent and/or immediately following cognitive processing. In addition, functional imaging studies in adults have revealed that eye contact can modulate activity in structures in the social brain network, and developmental studies show evidence for preferential orienting towards, and processing of, faces with direct gaze from early in life. We review different theories of the eye contact effect and advance a ‘fast-track modulator’ model. Specifically, we hypothesize that perceived eye contact is initially detected by a subcortical route, which then modulates the activation of the social brain as it processes the accompanying detailed sensory information.

    Amygdala reactivity predicts adolescent antisocial behavior but not callous-unemotional traits.

    Recent neuroimaging studies have suggested divergent relationships between antisocial behavior (AB) and callous-unemotional (CU) traits and amygdala reactivity to fearful and angry facial expressions in adolescents. However, little work has examined if these findings extend to dimensional measures of behavior in ethnically diverse, non-clinical samples, or if participant sex, ethnicity, pubertal stage, and age moderate associations. We examined links between amygdala reactivity and dimensions of AB and CU traits in 220 Hispanic and non-Hispanic Caucasian adolescents (age 11-15; 49.5% female; 38.2% Hispanic), half of whom had a family history of depression and thus were at relatively elevated risk for late-starting, emotionally dysregulated AB. We found that AB was significantly related to increased right amygdala reactivity to angry facial expressions independent of sex, ethnicity, pubertal stage, age, and familial risk status for depression. CU traits were not related to fear- or anger-related amygdala reactivity. The present study further demonstrates that AB is related to increased amygdala reactivity to interpersonal threat cues in adolescents, and that this relationship generalizes across sex, ethnicity, pubertal stage, age, and familial risk status for depression.

    Fearful faces have a sensory advantage in the competition for awareness

    Only a subset of visual signals gives rise to a conscious percept. Threat signals, such as fearful faces, are particularly salient to human vision. Research suggests that fearful faces are evaluated without awareness and preferentially promoted to conscious perception. This agrees with evolutionary theories that posit a dedicated pathway specialized in processing threat-relevant signals. We propose an alternative explanation for this "fear advantage." Using psychophysical data from continuous flash suppression (CFS) and masking experiments, we demonstrate that awareness of facial expressions is predicted by effective contrast: the relationship between their Fourier spectrum and the contrast sensitivity function. Fearful faces have higher effective contrast than neutral expressions and this, not threat content, predicts their enhanced access to awareness. Importantly, our findings do not support the existence of a specialized mechanism that promotes threatening stimuli to awareness. Rather, our data suggest that evolutionary or learned adaptations have molded the fearful expression to exploit our general-purpose sensory mechanisms.
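    The "effective contrast" measure this abstract describes — relating an image's Fourier amplitude spectrum to the contrast sensitivity function (CSF) — can be sketched roughly as a CSF-weighted sum of the amplitude spectrum. The CSF shape, its peak frequency, and the pixels-to-degrees scaling below are all illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def csf(f, peak=3.0):
    """Illustrative log-parabola contrast sensitivity function.
    Peak frequency (cycles/degree) and bandwidth are assumptions."""
    f = np.maximum(f, 1e-6)
    return np.exp(-(np.log2(f / peak)) ** 2)

def effective_contrast(image, cpd_per_cycle=0.1):
    """Weight the image's Fourier amplitude spectrum by the CSF and
    average: a simple stand-in for the paper's 'effective contrast'.
    cpd_per_cycle converts cycles/image to cycles/degree (assumed)."""
    amp = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean())))
    n = image.shape[0]
    f1d = np.fft.fftshift(np.fft.fftfreq(n)) * n  # cycles per image
    FX, FY = np.meshgrid(f1d, f1d)
    freq_cpd = np.hypot(FX, FY) * cpd_per_cycle
    return float(np.sum(amp * csf(freq_cpd)) / amp.size)

# Two gratings at the same spatial frequency, differing only in
# contrast: the higher-contrast one yields higher effective contrast.
n = 64
xx = np.arange(n)
grating = np.sin(2 * np.pi * 30 * xx / n)[None, :].repeat(n, axis=0)
low, high = 0.1 * grating, 0.5 * grating
print(effective_contrast(low), effective_contrast(high))
```

    On this view, a fearful face's wider eyes and bared teeth concentrate Fourier energy where the CSF is most sensitive, so its effective contrast exceeds that of a neutral face even when nominal contrast is matched.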

    Temporal precedence of emotion over attention modulations in the lateral amygdala: Intracranial ERP evidence from a patient with temporal lobe epilepsy

    Previous fMRI studies have reported mixed evidence for the influence of selective attention on amygdala responses to emotional stimuli, with some studies showing "automatic" emotional effects to threat-related stimuli without attention (or even without awareness), but other studies showing a gating of amygdala activity by selective attention with no response to unattended stimuli. We recorded intracranial local field potentials from the intact left lateral amygdala in a human patient prior to surgery for epilepsy and tested, with a millisecond time resolution, for neural responses to fearful faces appearing at either task-relevant or task-irrelevant locations. Our results revealed an early emotional effect in the amygdala arising prior to, and independently of, attentional modulation. However, at a later latency, we found a significant modulation of the differential emotional response when attention was directed toward or away from fearful faces. These results suggest separate influences of emotion and attention on amygdala activation and may help reconcile previous discrepancies concerning the relative responsiveness of the human amygdala to emotional and attentional factors.

    Eye contact facilitates awareness of faces during interocular suppression

    Eye contact captures attention and receives prioritized visual processing. Here we asked whether eye contact might be processed outside conscious awareness. Faces with direct and averted gaze were rendered invisible using interocular suppression. In two experiments we found that faces with direct gaze overcame such suppression more rapidly than faces with averted gaze. Control experiments ruled out the influence of low-level stimulus differences and differential response criteria. These results indicate an enhanced unconscious representation of direct gaze, enabling the automatic and rapid detection of other individuals making eye contact with the observer.

    Atypical eye contact in autism: Models, mechanisms and development

    An atypical pattern of eye contact behaviour is one of the most significant symptoms of Autism Spectrum Disorder (ASD). Recent empirical advances have revealed the developmental, cognitive and neural basis of atypical eye contact behaviour in ASD. We review different models and advance a new ‘fast-track modulator model’. Specifically, we propose that atypical eye contact processing in ASD originates in the lack of influence from a subcortical face and eye contact detection route, which is hypothesized to modulate eye contact processing and guide its emergent specialization during development.

    Fear and the human amygdala

    We have previously reported that bilateral amygdala damage in humans compromises the recognition of fear in facial expressions while leaving intact recognition of face identity (Adolphs et al., 1994). The present study aims to examine questions motivated by this finding. We addressed the possibility that unilateral amygdala damage might be sufficient to impair recognition of emotional expressions. We also obtained further data on our subject with bilateral amygdala damage, in order to elucidate possible mechanisms that could account for the impaired recognition of expressions of fear. The results show that bilateral, but not unilateral, damage to the human amygdala impairs the processing of fearful facial expressions. This impairment appears to result from an insensitivity to the intensity of fear expressed by faces. We also confirmed a double dissociation between the recognition of facial expressions of fear, and the recognition of identity of a face: these two processes can be impaired independently, lending support to the idea that they are subserved in part by anatomically separate neural systems. Based on our data, and on what is known about the amygdala's connectivity, we propose that the amygdala is required to link visual representations of facial expressions, on the one hand, with representations that constitute the concept of fear, on the other. Preliminary data suggest the amygdala's role extends to both recognition and recall of fearful facial expressions.