7 research outputs found

    Species sensitivity of early face and eye processing

    The final publication is available at Elsevier via http://dx.doi.org/10.1016/j.neuroimage.2010.07.031. © 2011. This manuscript version is made available under the CC-BY-NC-ND 4.0 license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

    Humans are better at recognizing human faces than faces of other species. However, it is unclear whether this species sensitivity can be seen at early perceptual stages of face processing and whether it involves species sensitivity for important facial features like the eyes. These questions were addressed by comparing the modulations of the N170 ERP component to faces, eyes and eyeless faces of humans, apes, cats and dogs, presented upright and inverted. Although all faces and isolated eyes yielded larger responses than the control object category (houses), the N170 was shorter and smaller to human than animal faces and larger to human than animal eyes. Most importantly, while the classic inversion effect was found for human faces, animal faces yielded no inversion effect or an opposite inversion effect, as seen for objects, suggesting a different neural process for human faces compared to faces of other species. Thus, in addition to its general face and eye categorical sensitivity, the N170 appears particularly sensitive to the human species for both faces and eyes. The results are discussed in the context of a recent model of the N170 response involving face- and eye-sensitive neurons (Itier et al., 2007) in which the eyes play a central role in face perception. The data support the intuitive idea that eyes are what make animal head fronts look face-like and that proficiency for the human species involves visual expertise for the human eyes.

    Face learning via brief real-world social interactions includes changes in face-selective brain areas and hippocampus

    Making new acquaintances requires learning to recognise previously unfamiliar faces. In the current study, we investigated this process by staging real-world social interactions between actors and the participants. Participants completed a face-matching behavioural task in which they matched photographs of the actors (whom they had yet to meet), or faces similar to the actors (henceforth called foils). Participants were then scanned using functional magnetic resonance imaging (fMRI) while viewing photographs of actors and foils. Immediately after exiting the scanner, participants met the actors for the first time and interacted with them for 10 min. On subsequent days, participants completed a second behavioural experiment and then a second fMRI scan. Prior to each session, actors again interacted with the participants for 10 min. Behavioural results showed that social interactions improved performance accuracy when matching actor photographs, but not foil photographs. The fMRI analysis revealed a difference in the neural response to actor photographs and foil photographs across all regions of interest (ROIs) only after social interactions had occurred. Our results demonstrate that short social interactions were sufficient to learn and discriminate previously unfamiliar individuals. Moreover, these learning effects were present in brain areas involved in face processing and memory.

    Modulation of visual learning by transcranial direct current stimulation of the prefrontal cortex

    Repeated visual processing of an unfamiliar face suppresses neural activity in face-specific areas of the occipito-temporal cortex.
This "repetition suppression" (RS) is a primitive mechanism involved in learning of unfamiliar faces, which can be detected through amplitude reduction of the N170 event-related potential (ERP). The dorsolateral prefrontal cortex (DLPFC) exerts top-down influence on early visual processing. However, its contribution to N170 RS and learning of unfamiliar faces remains unclear. Transcranial direct current stimulation (tDCS) transiently increases or decreases cortical excitability as a function of polarity. We hypothesized that DLPFC excitability modulation by tDCS would cause polarity-dependent modulations of N170 RS during encoding of unfamiliar faces. tDCS-induced N170 RS enhancement would improve long-term recognition reaction time (RT) and/or accuracy rates, whereas N170 RS impairment would compromise recognition ability. Participants underwent three tDCS conditions in random order at ~72-hour intervals: right anodal/left cathodal, right cathodal/left anodal, and sham. Immediately following tDCS conditions, an EEG was recorded during encoding of unfamiliar faces for assessment of P100 and N170 visual ERPs. P300 was analyzed to detect prefrontal function modulation. Recognition tasks were administered ~72 hours following encoding. Results indicate the right anodal/left cathodal condition facilitated N170 RS and induced larger P300 amplitudes, leading to faster recognition RT. Conversely, the right cathodal/left anodal condition caused increases in N170 amplitudes and RT, but did not affect P300. These data are the first to demonstrate that DLPFC excitability modulation can influence early visual encoding of unfamiliar faces, highlighting the importance of the DLPFC in basic learning mechanisms.

    Effects of Conceptual Categorization on Early Visual Processing

    Ph.D. (Doctor of Philosophy)

    The role played by language in the interpretation of emotional facial expressions

    This thesis examines the role played by language in the interpretation of emotional expression. Language labels may indirectly influence such tasks, organising and reactivating a useful repository of semantic knowledge (e.g. Barrett, Lindquist & Gendron, 2007). This proposal was explored using a series of semantic satiation experiments (Lindquist, Barrett, Bliss-Moreau & Russell, 2006). Participants repeated words 3 or 30 times before deciding whether two faces matched in emotional expression. Word type was manipulated across experiments (emotion labels, neutral labels and non-words); an indirect account would predict reduced accuracy only when participants experience semantic inaccessibility, achieved via massed repetition of an emotional label. However, reduced discrimination was found both after 30 (vs. 3) repetitions of any word and after two non-linguistic activities. These findings suggest that the massed repetition decrement arises via a non-semantic mechanism, such as response uncertainty (e.g. Tian & Huber, 2010). However, an emotion-specific effect of language was also consistently observed. Participants showed facilitated performance after 3 and 30 repetitions of an emotion label, but only when it matched both expressions in the pair. This may suggest that language labels directly influence early emotion perception (Lupyan, 2007, 2012), or provide strategic support during paired discrimination (e.g. Roberson & Davidoff, 2000). A perceptual threshold procedure was used to test the direct assumption. Participants repeated an emotion or neutral label before deciding whether a briefly presented face did, or did not, display an emotional expression. In comparison to the neutral baseline, participants showed no facilitation in performance following exposure to emotion labels that were 'weakly' or 'strongly' congruent with the subsequently presented expression.
    Overall, the findings provide only inconsistent support for the notion that language shapes the interpretation of emotional expression. This prompts discussion of how task demands may influence language-driven recruitment of conceptual knowledge, and the time-course across which these linked elements influence interpretation.