
    Faces in Motion: Selectivity of Macaque and Human Face Processing Areas for Dynamic Stimuli

    Face recognition mechanisms need to extract information from both static and dynamic faces. It has been hypothesized that the analysis of dynamic facial attributes is performed by different face areas than the analysis of static facial attributes. To date, there is no evidence for such a division of labor in macaque monkeys. We used fMRI to determine specializations of macaque face areas for motion. Face areas in the fundus of the superior temporal sulcus responded to general object motion; face areas outside of the superior temporal sulcus fundus responded more to facial motion than to general object motion. Thus, the macaque face-processing system exhibits regional specialization for facial motion. Human face areas, processing the same stimuli, exhibited specializations for facial motion as well. Yet the spatial patterns of facial motion selectivity differed across species, suggesting that facial dynamics are analyzed differently in humans and macaques.

    Auditory connections and functions of prefrontal cortex

    The functional auditory system extends from the ears to the frontal lobes, with successively more complex functions occurring as one ascends the hierarchy of the nervous system. Several areas of the frontal lobe receive afferents from both early and late auditory processing regions within the temporal lobe. Afferents from the early part of the cortical auditory system, the auditory belt cortex, which are presumed to carry information about the auditory features of sounds, project to only a few prefrontal regions and are most dense in the ventrolateral prefrontal cortex (VLPFC). In contrast, projections from the parabelt and the rostral superior temporal gyrus (STG) most likely convey more complex information and target a larger, widespread region of the prefrontal cortex. Neuronal responses reflect these anatomical projections: some prefrontal neurons respond to features of acoustic stimuli, while others display task-related responses. For example, recording studies in non-human primates indicate that VLPFC is responsive to complex sounds, including vocalizations, and that VLPFC neurons in area 12/47 respond to sounds with similar acoustic morphology. In contrast, neuronal responses during auditory working memory involve a wider region of the prefrontal cortex. In humans, the frontal lobe is involved in auditory detection, discrimination, and working memory. Past research suggests that dorsal and ventral subregions of the prefrontal cortex process different types of information, with dorsal cortex processing spatial/visual information and ventral cortex processing non-spatial/auditory information. While this division is apparent in the non-human primate and in some neuroimaging studies, most research in humans indicates that specific task conditions, stimuli, or previous experience may bias the recruitment of specific prefrontal regions, suggesting a more flexible role for the frontal lobe during auditory cognition.

    Neuronal Activity and Connections of Face and Vocalization Processing Regions of the Primate Prefrontal Cortex

    Thesis (Ph.D.)--University of Rochester. School of Medicine & Dentistry. Dept. of Neuroscience, 2012.
    The prefrontal cortex receives a wealth of sensory signals and plays an essential role in orchestrating complex cognitive behaviors, including social communication. Social communication requires the integration of different environmental cues present in faces and voices. The goal of this thesis was to examine how the ventral frontal lobe encodes social communication information. Evidence suggests that the ventral frontal lobe is involved in encoding and remembering complex features of objects, including faces and voices. We obtain a variety of information from faces and voices, including a speaker's identity and emotional state, as well as their intentions. However, little is known about the extent to which the ventral frontal lobe encodes these features. In the first set of experiments, we examined the encoding of stimulus congruence in multisensory prefrontal neurons. Previous studies have shown that neurons in the ventrolateral prefrontal cortex (VLPFC) integrate facial and vocal stimuli. To understand the specific features required for proper recognition and integration, we examined the responses of VLPFC neurons to incongruent face-vocalization stimuli. VLPFC recordings revealed that some multisensory neurons are sensitive to stimulus congruence, demonstrated by either enhanced or suppressed neuronal response patterns. In another series of recordings, we assessed the role of VLPFC neurons in encoding and discriminating audiovisual expressions and speaker identity. Neuronal activity was recorded from the VLPFC of macaque monkeys while they performed a nonmatch-to-sample task using conspecific face-vocalization stimuli that differed by emotional expression or caller identity. The results indicate that many VLPFC neurons respond to both task-related and stimulus-related events, including changes in identity or expression between the audiovisual vocalizations. Finally, we characterized the anatomical connectivity of VLPFC regions that had been physiologically defined as auditory, visual, or multisensory. These anatomical studies revealed direct connections between the auditory-, visual-, and audiovisual-responsive areas of VLPFC and auditory association, visual extrastriate, and polymodal superior temporal cortical regions. Together, these findings will help elucidate the neuronal mechanisms and circuits of social communication, which will aid in the advancement of treatments for communication and affective disorders, including autism and schizophrenia.
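    To make the nonmatch-to-sample paradigm concrete, here is a minimal sketch of the trial logic, assuming a hypothetical stimulus set and correctness rule; the labels (caller_A, coo, etc.) and trial structure are illustrative stand-ins, not details taken from the thesis:

        import random

        # Hypothetical audiovisual stimulus set: each face-vocalization movie is
        # labeled by caller identity and emotional expression (names invented).
        STIMULI = [
            {"identity": "caller_A", "expression": "coo"},
            {"identity": "caller_A", "expression": "scream"},
            {"identity": "caller_B", "expression": "coo"},
            {"identity": "caller_B", "expression": "scream"},
        ]

        def is_nonmatch(sample, test):
            # The test stimulus is a nonmatch if caller identity or emotional
            # expression changed relative to the sample.
            return (sample["identity"] != test["identity"]
                    or sample["expression"] != test["expression"])

        def run_trial(rng):
            # A real trial would present the sample movie, impose a memory delay,
            # present the test movie, and record a lever response; here we only
            # draw the two stimuli and compute which response would be correct.
            sample = rng.choice(STIMULI)
            test = rng.choice(STIMULI)
            return sample, test, is_nonmatch(sample, test)

        rng = random.Random(0)
        for _ in range(3):
            sample, test, nonmatch = run_trial(rng)
            print(f"sample={sample} test={test} respond_nonmatch={nonmatch}")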

    Face and vocalization processing in the primate ventrolateral prefrontal cortex

    Thesis (Ph. D.)--University of Rochester. Dept. of Brain & Cognitive Sciences, 2012.
    Faces and voices are important stimuli for social communication and interaction. A close relationship between face and voice processing is well established, yet it remains unknown how faces and voices are combined at the level of single neurons in the brain. The ventrolateral prefrontal cortex (VLPFC) is one of the brain areas in which face-vocalization integration occurs. Because of its location and anatomical connections, this area has been noted as a possible homologue of the human language areas in the inferior frontal gyrus, but little is known about cellular activity in this area during vocal stimulus processing or face-vocalization integration. To investigate the details of audiovisual face-vocalization integration and the unique role of VLPFC, we trained monkey subjects on two audiovisual nonmatch-to-sample tasks using species-specific faces and vocalizations, and recorded neural activity in VLPFC. In Experiments I and II, we found that some neurons modulate their activity depending on the particular change in the audiovisual pair to be discriminated. In Experiments III and IV, we further examined the unique association between face and vocalization stimuli by testing the effect of irrelevant face and non-face stimuli on vocalization discrimination performance. Interestingly, the vocalization discrimination performance of some subjects improved when face stimuli were presented together with the vocalizations. We also investigated the time course of auditory and visual processing in VLPFC neurons. Although auditory processing is faster than visual processing, facial motion precedes voices/vocalizations during natural speech and vocalization production. This external difference in stimulus timing compensated for the difference in internal processing time and resulted in an earlier rise of visual information in VLPFC.