
    Brain Activity Evoked by the Perception of Human Walking: Controlling for Meaningful Coherent Motion

    Many functional neuroimaging studies of biological motion have used as stimuli point-light displays of walking figures and compared the resulting activations with those evoked by the same display elements moving in a random or noncoherent manner. Although these studies have established that biological motion activates the superior temporal sulcus (STS), the use of random motion controls has left open the possibility that coordinated and meaningful nonbiological motion might activate these same brain regions and thus call into question their specificity for processing biological motion. Here we used functional magnetic resonance imaging and an anatomical region-of-interest approach to test a hierarchy of three questions regarding activity within the STS. First, by comparing STS responses to animations of human and robot walking figures, we determined (1) that the STS is sensitive to biological motion itself, not merely to the superficial characteristics of the stimulus. Then we determined that the STS responds more strongly to biological motion (as conveyed by the walking robot) than to (2) a nonmeaningful but complex nonbiological motion (a disjointed mechanical figure) and (3) a complex and meaningful nonbiological motion (the movements of a grandfather clock). In subsequent whole-brain voxel-based analyses, we confirmed robust STS activity that was strongly right lateralized. In addition, we observed significant deactivations in the STS that differentiated biological and nonbiological motion. These voxel-based analyses also revealed motion-related positive activity in other brain regions, including MT/V5, the fusiform gyri, right premotor cortex, and the intraparietal sulci.
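
    The abstract above mentions an anatomical region-of-interest (ROI) approach for comparing STS responses across animation conditions. As a rough illustrative sketch only (not the authors' pipeline), the snippet below shows how mean BOLD responses within an anatomically defined STS mask might be compared across the four animation types; the file names, mask, condition onsets, and window length are all hypothetical assumptions.

        # Illustrative sketch of an anatomical ROI analysis (hypothetical files and onsets)
        import numpy as np
        import nibabel as nib

        # 4D functional data (x, y, z, time) and a binary anatomical STS mask -- assumed inputs
        func = nib.load("sub01_func.nii.gz").get_fdata()
        sts_mask = nib.load("sub01_sts_mask.nii.gz").get_fdata() > 0

        # Average the BOLD signal across all voxels inside the STS mask at each time point
        roi_timecourse = func[sts_mask].mean(axis=0)

        # Hypothetical stimulus onsets (in volumes) for the four animation conditions
        conditions = {
            "human_walker": [10, 50, 90],
            "robot_walker": [20, 60, 100],
            "mechanical_figure": [30, 70, 110],
            "grandfather_clock": [40, 80, 120],
        }

        # Compare the mean ROI response in a fixed post-stimulus window per condition
        window = 6  # volumes after onset
        for name, onsets in conditions.items():
            responses = [roi_timecourse[t:t + window].mean() for t in onsets]
            print(f"{name}: mean STS response = {np.mean(responses):.3f}")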

    Category-sensitive excitatory and inhibitory processes in human extrastriate cortex

    Single-cell recordings from the temporal lobe of monkeys viewing stimuli show that cells may be highly selective, responding for example to particular objects such as faces. However, stimulus-selective cells may be inhibited by nonpreferred stimuli. Can such inhibitory mechanisms be detected in human visual cortex? In previous recordings from the surface of human ventral extrastriate cortex, we found that specific categories of stimuli such as faces and words generate category-specific negative event-related potentials (ERPs) with a peak latency of about 200 ms (N200). Laminar recordings in animal cortex suggest that the human N200 reflects excitatory depolarizing potentials in apical dendrites of pyramidal cells. In this study we found that, at about half of word-specific N200 sites, faces generated a positive ERP (P200); conversely, at about half of face-specific sites, words generated P200s. The electrogenesis of N200 implies that P200 ERPs reflect hyperpolarizing inhibition of apical dendrites. These recordings, together with the prior animal recordings, provide strong circumstantial evidence that in human cortex populations of cells responsive to one stimulus category (such as faces) inhibit cells responsive to another category (such as words), probably by a type of lateral inhibition. Of the stimulus categories studied quantitatively, face-specific cells are maximally inhibited by words and vice versa, but other categories of stimuli may generate smaller P200s, suggesting that inhibition of category-specific cells by nonpreferred stimuli is a general feature of human extrastriate cortex involved in object recognition.

    ERPs evoked by viewing facial movements

    Human neuroimaging and event-related potential (ERP) studies suggest that ventral and lateral temporo-occipital cortex is sensitive to static faces and face parts. Recent fMRI data also show activation by facial movements. In this study we recorded from 22 posterior scalp locations in 20 normal right-handed males to assess ERPs evoked by viewing: moving eyes and mouths in the context of a face; moving and static eyes with and without facial context. N170 and P350 peak amplitude and latency data were analysed. N170 is an ERP previously shown to be preferentially responsive to face and eye stimuli, and P350 immediately follows N170. Major results were: N170 was significantly larger over the bilateral temporal scalp to viewing opening mouths relative to closing mouths, and to eye aversion relative to eyes gazing at the observer; at a focal region over the right inferior temporal scalp, N170 was significantly earlier to mouth opening relative to closing, and to eye aversion relative to eyes gazing at the observer; the focal ERP effect of eye aversion occurred independently of facial context; these differences cannot be attributed to movement per se, as they did not occur in a control condition in which checks moved in comparable areas of the visual field; isolated static eyes produced N170s that were not significantly different from N170s to static full faces over the right inferior temporal scalp, unlike in the left hemisphere where face N170s were significantly larger than eye N170s; unlike N170, P350 exhibited nonspecific changes as a function of stimulus movement. These results suggest that: bilateral temporal cortex forms part of a system sensitive to biological motion, of which facial movements form an important subset; there may be a specialised system for facial gesture analysis that provides input for neuronal circuitry dealing with social attention and the actions of others.
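
    The study above analyses N170 and P350 peak amplitude and latency. As a minimal illustrative sketch (not the authors' analysis code), the snippet below shows one common way to extract the amplitude and latency of a negative or positive peak within a fixed time window from an averaged ERP waveform; the sampling rate, the search windows, and the synthetic waveform are assumptions.

        # Illustrative peak-picking sketch for ERP components (synthetic data, assumed windows)
        import numpy as np

        sfreq = 500.0                                # assumed sampling rate (Hz)
        times = np.arange(-0.1, 0.6, 1.0 / sfreq)    # epoch from -100 ms to 600 ms

        # Synthetic averaged ERP (microvolts) standing in for a real temporal-scalp channel:
        # a negative deflection near 170 ms and a positive deflection near 350 ms
        erp = (-4.0 * np.exp(-((times - 0.17) ** 2) / (2 * 0.02 ** 2))
               + 2.5 * np.exp(-((times - 0.35) ** 2) / (2 * 0.05 ** 2)))

        def peak_in_window(erp, times, tmin, tmax, polarity):
            """Return (amplitude, latency) of the most extreme deflection in [tmin, tmax]."""
            sel = (times >= tmin) & (times <= tmax)
            segment, seg_times = erp[sel], times[sel]
            idx = segment.argmin() if polarity == "neg" else segment.argmax()
            return segment[idx], seg_times[idx]

        # N170: largest negative deflection between 130 and 220 ms (assumed window)
        n170_amp, n170_lat = peak_in_window(erp, times, 0.13, 0.22, "neg")
        # P350: largest positive deflection between 280 and 450 ms (assumed window)
        p350_amp, p350_lat = peak_in_window(erp, times, 0.28, 0.45, "pos")

        print(f"N170: {n170_amp:.2f} uV at {n170_lat * 1000:.0f} ms")
        print(f"P350: {p350_amp:.2f} uV at {p350_lat * 1000:.0f} ms")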

    Social perception from visual cues: role of the STS region

    Social perception refers to initial stages in the processing of information that culminates in the accurate analysis of the dispositions and intentions of other individuals. Single-cell recordings in monkeys, and neurophysiological and neuroimaging studies in humans, reveal that cerebral cortex in and near the superior temporal sulcus (STS) region is an important component of this perceptual system. In monkeys and humans, the STS region is activated by movements of the eyes, mouth, hands and body, suggesting that it is involved in analysis of biological motion. However, it is also activated by static images of the face and body, suggesting that it is sensitive to implied motion and more generally to stimuli that signal the actions of another individual. Subsequent analysis of socially relevant stimuli is carried out in the amygdala and orbitofrontal cortex, which supports a three-structure model proposed by Brothers. The homology of human and monkey areas involved in social perception, and the functional interrelationships between the STS region and the ventral face area, are unresolved issues.

    Electrophysiological studies of human face perception III: effects of top-down processing on face-specific potentials

    This is the last in a series of papers dealing with intracranial event-related potential (ERP) correlates of face perception. Here we describe the results of manipulations that may exert top-down influences on face recognition and face-specific ERPs, and the effects of cortical stimulation at face-specific sites. Ventral face-specific N200 was not evoked by affective stimuli; showed little or no habituation; was not affected by the familiarity or unfamiliarity of faces; showed no semantic priming; and was not affected by face-name learning or identification. P290 and N700 were affected by semantic priming and by face-name learning and identification. The early fraction of N700 and face-specific P350 exhibited significant habituation. About half of the AP350 sites exhibited semantic priming, whereas the VP350 and LP350 sites did not. Cortical stimulation evoked a transient inability to name familiar faces or evoked face-related hallucinations at two-thirds of face-specific N200 sites. These results are discussed in relation to human behavioral studies and monkey single-cell recordings. Discussion of results of all three papers concludes that: face-specific N200 reflects the operation of a module specialized for the perception of human faces; ventral and lateral occipitotemporal cortex are composed of a complex mosaic of functionally discrete patches of cortex of variable number, size and location; in ventral cortex there is a posterior-to-anterior trend in the location of patches in the order letter-strings, form, hands, objects, faces and face parts; P290 and N700 at face-specific N200 sites, and face-specific P350, are subject to top-down influences.

    Electrophysiological studies of human face perception II: response properties of face-specific potentials generated in occipitotemporal cortex

    In the previous paper the locations and basic response properties of N200 and other face-specific event-related potentials (ERPs) were described. In this paper responsiveness of N200 and related ERPs to the perceptual features of faces and other images was assessed. N200 amplitude did not vary substantially, whether evoked by colored or grayscale faces; normal, blurred or line-drawing faces; or by faces of different sizes. Human hands evoked small N200s at face-specific sites, but evoked hand-specific ERPs at other sites. Cat and dog faces evoked N200s that were 73% as large as to human faces. Hemifield stimulation demonstrated that the right hemisphere is better at processing information about upright faces and transferring it to the left hemisphere, whereas the left hemisphere is better at processing information about inverted faces and transferring it to the right hemisphere. N200 amplitude was largest to full faces and decreased progressively to eyes, face contours, lips and noses viewed in isolation. A region just lateral to face-specific N200 sites was more responsive to internal face parts than to faces, and some sites in ventral occipitotemporal cortex were face-part-specific. Faces with eyes averted or closed evoked larger N200s than those evoked by faces with eyes forward. N200 amplitude and latency were affected by the joint effects of eye and head position in the right but not in the left hemisphere. Full and three-quarter views of faces evoked larger N200s than did profile views. The results are discussed in relation to behavioral studies in humans and single-cell recordings in monkeys.

    Electrophysiological studies of human face perception I: potentials generated in occipitotemporal cortex by face and non-face stimuli

    This and the following two papers describe event-related potentials (ERPs) evoked by visual stimuli in 98 patients in whom electrodes were placed directly upon the cortical surface to monitor medically intractable seizures. Patients viewed pictures of faces, scrambled faces, letter-strings, number-strings, and animate and inanimate objects. This paper describes ERPs generated in striate and peristriate cortex, evoked by faces, and evoked by sinusoidal gratings, objects and letter-strings. Short-latency ERPs generated in striate and peristriate cortex were sensitive to elementary stimulus features such as luminance. Three types of face-specific ERPs were found: (i) a surface-negative potential with a peak latency of ~200 ms (N200) recorded from ventral occipitotemporal cortex, (ii) a lateral surface N200 recorded primarily from the middle temporal gyrus, and (iii) a late positive potential (P350) recorded from posterior ventral occipitotemporal, posterior lateral temporal and anterior ventral temporal cortex. Face-specific N200s were preceded by P150 and followed by P290 and N700 ERPs. N200 reflects initial face-specific processing, while P290, N700 and P350 reflect later face processing at or near N200 sites and in anterior ventral temporal cortex. Face-specific N200 amplitude was not significantly different in males and females, in the normal and abnormal hemisphere, or in the right and left hemisphere. However, cortical patches generating ventral face-specific N200s were larger in the right hemisphere. Other cortical patches in the same region of extrastriate cortex generated grating-sensitive N180s and object-specific or letter-string-specific N200s, suggesting that the human ventral object recognition system is segregated into functionally discrete regions.