
    Preserved Haptic Shape Processing after Bilateral LOC Lesions.

    UNLABELLED: The visual and haptic perceptual systems are understood to share a common neural representation of object shape. A region thought to be critical for recognizing visual and haptic shape information is the lateral occipital complex (LOC). We investigated whether LOC is essential for haptic shape recognition in humans by studying behavioral responses and brain activation for haptically explored objects in a patient (M.C.) with bilateral lesions of the occipitotemporal cortex, including LOC. Despite severe deficits in recognizing objects using vision, M.C. was able to accurately recognize objects via touch. M.C.'s psychophysical response profile to haptically explored shapes was also indistinguishable from controls. Using fMRI, M.C. showed no object-selective visual or haptic responses in LOC, but her pattern of haptic activation in other brain regions was remarkably similar to healthy controls. Although LOC is routinely active during visual and haptic shape recognition tasks, it is not essential for haptic recognition of object shape. SIGNIFICANCE STATEMENT: The lateral occipital complex (LOC) is a brain region regarded as critical for recognizing object shape, both in vision and in touch. However, causal evidence linking LOC with haptic shape processing is lacking. We studied recognition performance, psychophysical sensitivity, and brain response to touched objects in a patient (M.C.) with extensive lesions involving LOC bilaterally. Despite being severely impaired in visual shape recognition, M.C. was able to identify objects via touch, and she showed normal sensitivity to a haptic shape illusion. M.C.'s brain response to touched objects in areas of undamaged cortex was also very similar to that observed in neurologically healthy controls. These results demonstrate that LOC is not necessary for recognizing objects via touch.

    Are Visual Texture-selective Areas Recruited During Haptic Texture Discrimination?

    Shape and texture provide cues to object identity, both when objects are explored using vision and via touch (haptics). Visual shape information is processed within the lateral occipital complex (LOC), while texture is processed in medial regions of the collateral sulcus (CoS). Evidence indicates that the LOC is consistently recruited during both visual and haptic shape processing. We used functional magnetic resonance imaging (fMRI) to examine whether 'visual' texture-selective areas are similarly recruited when observers discriminate texture via touch. We used a blocked design in which participants attended to either the texture or shape of a number of 3-dimensional (3D) objects, via vision or touch. We observed significant haptic texture-selective fMRI responses in medial occipitotemporal cortex within areas adjacent to, but not overlapping, those recruited during visual texture discrimination. Our data demonstrate that occipitotemporal areas typically associated with visual processing are recruited during the perception of surface texture via touch.

    Bodily awareness and novel multisensory features

    According to the decomposition thesis, perceptual experiences resolve without remainder into their different modality-specific components. Contrary to this view, I argue that certain cases of multisensory integration give rise to experiences representing features of a novel type. Through the coordinated use of bodily awareness—understood here as encompassing both proprioception and kinaesthesis—and the exteroceptive sensory modalities, one becomes perceptually responsive to spatial features whose instances couldn’t be represented by any of the contributing modalities functioning in isolation. I develop an argument for this conclusion focusing on two cases: 3D shape perception in haptic touch and experiencing an object’s egocentric location in crossmodally accessible, environmental space.

    Decoding visual object categories in early somatosensory cortex

    Neurons, even in the earliest sensory areas of cortex, are subject to a great deal of contextual influence from both within and across modality connections. In the present work, we investigated whether the earliest regions of somatosensory cortex (S1 and S2) would contain content-specific information about visual object categories. We reasoned that this might be possible due to the associations formed through experience that link different sensory aspects of a given object. Participants were presented with visual images of different object categories in 2 fMRI experiments. Multivariate pattern analysis revealed reliable decoding of familiar visual object category in bilateral S1 (i.e., postcentral gyri) and right S2. We further show that this decoding is observed for familiar but not unfamiliar visual objects in S1. In addition, whole-brain searchlight decoding analyses revealed several areas in the parietal lobe that could mediate the observed context effects between vision and somatosensation. These results demonstrate that even the first cortical stages of somatosensory processing carry information about the category of visually presented familiar objects.
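The multivariate pattern analysis mentioned above works by training a classifier on voxel activation patterns and testing it on held-out data; above-chance accuracy implies the region carries category information. The toy nearest-centroid classifier and synthetic "voxel" patterns below are illustrative assumptions, not the study's actual pipeline:

```python
# Hedged sketch of MVPA decoding logic: classify held-out voxel
# patterns by distance to each category's mean training pattern.
# Data values are made up for illustration only.

def centroid(patterns):
    """Mean voxel pattern across training trials."""
    n = len(patterns)
    return [sum(p[i] for p in patterns) / n for i in range(len(patterns[0]))]

def decode(train, test):
    """train: {category: [pattern, ...]}; test: [(pattern, category), ...].
    Returns classification accuracy on the held-out test set."""
    cents = {cat: centroid(ps) for cat, ps in train.items()}

    def dist(a, b):
        # Squared Euclidean distance between two voxel patterns.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    correct = sum(
        1 for p, cat in test
        if min(cents, key=lambda c: dist(cents[c], p)) == cat
    )
    return correct / len(test)

# Toy data: two object categories, three "voxels" per pattern.
train = {"tool": [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1]],
         "face": [[0.0, 1.0, 0.9], [0.1, 0.8, 1.0]]}
test = [([0.95, 0.1, 0.05], "tool"), ([0.05, 0.9, 0.95], "face")]
acc = decode(train, test)  # above-chance (here perfect) decoding
```

In real fMRI analyses the train/test split is done across scanning runs (leave-one-run-out cross-validation) and the classifier is typically a linear SVM, but the inference, accuracy above chance implies decodable category information, is the same.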

    Perception of surface stickiness in different sensory modalities: a functional MRI study

    Surface texture can be perceived not only from tactile, but also from auditory and visual sensory cues. In our previous psychophysical study, we demonstrated that humans can recognize surface stickiness using only one kind of sensory modality without any difficulty. However, the brain regions that would be activated by non-corresponding sensory cues, for example, auditory and visual cues, remain unknown. In this human functional MRI study, we explored brain regions associated with surface stickiness perception in each of three different sensory modalities, and searched for common neural activity across modalities. In the tactile condition, participants actually touched a sticky surface with their right index finger. In the auditory and visual conditions, audio and video clips of tactile explorations of a sticky surface were presented and participants were asked to recall the perceived stickiness as vividly as possible. Our results, based on a general linear model analysis, showed that somatosensory cortices including postcentral gyrus, anterior insula, and anterior intraparietal sulcus were significantly activated across all modalities. Moreover, we observed significant activation of primary sensory regions of each modality. A follow-up conjunction analysis identified that postcentral gyrus, anterior intraparietal sulcus, precentral gyrus, and supplementary motor area were activated in common. These findings could deepen our understanding of surface stickiness perception in the human brain.
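The conjunction analysis mentioned above identifies voxels active in all conditions; a common implementation is the minimum-statistic conjunction, where a voxel survives only if its smallest t-value across the modality maps exceeds threshold. A minimal sketch with made-up t-values (the threshold and data are illustrative assumptions, not the study's values):

```python
# Hedged sketch of a minimum-statistic conjunction across three
# modality-specific t-maps. Values are invented for illustration.

def conjunction(tmaps, threshold):
    """A voxel passes the conjunction only if its minimum t-value
    across all input maps exceeds the threshold."""
    return [min(ts) > threshold for ts in zip(*tmaps)]

# Four example voxels, one t-value per voxel per modality.
tactile  = [3.5, 1.2, 4.0, 2.9]
auditory = [3.1, 0.8, 3.8, 3.3]
visual   = [2.8, 1.5, 3.6, 0.9]

common = conjunction([tactile, auditory, visual], threshold=2.5)
# Voxels 1 and 3 are supra-threshold in every modality.
```

This "logical AND" formulation is conservative: a voxel strongly active in two modalities but sub-threshold in the third is excluded, which is exactly the behavior wanted when looking for modality-general responses.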

    Editorial: Perceiving and Acting in the real world: from neural activity to behavior

    The interaction between perception and action represents one of the pillars of human evolutionary success. Our interactions with the surrounding world involve a variety of behaviors, almost always including movements of the eyes and hands. Such actions rely on neural mechanisms that must process an enormous amount of information in order to generate appropriate motor commands. Yet, compared to the great advancements in the field of perception for cognition, the neural underpinnings of how we control our movements, as well as the interactions between perception and motor control, remain elusive. With this research topic we provide a framework for: 1) the perception of real objects and shapes using visual and haptic information, 2) the reference frames for action and perception, and 3) how perceived target properties are translated into goal-directed actions and object manipulation. The studies in this special issue employ a variety of methodologies that include behavioural kinematics, neuroimaging, transcranial magnetic stimulation and patient cases. Here we provide a brief summary and commentary on the articles included in this research topic.

    Change blindness: eradication of gestalt strategies

    Arrays of eight, texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task where there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al., 2003, Vision Research 43, 149–164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this we changed the spatial position of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference seen in performance between this and the standard task [F(1,4)=2.565, p=0.185]. This may suggest two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) it gives further weight to the argument that objects may be stored and retrieved from a pre-attentional store during this task.

    Young children do not integrate visual and haptic information

    Several studies have shown that adults integrate visual and haptic information (and information from other modalities) in a statistically optimal fashion, weighting each sense according to its reliability. To date no studies have investigated when this capacity for cross-modal integration develops. Here we show that prior to eight years of age, integration of visual and haptic spatial information is far from optimal, with either vision or touch dominating totally, even in conditions where the dominant sense is far less precise than the other (assessed by discrimination thresholds). For size discrimination, haptic information dominates in determining both perceived size and discrimination thresholds, while for orientation discrimination vision dominates. By eight to ten years of age, integration becomes statistically optimal, as in adults. We suggest that during development, perceptual systems require constant recalibration, for which cross-sensory comparison is important. Using one sense to calibrate the other precludes useful combination of the two sources.
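The "statistically optimal" integration referred to above is the standard maximum-likelihood cue-combination model: each cue is weighted by its reliability (inverse variance), and the combined estimate has lower variance than either cue alone. A minimal sketch with illustrative numbers (not data from the study):

```python
# Hedged sketch of maximum-likelihood visual-haptic cue integration.
# Estimates and variances below are invented for illustration.

def optimal_integration(est_v, var_v, est_h, var_h):
    """Combine visual and haptic estimates of the same property,
    weighting each cue by its reliability (inverse variance)."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)
    w_h = 1 - w_v
    combined = w_v * est_v + w_h * est_h
    # The combined variance is always below either single-cue variance.
    combined_var = (var_v * var_h) / (var_v + var_h)
    return combined, combined_var

# Example: vision twice as reliable as touch, so it gets 2/3 weight.
size, var = optimal_integration(est_v=10.0, var_v=1.0,
                                est_h=13.0, var_h=2.0)
# -> size = 11.0, var ≈ 0.667 (better than either cue alone)
```

The developmental result in the abstract corresponds to children behaving as if one weight were pinned at 1 (total dominance by one sense) rather than set by relative reliability, so their thresholds do not show the variance reduction this model predicts.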