
    Intentional Maps in Posterior Parietal Cortex

    The posterior parietal cortex (PPC), historically believed to be a sensory structure, is now viewed as an area important for sensory-motor integration. Among its functions is the forming of intentions, that is, high-level cognitive plans for movement. There is a map of intentions within the PPC, with different subregions dedicated to the planning of eye movements, reaching movements, and grasping movements. These areas appear to be specialized for the multisensory integration and coordinate transformations required to convert sensory input to motor output. In several subregions of the PPC, these operations are facilitated by the use of a common distributed space representation that is independent of both sensory input and motor output. Attention and learning effects are also evident in the PPC. However, these effects may be general to cortex and operate in the PPC in the context of sensory-motor transformations.

    The Proprioceptive Map of the Arm Is Systematic and Stable, but Idiosyncratic

    Visual and somatosensory signals participate together in providing an estimate of the hand's spatial location. While the ability of subjects to identify the spatial location of their hand based on visual and proprioceptive signals has previously been characterized, relatively few studies have examined in detail the spatial structure of the proprioceptive map of the arm. Here, we reconstructed and analyzed the spatial structure of the estimation errors that resulted when subjects reported the location of their unseen hand across a 2D horizontal workspace. Hand position estimation was mapped under four conditions: with and without tactile feedback, and with the right and left hands. In the task, we moved each subject's hand to one of 100 targets in the workspace while their eyes were closed. Then, we either a) applied tactile stimulation to the fingertip by allowing the index finger to touch the target or b) as a control, hovered the fingertip 2 cm above the target. After returning the hand to a neutral position, subjects opened their eyes to verbally report where their fingertip had been. We measured and analyzed both the direction and magnitude of the resulting estimation errors. Tactile feedback reduced the magnitude of these estimation errors, but did not change their overall structure. In addition, the spatial structure of these errors was idiosyncratic: each subject had a unique pattern of errors that was stable between hands and over time. Finally, we found that at the population level the magnitude of the estimation errors had a characteristic distribution over the workspace: errors were smallest closer to the body. The stability of estimation errors across conditions and time suggests the brain constructs a proprioceptive map that is reliable, even if it is not necessarily accurate. The idiosyncrasy across subjects emphasizes that each individual constructs a map that is unique to their own experiences.
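    The abstract's "direction and magnitude" analysis of localization errors amounts to a simple vector computation on each trial. A minimal sketch (function and variable names are illustrative, not from the paper):

    ```python
    import numpy as np

    def estimation_error(true_xy, reported_xy):
        """Error vector, magnitude, and direction (radians, CCW from +x)
        of a single hand-localization report."""
        err = np.asarray(reported_xy, dtype=float) - np.asarray(true_xy, dtype=float)
        magnitude = float(np.hypot(err[0], err[1]))
        direction = float(np.arctan2(err[1], err[0]))
        return err, magnitude, direction

    # Hypothetical trial: target at (30, 40) cm, subject reports (33, 44) cm.
    err, mag, ang = estimation_error([30.0, 40.0], [33.0, 44.0])
    # err == [3.0, 4.0]; mag == 5.0
    ```

    Aggregating these per-trial vectors over the 100 targets would yield the kind of spatial error map the study analyzes.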

    Integration of target and hand position signals in the posterior parietal cortex: effects of workspace and hand vision

    Previous findings suggest the posterior parietal cortex (PPC) contributes to arm movement planning by transforming target and limb position signals into a desired reach vector. However, the neural mechanisms underlying this transformation remain unclear. In the present study we examined the responses of 109 PPC neurons as movements were planned and executed to visual targets presented over a large portion of the reaching workspace. In contrast to previous studies, movements were made without concurrent visual and somatic cues about the starting position of the hand. For comparison, a subset of neurons was also examined with concurrent visual and somatic hand position cues. We found that single cells integrated target and limb position information in a very consistent manner across the reaching workspace. Approximately two-thirds of the neurons with significantly tuned activity (42/61 and 30/46 for left and right workspaces, respectively) coded targets and initial hand positions separably, indicating no hand-centered encoding, whereas the remaining one-third coded targets and hand positions inseparably, in a manner more consistent with the influence of hand-centered coordinates. The responses of both types of neurons were largely invariant with respect to the presence or absence of visual hand position cues, suggesting their corresponding coordinate frames and gain effects were unaffected by cue integration. The results suggest that the PPC uses a consistent scheme for computing reach vectors in different parts of the workspace that is robust to changes in the availability of somatic and visual cues about hand position.
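    The separable/inseparable distinction above can be illustrated with a standard SVD-based index: if a neuron's firing-rate matrix over target × hand-position conditions is well approximated by a rank-1 outer product, target and hand tuning combine separably. This is a generic sketch of that idea, not the paper's exact analysis:

    ```python
    import numpy as np

    def separability_index(response_matrix):
        """Fraction of variance captured by the rank-1 approximation of a
        target x hand-position firing-rate matrix. Values near 1 suggest
        separable coding; lower values suggest inseparable (e.g. more
        hand-centered) coding."""
        s = np.linalg.svd(np.asarray(response_matrix, dtype=float),
                          compute_uv=False)
        return float(s[0] ** 2 / np.sum(s ** 2))

    # A perfectly separable response is an outer product of a target tuning
    # curve and a hand-position gain, and scores 1.0.
    target_tuning = np.array([1.0, 2.0, 3.0])
    hand_gain = np.array([0.5, 1.0])
    sep = separability_index(np.outer(target_tuning, hand_gain))
    ```

    Adding interaction structure to the matrix (e.g. a hand-centered component) lowers the index below 1.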

    Direct visuomotor transformations for reaching

    The posterior parietal cortex (PPC) is thought to have a function in the sensorimotor transformations that underlie visually guided reaching, as damage to the PPC can result in difficulty reaching to visual targets in the absence of specific visual or motor deficits. This function is supported by findings that PPC neurons in monkeys are modulated by the direction of hand movement, as well as by visual, eye position and limb position signals. The PPC could transform visual target locations from retinal coordinates to hand-centred coordinates by combining sensory signals in a serial manner to yield a body-centred representation of target location, and then subtracting the body-centred location of the hand. We report here that in dorsal area 5 of the PPC, remembered target locations are coded with respect to both the eye and hand. This suggests that the PPC transforms target locations directly between these two reference frames. Data obtained in the adjacent parietal reach region (PRR) indicate that this transformation may be achieved by vectorially subtracting hand location from target location, with both locations represented in eye-centred coordinates.
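    The proposed computation in the final sentence is a direct vector subtraction. A minimal sketch, with illustrative coordinates in degrees of visual angle (names and values are assumptions, not from the paper):

    ```python
    import numpy as np

    def reach_vector_eye_centred(target_eye, hand_eye):
        """Reach vector as target location minus hand location, both
        expressed in eye-centred coordinates."""
        return np.asarray(target_eye, dtype=float) - np.asarray(hand_eye, dtype=float)

    # Target 10 deg right of fixation; hand 4 deg right and 2 deg below.
    v = reach_vector_eye_centred([10.0, 0.0], [4.0, -2.0])
    # v == [6.0, 2.0]: move the hand 6 deg rightward and 2 deg upward.
    ```

    The point of the eye-centred scheme is that no intermediate body-centred representation is needed: the subtraction is performed within a single reference frame.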

    Reach Plans in Eye-Centered Coordinates

    The neural events associated with visually guided reaching begin with an image on the retina and end with impulses to the muscles. In between, a reaching plan is formed. This plan could be in the coordinates of the arm, specifying the direction and amplitude of the movement, or it could be in the coordinates of the eye because visual information is initially gathered in this reference frame. In a reach-planning area of the posterior parietal cortex, neural activity was found to be more consistent with an eye-centered than an arm-centered coding of reach targets. Coding of arm movements in an eye-centered reference frame is advantageous because obstacles that affect planning as well as errors in reaching are registered in this reference frame. Also, eye movements are planned in eye coordinates, and the use of similar coordinates for reaching may facilitate hand-eye coordination.

    Multisensory Interactions Influence Neuronal Spike Train Dynamics in the Posterior Parietal Cortex

    Although significant progress has been made in understanding multisensory interactions at the behavioral level, their underlying neural mechanisms remain relatively poorly understood in cortical areas, particularly during the control of action. In recent experiments where animals reached to and actively maintained their arm position at multiple spatial locations while receiving either proprioceptive or visual-proprioceptive position feedback, multisensory interactions were shown to be associated with reduced spiking (i.e., subadditivity) as well as reduced intra-trial and across-trial spiking variability in the superior parietal lobule (SPL). To further explore the nature of such interaction-induced changes in spiking variability, we quantified the spike train dynamics of 231 of these neurons. Neurons were classified as Poisson, bursty, refractory, or oscillatory (in the 13-30 Hz "beta-band") based on their spike train power spectra and autocorrelograms. No neurons were classified as Poisson-like in either the proprioceptive or visual-proprioceptive conditions. Instead, oscillatory spiking was most commonly observed, with many neurons exhibiting these oscillations under only one set of feedback conditions. The results suggest that the SPL may belong to a putative beta-synchronized network for arm position maintenance and that position estimation may be subserved by different subsets of neurons within this network depending on available sensory information. In addition, the nature of the observed spiking variability suggests that models of multisensory interactions in the SPL should account for both Poisson-like and non-Poisson variability.
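    The classification described above rests on spectral analysis of spiking activity. As a crude illustration of the idea (not the paper's exact method), one can compute the fraction of spectral power a signal carries in the 13-30 Hz beta band; all names and parameters here are assumptions for the sketch:

    ```python
    import numpy as np

    def beta_band_power_fraction(signal, dt=0.001):
        """Fraction of non-DC spectral power in the 13-30 Hz beta band
        for a uniformly sampled signal with sample interval dt (s)."""
        x = np.asarray(signal, dtype=float)
        x = x - x.mean()                          # remove the DC component
        power = np.abs(np.fft.rfft(x)) ** 2
        freqs = np.fft.rfftfreq(x.size, d=dt)
        band = (freqs >= 13.0) & (freqs <= 30.0)
        return float(power[band].sum() / power[1:].sum())

    # A firing-rate signal oscillating at 20 Hz concentrates essentially
    # all of its power inside the beta band.
    t = np.arange(0.0, 1.0, 0.001)                # 1 s sampled at 1 kHz
    rate = 20.0 + 10.0 * np.sin(2 * np.pi * 20.0 * t)
    frac = beta_band_power_fraction(rate)
    ```

    For actual spike trains, the study used spike train power spectra together with autocorrelograms, which handle point-process variability more carefully than this rate-based sketch.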