
    Reference frames for reaching when decoupling eye and target position in depth and direction

    Spatial representations in cortical areas involved in reaching movements were traditionally studied in a frontoparallel plane, where the two-dimensional target location and the movement direction were the only variables to consider in neural computations. No studies so far have characterized the reference frames for reaching considering both depth and directional signals. Here we recorded from single neurons of the medial posterior parietal area V6A during a reaching task in which fixation point and reaching targets were decoupled in direction and depth. We found a prevalent mixed encoding of target position, with eye-centered and spatiotopic representations differently balanced within the same neuron. Depth was stronger in defining the reference frame of eye-centered cells, while direction was stronger in defining that of spatiotopic cells. The predominant presence of various typologies of mixed encoding suggests that depth and direction signals are processed on the basis of flexible coordinate systems to ensure an optimal motor response.

    Eye-movements intervening between two successive sounds disrupt comparisons of auditory location

    Many studies have investigated how saccades may affect the internal representation of visual locations across eye-movements. Here, we studied, instead, whether eye-movements can affect auditory spatial cognition. In two experiments, participants judged the relative azimuth (same/different) of two successive sounds presented from a horizontal array of loudspeakers, separated by a 2.5-s delay. Eye-position was either held constant throughout the trial (being directed in a fixed manner to the far left or right of the loudspeaker array) or had to be shifted to the opposite side of the array during the retention delay between the two sounds, after the first sound but before the second. Loudspeakers were either visible (Experiment 1) or occluded from sight (Experiment 2). In both cases, shifting eye-position during the silent delay-period affected performance in the successive auditory comparison task, even though the auditory inputs to be judged were equivalent. Sensitivity (d') for the auditory discrimination was disrupted, specifically when the second sound shifted in the opposite direction to the intervening eye-movement with respect to the first sound. These results indicate that eye-movements affect the internal representation of auditory location.