Humans commonly use their hands to move and to interact with their environment, processing visual and proprioceptive information to determine the location of a goal object and the initial hand position. It remains elusive, however, how the human brain fully uses this sensory information to generate accurate movements. In monkeys, frontal and parietal areas appear to use and combine gaze and hand signals to generate movements, whereas in humans, prior work has assessed how the brain uses these two signals only separately. Here we investigated whether and how the human brain integrates gaze orientation and hand position during simple visually triggered finger tapping. We hypothesized that parietal, frontal, and subcortical regions involved in movement production would also exhibit modulation of movement-related activation as a function of gaze and hand positions. We used functional MRI to measure brain activation while healthy young adults performed a visually cued finger movement while fixating gaze at one of three locations and holding the arm in one of two configurations. We found several areas whose activation reflected a mixture of these hand and gaze positions; these included the sensory-motor cortex, supramarginal gyrus, superior parietal lobule, superior frontal gyrus, anterior cingulate, and left cerebellum. We also found regions within the left insula, left cuneus, left midcingulate gyrus, left putamen, and right temporo-occipital junction with activation driven only by gaze orientation. Finally, clusters with hand position effects were found in the cerebellum bilaterally. Our results indicate that these areas integrate at least two signals to perform visual-motor actions and that these signals could subserve sensory-motor transformations.