
Neuronal correlates of continuous manual tracking under varying visual movement feedback in a virtual reality environment

Abstract

To accurately guide one's actions online, the brain predicts sensory action feedback ahead of time based on internal models, which can be updated by sensory prediction errors. The underlying operations can be experimentally investigated in sensorimotor adaptation tasks, in which moving under perturbed sensory action feedback requires internal model updates. Here we altered healthy participants' visual hand movement feedback in a virtual reality setup, while assessing brain activity with functional magnetic resonance imaging (fMRI). Participants tracked a continually moving virtual target object with a photorealistic, three-dimensional (3D) virtual hand controlled online via a data glove. During the continuous tracking task, the virtual hand's movements (i.e., the visual movement feedback) were repeatedly and periodically delayed, which participants had to compensate for to maintain accurate tracking. This realistic task design allowed us to simultaneously investigate processes likely operating at several levels of the brain's motor control hierarchy. fMRI revealed that the length of the visual feedback delay was parametrically reflected by activity in the inferior parietal cortex and posterior temporal cortex. Unpredicted changes in visuomotor mapping (at transitions from synchronous to delayed visual feedback periods, or vice versa) activated biological motion-sensitive regions in the lateral occipitotemporal cortex (LOTC). Activity in the posterior parietal cortex (PPC), focused on the contralateral anterior intraparietal sulcus (aIPS), correlated with tracking error, whereby this correlation was stronger in participants with higher tracking performance. Our results are in line with recent proposals of a widespread cortical motor control hierarchy, where temporoparietal regions seem to evaluate visuomotor congruence and thus possibly ground a self-attribution of movements, the LOTC likely processes early visual prediction errors, and the aIPS computes action goal errors and possibly corresponding motor corrections.
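For readers unfamiliar with this kind of manipulation, the sketch below illustrates one way a visual feedback delay and a tracking-error measure could in principle be realized: recent hand poses are buffered and, during "delayed" periods, an older pose is rendered instead of the current one. This is not the authors' implementation; the update rate, delay length, and simulated 1-D trajectories are assumptions chosen purely for illustration.

```python
# Illustrative sketch only -- not the paper's actual code. Assumes a fixed
# update rate, a fixed delay, and simulated 1-D hand/target trajectories.
from collections import deque
import math

RATE_HZ = 60                 # assumed rendering/update rate
DELAY_S = 0.3                # assumed feedback delay during "delayed" periods
DELAY_FRAMES = int(DELAY_S * RATE_HZ)

def run_trial(n_frames=600):
    pose_buffer = deque(maxlen=DELAY_FRAMES + 1)  # most recent hand poses
    errors = []
    for t in range(n_frames):
        target = math.sin(2 * math.pi * 0.2 * t / RATE_HZ)       # moving target
        hand = math.sin(2 * math.pi * 0.2 * (t - 5) / RATE_HZ)   # hand tracks with a small lag
        pose_buffer.append(hand)

        # Delay the visual feedback during the middle third of the trial.
        delayed_period = (n_frames // 3) <= t < (2 * n_frames // 3)
        buffer_full = len(pose_buffer) > DELAY_FRAMES
        # Show the oldest buffered pose during delayed periods, the current pose otherwise.
        shown_hand = pose_buffer[0] if (delayed_period and buffer_full) else hand

        # Tracking error: distance between the *visible* virtual hand and the target.
        errors.append(abs(shown_hand - target))
    return sum(errors) / len(errors)

if __name__ == "__main__":
    print(f"mean tracking error: {run_trial():.3f}")
```

In a setup like the one described, such a per-frame error signal (and the delay length itself) could then be used as parametric regressors in the fMRI analysis; the sketch only shows the behavioral side of that idea.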
