7 research outputs found
Neural correlates for task-relevant facilitation of visual inputs during visually-guided hand movements
No abstract available.
Reaching and grasping actions and their context shape the perception of object size
Humans frequently estimate the size of objects in order to grasp them. When performing an action, perception is focused on the visual properties of the object that enable us to execute the action successfully. The motor system, however, can also influence perception, yet only a few studies have reported evidence for action-induced modifications of visual perception. Here, we looked for a feature-specific perceptual modulation before and after a reaching or a grasping action. Human participants were instructed either to reach for or to grasp two-dimensional bars of different sizes and to perform a size-perception task before and after the action in two contexts: one in which they knew the type of the subsequent movement and one in which they did not. We found significant modifications of perceived stimulus size that were more pronounced after grasping than after reaching. The mere knowledge of the subsequent action type significantly affected size perception before movement execution, with consistent results in both manual and verbal reports. These data provide direct evidence that, in natural conditions without manipulation of visual information, the action type and the action context dynamically modulate size perception, shaping it according to the information required to recognize and interact with objects.
Attenuation of visual reafferent signals from the hand in the parietal cortex during a reaching movement toward a target
Abstract: It is well established that the cortical processing of somatosensory and auditory signals is attenuated when they result from self-generated actions as compared to external events, which is thought to increase the salience of environmental information. This phenomenon is believed to result from an efference copy of motor commands used to predict the sensory consequences of an action through a forward model. The present work examined whether attenuation also takes place for visual reafferent signals from the moving limb during voluntary reaching movements. To address this issue, EEG activity was recorded while participants pointed with their hand toward a visual target. The interval between the actual hand position and its associated visual cursor was manipulated: visual feedback of the hand was provided either in real time or with a 150 ms delay, thus creating a mismatch between the predicted and actual visual consequences of the movement. Results revealed that the amplitude of the N1 component of the visual evoked potential (VEP) elicited by hand visual feedback over the parietal cortex was significantly smaller when feedback was presented in real time than when it was delayed. These data suggest that the cortical processing of visual reafferent signals from the moving limb is attenuated when they match the predictions, likely as a result of a forward model. Together, the behavioral and electrophysiological results also support other studies showing that sensorimotor processes are finer-grained than conscious perception. In light of the literature, the modulation of the N1 component recorded at parieto-occipital electrodes suggests that parietal regions are involved in integrating sensory feedback with motor predictions. In the discussion, we propose that visual feedback used for online limb control is modulated at the parietal level by cerebellar motor predictions, as is the case for tactile feedback.
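The ERP comparison described in this abstract can be sketched in a few lines. The data below are synthetic, and the sampling rate, N1 latency, and measurement window are illustrative assumptions rather than the study's actual parameters:

```python
import numpy as np

fs = 500                                  # sampling rate (Hz), assumed
t = np.arange(-0.1, 0.4, 1 / fs)          # epoch time axis around feedback onset (s)
rng = np.random.default_rng(1)

def simulate_epochs(n1_gain, n_trials=80):
    """Synthetic parietal epochs: an N1-like negativity near 170 ms plus noise."""
    n1 = -n1_gain * np.exp(-((t - 0.17) ** 2) / (2 * 0.02 ** 2))
    return n1 + rng.normal(scale=1.0, size=(n_trials, t.size))

epochs_realtime = simulate_epochs(n1_gain=2.0)  # attenuated N1 (predicted feedback)
epochs_delayed = simulate_epochs(n1_gain=4.0)   # larger N1 (150 ms delay, mismatch)

# Mean amplitude in an assumed N1 window (150-200 ms), averaged across trials.
win = (t >= 0.15) & (t <= 0.20)
n1_realtime = epochs_realtime.mean(axis=0)[win].mean()
n1_delayed = epochs_delayed.mean(axis=0)[win].mean()
print(f"N1 real-time: {n1_realtime:.2f}, delayed: {n1_delayed:.2f}")
```

The attenuation effect corresponds to the real-time N1 being less negative than the delayed one, which is what the window-averaged amplitudes capture.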
Distinguishing Vigilance Decrement and Low Task Demands from Mind-Wandering: A Machine Learning Analysis of EEG
Mind-wandering is a ubiquitous mental phenomenon that is defined as self-generated thought irrelevant to the ongoing task. Mind-wandering tends to occur when people are in a low-vigilance state or when they are performing a very easy task. In the current study, we investigated whether mind-wandering is completely dependent on vigilance and current task demands, or whether it is an independent phenomenon. To this end, we trained support vector machine (SVM) classifiers on EEG data recorded under conditions of low and high vigilance, as well as under conditions of low and high task demands, and subsequently tested those classifiers on participants' self-reported mind-wandering. Participants' momentary mental state was measured by means of intermittent thought probes in which they reported on their current mental state. The results showed that neither the vigilance classifier nor the task-demands classifier could predict mind-wandering above chance level, while a classifier trained on self-reports of mind-wandering was able to do so. This suggests that mind-wandering is a mental state distinct from low vigilance or from performing a task with low demands, both of which could be discriminated from the EEG above chance. Furthermore, we used dipole fitting to source-localize the neural correlates of the most important features in each of the three classifiers, finding a few distinct neural structures among the three phenomena. Our study demonstrates the value of machine-learning classifiers in unveiling patterns in neural data and, combined with an EEG source analysis technique, uncovering the associated neural structures.
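The cross-condition transfer logic described above (train a classifier on one contrast, test it on another label) can be sketched as follows. The feature matrices here are random placeholders standing in for EEG features, and scikit-learn's `SVC` is assumed; this is an illustrative sketch, not the authors' pipeline:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Placeholder EEG feature matrices (trials x features), e.g. band power per channel.
X_vigilance = rng.normal(size=(200, 32))
y_vigilance = rng.integers(0, 2, size=200)   # 0 = low, 1 = high vigilance

X_probes = rng.normal(size=(60, 32))         # segments preceding thought probes
y_mw = rng.integers(0, 2, size=60)           # 0 = on-task, 1 = mind-wandering

# Train on the vigilance contrast...
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X_vigilance, y_vigilance)

# ...then test whether it transfers to self-reported mind-wandering.
transfer_accuracy = (clf.predict(X_probes) == y_mw).mean()
print(f"transfer accuracy: {transfer_accuracy:.2f}")
```

If mind-wandering were just low vigilance, the transfer accuracy would exceed chance; at-chance transfer is what supports treating it as a distinct state.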
Eye-Hand Coordination Varies According to Changes in Cognitive-Motor Load and Eye Movements Used
In this dissertation, three studies were used to improve our understanding of eye-hand coordination in visuomotor reaching tasks with varying cognitive loads. Specifically, we considered potential performance differences based on eye movements, postural influences, and the fitness level of the young adult participants. A brief introduction in Chapter 1 is followed by a detailed literature review in Chapter 2. Results from the three studies presented in Chapters 3-5 further advance our knowledge of the integrated control used for goal-directed, visually guided reaches. In the first study (Chapter 3), the additional cost associated with the use of smooth pursuit slowed hand movement speed when the eyes and hand moved in distinct directions, yet improved accuracy over the use of saccadic eye movements and eye fixation. We concluded that the choice of eye movement can influence various types of visually guided reaching with different cognitive demands, and that researchers should provide clear eye-movement instructions to participants and/or monitor the eyes when assessing similar upper-limb control, in order to account for possible differences. In the second study (Chapter 4), results revealed slower speed and poorer accuracy of hand movements, along with less body sway, for visually guided reaching when the eyes and hand moved in opposite directions (eye-hand decoupling) compared to when they moved in the same direction (eye-hand coupling). In contrast, standing did not significantly influence reaching performance compared to sitting. We concluded that increased cognitive demands for eye-hand coordination created a greater need for postural control to support the goal-directed control of reaching.
In the third study (Chapter 5), we found no evidence of eye-hand coordination differences between highly fit and sedentary participants, yet cerebral activation at the centro-parietal location differed between tasks involving eye-hand coupling and decoupling. We concluded that declines in reaching performance accompanied the increased sensorimotor demands of eye-hand decoupling and may be linked to prior or current athletic experience rather than fitness level. Overall, alterations in visually guided, goal-directed reaching movements involving eye-hand coupling and decoupling depend on the eye movements used, not on low-threat postural changes or the fitness levels of the young adults performing the task.
A multimodal imaging perspective on human sensorimotor behavior
Understanding motor control is critical to motor rehabilitation after brain injury. Neural activity can be detected non-invasively using functional magnetic resonance imaging (fMRI), which measures the hemodynamic response, and magnetoencephalography (MEG), which measures electrophysiological dynamics. In this dissertation, two scientific questions were investigated with these two functional neuroimaging techniques. First, I used fMRI to search for neural correlates of spasticity in individuals with chronic stroke. Spasticity, defined as velocity-dependent resistance to passive stretch, is common after stroke and imposes significant therapeutic challenges. Disinhibition of brainstem nuclei, possibly the lateral vestibular nuclei or the pontine reticular formation, is believed to be primarily involved. I therefore aimed to localize the activity of these individual brainstem nuclei via 3T fMRI in a cohort of chronic stroke patients and healthy controls, using both acoustic and visual stimuli to activate the brainstem without inducing motion in the participants. The results showed that the response of stroke patients was most strongly correlated with age, time since stroke, and total brainstem volume. Another significant motor deficit stroke patients face is the loss of individualized finger control that enables fine motor skills such as the precision pinch. In the second part of my dissertation, I investigated neural correlates of dynamic precision-grip tasks, a predictor of sensorimotor impairment or decline. Visuomotor control of precision grip relies on an extensive cortical network, for which research has traditionally focused on frontal and parietal regions subserving executive and visuomotor-integration functions, respectively. However, the temporal dynamics of how visuomotor integration is expressed in the form of oscillatory modulation, as a combination of both low- and high-level functions, remain unclear.
Thus, I used MEG to measure dynamic oscillatory activity in sensorimotor and visual areas to investigate their contribution to performance of a dynamic precision-grip task in healthy individuals. A custom MEG-compatible sensor measured forefinger and thumb forces separately, and these forces controlled the position of a cursor on a screen. My findings suggest that cortical oscillations in both sensorimotor and visual areas can dissociate task and movement parameters during dynamic pinch tasks and that these areas may share a common network for visuomotor control. My ultimate goal is to better understand how the spatial and temporal profiles of neural activity connect behavior to pathological responses.
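The force-to-cursor mapping used in such precision-grip paradigms can be sketched with synthetic data. The gain, target trajectory, and noise level below are illustrative assumptions, not the actual task parameters:

```python
import numpy as np

def cursor_position(f_index, f_thumb, gain=10.0):
    """Hypothetical mapping from summed pinch force (N) to cursor position (px)."""
    return gain * (f_index + f_thumb)

# Dynamic task: track a slowly oscillating force target with the pinch.
t = np.linspace(0, 5, 500)
target_force = 4.0 + 1.0 * np.sin(2 * np.pi * 0.5 * t)        # target (N)
rng = np.random.default_rng(2)
produced_force = target_force + rng.normal(scale=0.2, size=t.size)

# Tracking error in cursor space, assuming force is split evenly across fingers.
tracking_error = np.abs(
    cursor_position(produced_force / 2, produced_force / 2)
    - cursor_position(target_force / 2, target_force / 2)
).mean()
print(f"mean tracking error: {tracking_error:.1f} px")
```

Measuring the two digits separately, as the custom sensor does, additionally allows force-sharing between forefinger and thumb to be analyzed rather than only their sum.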