
    Sensorimotor experience in virtual environments

    The goal of rehabilitation is to reduce impairment and provide functional improvements resulting in quality participation in activities of life. Plasticity and motor-learning principles provide inspiration for therapeutic interventions, including movement repetition in a virtual reality environment. The objective of this research was to investigate function-specific measurements (kinematic, behavioral) and neural correlates of the motor experience of hand-gesture activities in virtual environments stimulating sensory experience (VE), using a hand agent model. The fMRI-compatible Virtual Environment Sign Language Instruction (VESLI) system was designed and developed to provide a number of rehabilitation and measurement features, to identify optimal learning conditions for individuals, and to track changes in performance over time. Therapies and measurements incorporated into VESLI target and track specific impairments underlying dysfunction. The goal of improved measurement is to develop targeted interventions embedded in higher-level tasks and to accurately track specific gains, in order to understand responses to treatment and the impact a response may have upon higher-level function such as participation in life. To further clarify the biological model of motor experience, and to understand the added value and role of virtual sensory stimulation and feedback, which includes seeing one's own hand movement, functional brain mapping was conducted with simultaneous kinematic analysis in healthy controls and in stroke subjects. It is believed that understanding these neural activations will make possible rehabilitation strategies that exploit the principles of plasticity and motor learning. The present research assessed practice conditions that successfully promote gesture-learning behavior in the individual.
For the first time, functional imaging experiments mapped the neural correlates of human interactions with complex virtual reality hand avatars moving synchronously with the subject's own hands. Findings indicate that healthy control subjects learned intransitive gestures in virtual environments using first- and third-person avatars, picture and text definitions, and while viewing visual feedback of their own hands, virtual hand avatars, and, in the control condition, hidden hands. Moreover, exercise in a virtual environment with a first-person hand avatar recruited insular cortex activation over time, which may indicate that this activation is associated with a sense of agency. Sensory augmentation in virtual environments modulated activations of important brain regions associated with action observation and action execution. The quality of the visual feedback was modulated, and brain areas were identified in which the amount of activation was positively or negatively correlated with the visual feedback. When subjects moved the right hand and saw an unexpected response, the left virtual avatar hand moving, neural activation increased in the motor cortex ipsilateral to the moving hand. This visual modulation might provide a helpful rehabilitation therapy for people with paralysis of a limb through visual augmentation of skills. A model was developed to study the effects of sensorimotor experience in virtual environments, yielding findings on how such experience affects brain activity and related behavioral measures. The research model represents a significant contribution to neuroscience research and translational engineering practice. A model of neural activations correlated with kinematics and behavior can profoundly influence the delivery of rehabilitative services in the coming years by giving clinicians a framework for engaging patients in a sensorimotor environment that can optimally facilitate neural reorganization.

    VALIDATION OF A MODEL OF SENSORIMOTOR INTEGRATION WITH CLINICAL BENEFITS

    Healthy sensorimotor integration (how our touch influences our movements) is critical for efficient interaction with our environment. Yet many aspects of this process are still poorly understood. Importantly, several movement disorders are often considered to originate from purely motor impairments, while a sensory origin could also lead to a similar set of symptoms. To address these issues, we propose a novel biologically based model of the sensorimotor loop, known as the SMILE model. After describing both the functional and the corresponding neuroanatomical versions of SMILE, we tested several aspects of its motor component through functional magnetic resonance imaging (fMRI) and transcranial magnetic stimulation (TMS). Both experimental studies produced outcomes consistent with the SMILE predictions, but they also provided novel scientific findings on topics as broad as the sub-phases of motor imagery, the neural processing of bodily representations, and the extent of the role of the extrastriate body area. In the final sections of this manuscript, we describe some potential clinical applications of SMILE. The first presents the identification of plausible neuroanatomical origins for focal hand dystonia, a still poorly understood sensorimotor disorder. The last chapter then covers possible improvements to brain-machine interfaces, driven by a better understanding of the sensorimotor system.

    The vestibular body: vestibular contributions to bodily representations

    Vestibular signals are integrated with signals from other sensory modalities. This convergence could reflect an important mechanism for maintaining the perception of the body. Here we review the current literature in order to develop a framework for understanding how the vestibular system contributes to body representation. Following recent models, we distinguish three processes for body representation and ask whether vestibular signals might influence each: (i) somatosensation, the primary sensory processing of somatic stimuli; (ii) somatoperception, the processes of constructing percepts and experiences of somatic objects and events; and (iii) somatorepresentation, the knowledge about the body as a physical object in the world. Vestibular signals appear to contribute to all three levels of this model of body processing. Thus, the traditional view of the vestibular system as a low-level, dedicated orienting module tends to underestimate the pervasive role of vestibular input in bodily self-awareness.

    Cognitive social and affective neuroscience of patients with spinal cord injury

    A successful human-environment interaction requires a continuous integration of information concerning body parts, object features, and affective dynamics. Multiple neuropsychological studies show that tools can be integrated into the representation of one's own body. In particular, a tool that participates in the conscious movement of the person is added to the dynamic representation of the body, often called the "body schema", and may even affect social interaction. In light of this, the wheelchair can be treated as an extension of the disabled body, essentially replacing limbs that do not function properly, but it can also be a symbol of frailty and weakness. I studied plastic changes of action, tool, and body representation in individuals with spinal cord injury (SCI). Because of their peripheral loss of sensorimotor functions, in the absence of brain lesions and with spared higher-order cognitive functions, these patients represent an excellent model for studying this topic in a multi-faceted way, investigating both fundamental mechanisms and possible therapeutic interventions. In a series of experiments, I developed new behavioral methods to measure the phenomenological aspects of tool embodiment (Chapter 3), to study its functional and neural correlates (Chapter 4), and to assess a possible computational model underpinning these phenomena (Chapter 5). These tasks have been used to describe changes in tool, action, and body representation following the injury (Chapters 3 and 4), as well as changes in social interactions (Chapter 7), with the aim of giving a complete portrait of change following such damage. I found that changes in the function (wheelchair use) and the structure (body-brain disconnection) of the physical body plastically modulate tool, action, and body representation. Social context and social interaction are also shaped by the new configuration of bodily representations.
Such a high degree of plasticity suggests that our sense of body is not given once and for all, but is constantly constructed and adapted through experience.

    Multiscale body maps in the human brain

    A large number of brain regions are dedicated to processing information from the body in order to enable interactions with the environment. During my thesis, I studied the functional organization of brain networks involved in processing bodily information. From the processing of unimodal low-level features to the unique experience of being a unified entity residing in a physical body, the brain processes and integrates bodily information at many different stages. Using ultra-high-field functional magnetic resonance imaging (fMRI), I conducted four studies to map and characterize multiscale body representations in the human brain. The goals of my thesis were, first, to extend current knowledge about primary sensorimotor representations and, second, to develop novel approaches for investigating more complex and integrated forms of body representation. In studies I and II, I investigated how natural touch is represented in the first three cortical areas processing tactile information. I applied a mapping procedure to identify, in each of these three areas, the somatosensory representations of 24 different body parts on the hands, feet, and legs at the level of single subjects. Using fMRI and resting-state data, I combined classical statistical analyses with modern methods of network analysis to describe the functional properties of the resulting network. In study III, I applied these methods to investigate primary somatosensory and motor representations in a rare population of patients. Following limb loss, the targeted muscle and sensory reinnervation (TMSR) procedure enables intuitive control of a myoelectric prosthesis and creates an artificial map of referred touch on the reinnervated skin. I mapped the primary somatosensory and motor representations of phantom sensations and phantom movements in TMSR patients.
I investigated whether sensorimotor training enabled via TMSR was associated with preserved somatosensory and motor representations compared to healthy controls and to amputee patients without TMSR. Finally, in study IV, I studied brain regions involved in subjective body experience. Following specific manipulations of sensorimotor information, it is possible to make participants experience a fake or virtual hand as their own and to give them the sensation of being in control of that hand. Using MR-compatible robotics and virtual reality, I investigated the brain regions associated with alterations of the sense of hand ownership and the sense of hand agency. The present work provides important findings and promising tools for understanding the brain networks that process bodily information. In particular, understanding the functional interactions between primary unimodal cortices and the networks contributing to subjective body experience is a necessity for promoting modern approaches in the fields of neuroprosthetics and human-machine interaction.

    Bodily resonance: Exploring the effects of virtual embodiment on pain modulation and the fostering of empathy toward pain sufferers

    Globally, around 20% of people suffer from chronic pain, an illness that cannot be cured and has been linked to numerous physical and mental conditions. According to the biopsychosocial model of pain, chronic pain presents patients with biological, psychological, and social challenges. Immersive virtual reality (VR) has shown great promise in helping people manage acute and chronic pain and in fostering empathy toward vulnerable populations. Therefore, the first research trajectory of this dissertation targets chronic pain patients' biological and psychological suffering to provide VR analgesia, and the second targets healthy people to build empathy and reduce patients' social stigma. Researchers have taken the attention-distraction approach to study how acute pain patients can manage their condition in VR, while the virtual embodiment approach has mostly been studied with healthy people exposed to pain stimuli. My first research trajectory aimed to understand how embodied characteristics affect users' sense of embodiment and pain. Three studies were carried out: with healthy people under heat pain, with complex regional pain syndrome patients, and with phantom limb pain patients. My findings indicate that in all three studies, when users see a healthy or intact virtual body or body parts, they experience significant reductions in their self-reported pain ratings. Additionally, I found that the appearance of a virtual body has a significant impact on pain, whereas the virtual body's motions do not. Despite the prevalence of chronic pain, public awareness of it is remarkably low, and pain patients commonly experience social stigma. Thus, taking the embodied perspective of chronic pain patients is critical for understanding their social stigma. Although there is growing interest in using embodied VR to foster empathy in the context of gender or racial bias, few studies have focused on people with chronic pain.
My second trajectory explored how researchers can foster empathy toward pain patients in embodied VR. To conclude, this dissertation uncovers the role of VR embodiment and dissects embodied characteristics in pain modulation and empathy generation. Finally, I summarize a novel conceptual design framework for embodied VR applications, with design recommendations and future research directions.

    Perceptual abnormalities in amputees: phantom pain, mirror-touch synaesthesia and referred tactile sensations

    It is often reported that after amputation people experience "a constant or inconstant... sensory ghost... faintly felt at times, but ready to be called up to [their] perception" (Mitchell, 1866). Perceptual abnormalities have been highlighted in amputees, such as sensations in the phantom when being stroked elsewhere (Ramachandran et al., 1992) or when observing someone in pain (Giummarra and Bradshaw, 2008). This thesis explored the perceptual changes that occur following amputation, focusing on pain, vision, and touch. A sample of over 100 amputees was recruited through the National Health Service. Despite finding no difference in phantom pain based on physical amputation details or non-painful perceptual phenomena, results from Paper 1 indicated that phantom pain may be more intense, with sensations occurring more frequently, in amputees whose pain was trigger-induced. The survey in Paper 2 identified a group of amputees who, in losing a limb, acquired mirror-touch synaesthesia. The higher levels of empathy found in mirror-touch amputees might mean that some people are predisposed to develop synaesthesia, but that it takes sensory loss to bring dormant cross-sensory interactions into consciousness. Although the mirror system may reach supra-threshold levels in some amputees, the experiments in Paper 3 suggested a relatively intact mirror system in amputees overall. Specifically, in a task of apparent biological motion, amputees showed a similar, although weaker, pattern of results to normal-bodied participants. The results of Paper 4 showed that tactile spatial acuity on the face was largely unaffected by amputation, as no difference was found between the sides ipsilateral and contralateral to the stump. In Paper 5, cross-modal cuing was used to investigate whether referred tactile sensations could prime a visually presented target in the space occupied by the phantom limb.
We conclude that perception is only moderately affected in most amputees, but that in some the sensory loss allows normally sub-threshold processing to be enhanced into conscious awareness.

    The Neural Correlates of Bodily Self-Consciousness in Virtual Worlds

    Bodily self-consciousness (BSC) is the cumulative integration of the multiple sensory modalities that contribute to our sense of self. These modalities, which include proprioception, vestibular sensation, vision, and touch, are updated dynamically to map a specific, local representation of ourselves in space. BSC is closely associated with both bottom-up and top-down aspects of consciousness. Recently, virtual- and augmented-reality technology has been used to explore perceptions of BSC. These achievements are attributable partly to advances in modern technology and partly to the rise of the virtual and augmented reality markets. Virtual reality head-mounted displays can alter aspects of perception and consciousness as never before, and consequently many strides have been made in BSC research. Previous research suggests that BSC results from the perceptions of embodiment (i.e., the feeling of ownership of a real or virtual extremity) and presence (i.e., the feeling of being physically located in a real or virtual space). Though physiological mechanisms serving embodiment and presence in the real world have been proposed, how these perceptual experiences interact, and whether they can be dissociated, is still poorly understood. Even less is known about the physiological mechanisms underlying the perception of presence and embodiment in virtual environments. Therefore, five experiments were conducted to examine the perceptions of embodiment and presence in virtual environments and to determine which physiological mechanisms support them. These studies compared performance between normal and altered embodiment/presence conditions. Results from a novel experimental paradigm using virtual reality (Experiment 4) are consistent with studies in the literature reporting that synchronous sensorimotor feedback corresponds with a stronger embodiment illusion.
In Experiment 4, participants showed significantly faster reaction times and better accuracy in correlated-feedback conditions than in asynchronous-feedback conditions. Reaction times were also significantly faster, and accuracy higher, when participants experienced the game from a first- rather than third-person perspective. Functional magnetic resonance imaging (fMRI) data from Experiment 5 revealed that several frontoparietal networks contribute to the perception of embodiment, including premotor cortex (PMC) and the intraparietal sulcus (IPS). fMRI data also revealed that activity in temporoparietal networks, including the temporoparietal junction and right precuneus, corresponded with manipulations thought to affect the perception of presence. Furthermore, the data suggest that the networks associated with embodiment and presence overlap, and that the brain areas supporting presence may be predicated upon those supporting embodiment. The results of these experiments offer further clues to the psychophysiological mechanisms underlying BSC.

    Closed-loop prosthetic hand: understanding sensorimotor and multisensory integration under uncertainty.

    To make sense of our unpredictable world, humans use sensory information streaming through billions of peripheral neurons. Uncertainty and ambiguity plague each sensory stream, yet remarkably our perception of the world is seamless, robust, and often optimal in the sense of minimising perceptual variability. Moreover, humans have a remarkable capacity for dexterous manipulation. Initiating precise motor actions under uncertainty requires awareness not only of the statistics of our environment but also of the reliability of our sensory and motor apparatus. What happens when our sensory and motor systems are disrupted? Upper-limb amputees fitted with state-of-the-art prostheses must learn both to control and to make sense of their robotic replacement limb. Tactile feedback is not a standard feature of these open-loop limbs, fundamentally limiting the degree of rehabilitation. This thesis introduces a modular closed-loop upper-limb prosthesis: a modified Touch Bionics i-limb hand with a custom-built linear vibrotactile feedback array. To understand the utility of the feedback system in the presence of multisensory and sensorimotor influences, three fundamental open questions were addressed: (i) By what mechanisms do subjects compute sensory uncertainty? (ii) Do subjects integrate an artificial modality with visual feedback as a function of sensory uncertainty? (iii) What are the influences of open-loop and closed-loop uncertainty on prosthesis control? To handle uncertainty in the environment optimally, people must acquire estimates of the mean and uncertainty of sensory cues over time. A novel visual tracking experiment was developed to explore the processes by which people acquire these statistical estimators. Subjects were required to simultaneously report their evolving estimates of the mean and uncertainty of visual stimuli over time.
This revealed that subjects could accumulate noisy evidence over the course of a trial to form an optimal continuous estimate of the mean, hindered only by natural kinematic constraints. Although subjects had explicit access to a measure of their continuous objective uncertainty, acquired from sensory information available within a trial, this was limited by a conservative margin for error. In the Bayesian framework, sensory evidence (from multiple sensory cues) and prior beliefs (knowledge of the statistics of sensory cues) are combined to form a posterior estimate of the state of the world. Multiple studies have revealed that humans behave as optimal Bayesian observers when making binary decisions in forced-choice tasks. In this thesis those results were extended to a continuous spatial localisation task. Subjects could rapidly accumulate evidence presented via vibrotactile feedback (an artificial modality) and integrate it with visual feedback. The weight attributed to each sensory modality was chosen so as to minimise the overall objective uncertainty. Since subjects were able to combine multiple sources of sensory information according to their sensory uncertainties, it was hypothesised that vibrotactile feedback would benefit prosthesis wearers in the presence of either sensory or motor uncertainty. The closed-loop prosthesis served as a novel manipulandum for examining the roles of feed-forward and feedback mechanisms in prosthesis control, mechanisms known to be required for successful object manipulation in healthy humans. Subjects formed economical grasps in idealised (noise-free) conditions, and this was maintained even when visual, tactile, or both sources of feedback were removed. However, when uncertainty was introduced into the hand controller, performance degraded significantly in the absence of visual or tactile feedback.
These results reveal the complementary nature of feed-forward and feedback processes in simulated prosthesis wearers and highlight the importance of tactile feedback for control of a prosthesis.
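The reliability-weighted cue combination described in this abstract, in which the weight given to each modality is chosen to minimise overall uncertainty, can be sketched in a few lines. This is an illustrative sketch of the standard minimum-variance fusion rule, with made-up cue values rather than data from the thesis:

```python
import numpy as np

def fuse_cues(means, sigmas):
    """Minimum-variance (Bayesian) fusion of independent Gaussian cues.

    Each cue is weighted by its reliability 1/sigma^2, so the weight
    attributed to a modality falls as its uncertainty grows; the fused
    estimate has lower variance than any single cue.
    """
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    weights = precisions / precisions.sum()
    fused_mean = float(weights @ means)
    fused_sigma = float(np.sqrt(1.0 / precisions.sum()))
    return fused_mean, fused_sigma

# Illustrative values: a visual cue (mean 10.0, sigma 2.0) and a
# vibrotactile cue (mean 14.0, sigma 4.0). The fused estimate sits
# nearer the more reliable visual cue, with a smaller sigma than
# either cue alone.
fused_mean, fused_sigma = fuse_cues([10.0, 14.0], [2.0, 4.0])
```

With these numbers the visual cue receives weight 0.8 and the vibrotactile cue 0.2, yielding a fused mean of 10.8 and a fused sigma of about 1.79, below the 2.0 of the best single cue, which is exactly the behaviour the localisation experiments test for.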

    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthesis user is characterized by the opportunities and limitations involved in adopting a device that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb is the premise for mitigating the risk of abandonment through continuous use of the device. To achieve such a result, different aspects must be considered in making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving the quality of life of amputees using upper-limb prostheses. Still, since artificial proprioception is essential for perceiving the prosthesis's movement without constant visual attention, a good control framework alone may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has recently been introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and the prosthesis by providing artificial sensory feedback is a fundamental step towards complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis, I developed a modular and scalable software and firmware architecture to control the Hannes multi-degree-of-freedom (DoF) prosthetic system and to fit all users' needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several pattern recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements.
However, stability and repeatability were still unmet requirements in multi-DoF upper-limb systems; hence, I began by investigating different strategies to produce more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system implementing haptic feedback to restore proprioception and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore tactile sensation, connecting the user with the external world. This closed loop between EMG control and vibration feedback is essential for implementing a bidirectional body-machine interface that can strongly impact amputees' daily lives. For each of these three activities, (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected, in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1 score (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several pattern recognition methods tested, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1 score (99%, robustness), and the minimum number of electrodes needed for its functioning was determined to be four in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency).
Finally, the online implementation allowed subjects to simultaneously control the DoFs of the Hannes prosthesis in a bioinspired, human-like way. In addition, I performed further tests with the same NLR-based control, endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). These results demonstrate an improvement in the controllability of the system, with an impact on user experience. Significance. The results confirm the hypothesis that the implemented closed-loop approach improves the robustness and efficiency of prosthetic control. Bidirectional communication between the user and the prosthesis can restore the lost sensory functionality, with promising implications for direct translation into clinical practice.
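The F1 score used above to benchmark the gesture classifiers is the harmonic mean of precision and recall. A minimal sketch with illustrative counts follows; the 99% figure in the abstract comes from the study's own data, not from these numbers:

```python
def f1_score(true_pos: int, false_pos: int, false_neg: int) -> float:
    """F1 score from raw counts: harmonic mean of precision and recall."""
    denom_p = true_pos + false_pos
    denom_r = true_pos + false_neg
    precision = true_pos / denom_p if denom_p else 0.0
    recall = true_pos / denom_r if denom_r else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2.0 * precision * recall / (precision + recall)

# Illustrative counts for one gesture class: 95 correctly recognised
# activations, 5 false activations, 5 missed gestures.
# precision = recall = 0.95, so F1 = 0.95.
score = f1_score(95, 5, 5)
```

Because F1 penalises both false activations and missed gestures, it is a natural offline complement to the online TAC metrics (error rate, path and time efficiency) reported in the abstract.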