131 research outputs found

    Bodily resonance: Exploring the effects of virtual embodiment on pain modulation and the fostering of empathy toward pain sufferers

    Globally, around 20% of people suffer from chronic pain, an illness that cannot be cured and has been linked to numerous physical and mental conditions. According to the biopsychosocial model of pain, chronic pain presents patients with biological, psychological, and social challenges. Immersive virtual reality (VR) has shown great promise in helping people manage acute and chronic pain and in fostering empathy toward vulnerable populations. Therefore, the first research trajectory of this dissertation targets chronic pain patients’ biological and psychological suffering to provide VR analgesia, and the second targets healthy people to build empathy and reduce patients’ social stigma. Researchers have taken an attention-distraction approach to study how acute pain patients can manage their condition in VR, while the virtual embodiment approach has mostly been studied with healthy people exposed to pain stimuli. My first research trajectory aimed to understand how embodied characteristics affect users’ sense of embodiment and pain. Three studies were carried out: with healthy people under heat pain, with complex regional pain syndrome patients, and with phantom limb pain patients. My findings indicate that in all three studies, when users see a healthy or intact virtual body or body parts, they experience significant reductions in their self-reported pain ratings. Additionally, I found that the appearance of a virtual body has a significant impact on pain, whereas the virtual body’s motions do not. Despite the prevalence of chronic pain, public awareness of it is remarkably low, and pain patients commonly experience social stigma. Thus, taking on the embodied perspective of chronic pain patients is critical to understanding their social stigma. Although there is growing interest in using embodied VR to foster empathy in the context of gender or racial bias, few studies have focused on people with chronic pain. My second trajectory explored how researchers can foster empathy toward pain patients in embodied VR. To conclude, this dissertation uncovers the role of VR embodiment and dissects embodied characteristics in pain modulation and empathy generation. Finally, I summarize a novel conceptual design framework for embodied VR applications, with design recommendations and future research directions.

    Brain (re)organisation following amputation: implications for phantom limb pain

    Following arm amputation, the region that represented the missing hand in primary somatosensory cortex (S1) becomes deprived of its primary input, resulting in changed boundaries of the S1 body map. This remapping process has been termed ‘reorganisation’ and has been attributed to multiple mechanisms, including increased expression of previously masked inputs. In a maladaptive plasticity model, such reorganisation has been associated with phantom limb pain (PLP). Brain activity associated with phantom hand movements is also correlated with PLP, suggesting that preserved limb functional representation may serve as a complementary process. Here we review some of the most recent evidence for the potential drivers and consequences of brain (re)organisation following amputation, based on human neuroimaging. We emphasise other perceptual and behavioural factors consequential to arm amputation, such as non-painful phantom sensations, perceived limb ownership, intact hand compensatory behaviour, and prosthesis use, which have also been related to both cortical changes and PLP. We also discuss new findings based on interventions designed to alter the brain representation of the phantom limb, including augmented/virtual reality applications and brain-computer interfaces. These studies point to a close interaction between sensory changes and alterations in brain regions involved in body representation, pain processing and motor control. Finally, we review recent evidence based on methodological advances, such as high-field neuroimaging and multivariate techniques, that provide new opportunities to interrogate somatosensory representations in the missing hand cortical territory. Collectively, this research highlights the need to consider the potential contributions of additional brain mechanisms beyond S1 remapping, and the dynamic interplay of contextual factors with brain changes, for understanding and alleviating PLP.

    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthesis user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb is the premise for mitigating the risk of abandonment through continuous use of the device. To achieve this, different aspects must be considered to make the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees’ quality of life when using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis’ movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has recently been introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis, I developed a modular and scalable software and firmware architecture to control the Hannes multi-degree-of-freedom (DoF) prosthetic system and to fit all users’ needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several pattern recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system implementing haptic feedback to restore proprioception and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore a tactile sensation that connects the user with the external world. This closed loop between EMG control and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that can strongly impact amputees’ daily lives. For each of these three activities, (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object’s stiffness, I performed a study in which data from healthy subjects and amputees were collected in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects’ ability to use the prosthesis by means of the F1-score (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several methods tested for pattern recognition, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1-score (99%, robustness), and the minimum number of electrodes needed for its functioning was determined to be four in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency). Finally, the online implementation allowed the subject to simultaneously control the Hannes prosthesis DoFs in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control, endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). These results demonstrated an improvement in the controllability of the system, with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the robustness and efficiency of prosthetic control improve thanks to the implemented closed-loop approach. The bidirectional communication between the user and the prosthesis can restore the lost sensory functionality, with promising implications for direct translation into clinical practice.
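
    The abstract reports NLR-based pattern recognition on multi-channel EMG evaluated with the F1-score, but gives no implementation details. The sketch below shows a generic EMG pattern-recognition pipeline on synthetic data, using scikit-learn’s plain logistic regression as a stand-in for the thesis’ NLR classifier; the window length, feature choices, and gesture labels are illustrative assumptions, not taken from the work.

```python
# Illustrative EMG pattern-recognition pipeline (not the thesis' actual NLR implementation).
# Windows of multi-channel EMG are reduced to time-domain features and fed to a classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

FS = 300          # sampling frequency in Hz, matching the controller rate quoted in the abstract
WINDOW = 60       # 200 ms analysis window at 300 Hz (assumed)
N_CHANNELS = 4    # minimum electrode count reported in the offline analyses

def features(window):
    """Time-domain features per channel: mean absolute value and waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

rng = np.random.default_rng(0)

def synthetic_windows(n, gesture):
    """Toy EMG: each gesture biases the activity of a different channel."""
    out = []
    for _ in range(n):
        w = rng.normal(0.0, 0.1, size=(WINDOW, N_CHANNELS))
        w[:, gesture % N_CHANNELS] += rng.normal(0.0, 0.5, size=WINDOW)
        out.append(features(w))
    return np.array(out)

gestures = [0, 1, 2, 3]  # e.g. hand open/close, wrist pronation/supination (illustrative)
X = np.vstack([synthetic_windows(200, g) for g in gestures])
y = np.repeat(gestures, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)  # linear stand-in for the NLR classifier
print("macro F1:", f1_score(y_te, clf.predict(X_te), average="macro"))
```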

    Missing in Action: Embodied Experience and Virtual Reality

    This essay examines embodied experience in virtual reality (VR) theatre, performance art, and installations in one-to-one engagements with virtual worlds and in telematic interactions with other people. It proposes that bodies in VR are blurred, virtual and physical, absent and present, compounded and indivisible, even though body and environment have different materialities. This blurring can cause confusion in the ethics of embodiment that usually govern physical interactions between audience and performer—when, and if, to touch or be touched—since embodied experience confounds cognitive separation between the physical and virtual. Such confusion can result in a mismatch between the embodied self and disembodied Other that the gaming world is poorly equipped to negotiate, but that could have profound effects on VR users. Theatre, on the other hand, is well-versed in the negotiation of the real and the virtual, and virtual environments allow us to ask questions about embodiment and humanity through the experiences of individual bodies in ways that were never previously achievable. How can theatre and performance help us to understand the nature of embodied experience in VR when anything can be done but the body is apparently missing? It becomes possible to explore impossible situations and experiences through the eyes of others. Yet, is it ethically defensible to engage in any experience or action that would not be viable, or perhaps condoned, in the physical world on the basis that it is not “real”? The essay examines the nature of embodied experience in VR and considers the implications for theatre.

    Biosignal-based human–machine interfaces for assistance and rehabilitation: a survey

    By definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey reviews the large literature of the last two decades on biosignal-based HMIs for assistance and rehabilitation, to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were considered to classify the different biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application, considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition over the last decade, whereas studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance. However, they also increase HMIs’ complexity, so their usefulness should be carefully evaluated for the specific application.
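
    As a rough illustration of the survey’s two-axis classification (biosignal macrocategory by target application), the snippet below encodes the categories as a small data structure and tallies a few entries; the study titles and counts are invented placeholders, not data from the survey.

```python
# Hypothetical encoding of the survey's classification axes; entries are invented examples.
from collections import Counter
from dataclasses import dataclass

BIOSIGNALS = {"biopotential", "muscle mechanical motion", "body motion", "hybrid"}
TARGETS = {"prosthetic control", "robotic control", "virtual reality control",
           "gesture recognition", "communication", "smart environment control"}

@dataclass(frozen=True)
class Study:
    title: str
    biosignal: str   # one of BIOSIGNALS
    target: str      # one of TARGETS

corpus = [
    Study("placeholder study A", "biopotential", "communication"),
    Study("placeholder study B", "muscle mechanical motion", "prosthetic control"),
    Study("placeholder study C", "hybrid", "gesture recognition"),
]

# Sanity-check the labels, then count studies per target application.
assert all(s.biosignal in BIOSIGNALS and s.target in TARGETS for s in corpus)
print(Counter(s.target for s in corpus).most_common())
```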

    HoloPHAM: An Augmented Reality Training System For Upper Limb Myoelectric Prosthesis Users

    From hook-shaped prosthetic devices to myoelectric prostheses with increased functional capabilities such as the Modular Prosthetic Limb (MPL), upper limb prostheses have come a long way. However, the user acceptance rate does not show a similar increasing trend. Functional use training is incorporated into occupational therapy for myoelectric prosthesis users to bridge this gap. Advances in virtual and augmented reality technology enable the application of immersive virtual environments in prosthesis user training, and such training systems have been shown to result in higher user performance and participation in training exercises. The work presented here introduces the application of augmented reality (AR) to myoelectric prosthesis user training through the development of HoloPHAM, an AR training tool designed to mimic a real-world training protocol called the Prosthetic Hand Assessment Measure (PHAM). The AR system was built for the Microsoft HoloLens and thus required a motion tracking system that lets the user move around freely in a room. The Bluetooth Orientation Tracking System (BOTS) was developed as an inertial measurement unit (IMU)-based wireless motion tracking system for this purpose. The performance of BOTS as a motion tracker was evaluated by comparison with the Microsoft Kinect sensor; results showed that BOTS outperformed the Kinect as a motion tracking system for our intended application in HoloPHAM. BOTS and the Myo armband were combined to form a human-machine interface (HMI) to control the virtual arm of HoloPHAM, enabling virtual object manipulation. This HMI, along with the virtual PHAM set-up, makes HoloPHAM a portable AR training environment that can be applied to prosthesis user training or to the evaluation of new myoelectric control strategies.
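
    The abstract describes BOTS only as an IMU-based wireless tracker, without algorithmic detail. Below is a minimal complementary-filter sketch of the kind of orientation estimation such IMU trackers typically perform; the filter gain, update rate, and the toy sensor stream are assumptions made for illustration, not the actual BOTS design.

```python
# Minimal complementary-filter sketch of an IMU-based orientation estimate
# (illustrative only; not the actual BOTS algorithm).
import math

ALPHA = 0.98   # blend between integrated gyro (short term) and accelerometer tilt (long term)
DT = 0.01      # 100 Hz IMU update rate (assumed)

def accel_tilt(ax, ay, az):
    """Pitch and roll (radians) from the gravity vector measured by the accelerometer."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def update(state, gyro, accel):
    """One filter step: integrate gyro rates, then pull toward the accelerometer reference."""
    pitch, roll = state
    gx, gy, _ = gyro                      # angular rates in rad/s
    acc_pitch, acc_roll = accel_tilt(*accel)
    pitch = ALPHA * (pitch + gy * DT) + (1 - ALPHA) * acc_pitch
    roll = ALPHA * (roll + gx * DT) + (1 - ALPHA) * acc_roll
    return pitch, roll

# Toy stream (not physically consistent): the gyro reports a slow pitch rate after step 100
# while the accelerometer stays at pure gravity, so the correction keeps pulling the estimate
# back toward level -- this just exercises the blend.
state = (0.0, 0.0)
for step in range(200):
    gyro = (0.0, 0.1 if step > 100 else 0.0, 0.0)
    accel = (0.0, 0.0, 9.81)
    state = update(state, gyro, accel)
print("estimated pitch (deg):", math.degrees(state[0]))
```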

    Multiscale body maps in the human brain

    A large number of brain regions are dedicated to processing information from the body in order to enable interactions with the environment. During my thesis, I studied the functional organization of brain networks involved in processing bodily information. From the processing of unimodal low-level features to the unique experience of being a unified entity residing in a physical body, the brain processes and integrates bodily information at many different stages. Using ultra-high-field functional Magnetic Resonance Imaging (fMRI), I conducted four studies to map and characterize multiscale body representations in the human brain. The goals of my thesis were, first, to extend current knowledge about primary sensorimotor representations and, second, to develop novel approaches to investigate more complex and integrated forms of body representation. In studies I and II, I investigated how natural touch is represented in the first three cortical areas processing tactile information. I applied a mapping procedure to identify, in each of these three areas, the somatosensory representations of 24 different body parts on the hands, feet and legs at the level of single subjects. Using fMRI and resting-state data, I combined classical statistical analyses with modern methods of network analysis to describe the functional properties of the resulting network. In study III, I applied these methods to investigate primary somatosensory and motor representations in a rare population of patients. Following limb loss, the targeted muscle and sensory reinnervation (TMSR) procedure enables intuitive control of a myoelectric prosthesis and creates an artificial map of referred touch on the reinnervated skin. I mapped the primary somatosensory and motor representations of phantom sensations and phantom movements in TMSR patients, and investigated whether the sensorimotor training enabled by TMSR was associated with preserved somatosensory and motor representations compared to healthy controls and amputee patients without TMSR. Finally, in study IV, I studied brain regions involved in subjective body experience. Following specific manipulations of sensorimotor information, it is possible to make participants experience a fake or virtual hand as their own and to give them the sensation of being in control of this hand. Using MR-compatible robotics and virtual reality, I investigated the brain regions associated with alterations of the sense of hand ownership and the sense of hand agency. The present work provides important findings and promising tools for understanding the brain networks that process bodily information. In particular, understanding the functional interactions between primary unimodal cortices and the networks contributing to subjective body experience is necessary to promote modern approaches in the fields of neuroprosthetics and human-machine interaction.
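
    The resting-state network analysis mentioned for studies I and II is not specified here; as a hedged illustration of that general approach, the snippet below correlates synthetic ROI time series (one per mapped body part), thresholds the correlation matrix into a graph, and reports node degree. ROI names, the threshold, and the data are placeholders, not the thesis’ actual pipeline.

```python
# Sketch of a resting-state functional-connectivity analysis on synthetic ROI time series.
import numpy as np

rng = np.random.default_rng(1)
rois = ["D1", "D2", "D3", "D4", "D5", "foot", "leg"]   # illustrative body-part ROIs
n_timepoints = 300

# Synthetic resting-state signals: finger ROIs share a common fluctuation, others do not.
shared = rng.normal(size=n_timepoints)
series = {}
for roi in rois:
    noise = rng.normal(size=n_timepoints)
    series[roi] = 0.7 * shared + noise if roi.startswith("D") else noise

X = np.vstack([series[r] for r in rois])       # (n_rois, n_timepoints)
fc = np.corrcoef(X)                            # functional connectivity matrix

# Threshold into a binary graph (drop self-loops) and compute a simple metric: node degree.
adjacency = (np.abs(fc) > 0.3) & ~np.eye(len(rois), dtype=bool)
degree = adjacency.sum(axis=1)
for roi, d in zip(rois, degree):
    print(f"{roi}: degree {d}")
```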

    Neuromorphic hardware for somatosensory neuroprostheses

    In individuals with sensory-motor impairments, missing limb functions can be restored using neuroprosthetic devices that directly interface with the nervous system. However, restoring the natural tactile experience through electrical neural stimulation requires complex encoding strategies, and present strategies are limited by bandwidth constraints in how effectively they can convey or restore tactile sensations. Neuromorphic technology, which mimics the natural behavior of neurons and synapses, holds promise for replicating the encoding of natural touch and could thereby inform neurostimulation design. In this perspective, we propose that incorporating neuromorphic technologies into neuroprostheses could be an effective approach for developing more natural human-machine interfaces, potentially leading to advancements in device performance, acceptability, and embeddability. We also highlight ongoing challenges and the actions required to facilitate the future integration of these advanced technologies.
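
    The perspective argues for neuromorphic encoding of touch without prescribing a specific model; a common building block is the leaky integrate-and-fire neuron, sketched below as it might convert a fingertip pressure trace into spike times. The parameters and input signal are illustrative assumptions, not taken from the paper.

```python
# Illustrative leaky integrate-and-fire (LIF) encoder: a pressure trace is converted into
# spike times that could, in principle, pattern a neurostimulation train. Parameters are
# arbitrary choices for this sketch, not values from the paper.
import numpy as np

DT = 1e-3        # 1 ms time step
TAU = 20e-3      # membrane time constant (s)
V_TH = 1.0       # firing threshold (arbitrary units)

def lif_encode(drive):
    """Return spike times (s) for an input drive sampled at 1 kHz."""
    v, spikes = 0.0, []
    for i, inp in enumerate(drive):
        v += DT * (-v / TAU + inp)        # leaky integration of the input
        if v >= V_TH:
            spikes.append(i * DT)
            v = 0.0                       # reset after a spike
    return spikes

t = np.arange(0, 1.0, DT)
pressure = np.clip(np.sin(2 * np.pi * 1.0 * t), 0, None) * 80   # 1 Hz pressing, rectified
spike_times = lif_encode(pressure)
print(f"{len(spike_times)} spikes; the firing rate peaks while pressure is high")
```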