112 research outputs found

    Sensorimotor experience in virtual environments

    The goal of rehabilitation is to reduce impairment and provide functional improvements resulting in quality participation in activities of life. Plasticity and motor learning principles provide inspiration for therapeutic interventions, including movement repetition in a virtual reality environment. The objective of this research was to investigate function-specific measurements (kinematic, behavioral) and neural correlates of the motor experience of hand gesture activities in virtual environments (VE) stimulating sensory experience, using a hand agent model. The fMRI-compatible Virtual Environment Sign Language Instruction (VESLI) system was designed and developed to provide a number of rehabilitation and measurement features, to identify optimal learning conditions for individuals, and to track changes in performance over time. Therapies and measurements incorporated into VESLI target and track specific impairments underlying dysfunction. The goal of improved measurement is to develop targeted interventions embedded in higher-level tasks and to accurately track specific gains, in order to understand responses to treatment and the impact a response may have upon higher-level function such as participation in life. To further clarify the biological model of motor experience, and to understand the added value and role of virtual sensory stimulation and feedback, which includes seeing one's own hand movement, functional brain mapping was conducted with simultaneous kinematic analysis in healthy controls and in stroke subjects. It is believed that, through an understanding of these neural activations, rehabilitation strategies that take advantage of the principles of plasticity and motor learning will become possible. The present research assessed practice conditions that successfully promote gesture learning in the individual. 
For the first time, functional imaging experiments mapped the neural correlates of human interaction with complex virtual reality hand avatars moving synchronously with the subject's own hands. Findings indicate that healthy control subjects learned intransitive gestures in virtual environments using first- and third-person avatars, picture and text definitions, and while viewing visual feedback of their own hands, virtual hand avatars, or, in the control condition, hidden hands. Moreover, exercise in a virtual environment with a first-person hand avatar recruited insular cortex activation over time, which may indicate that this activation is associated with a sense of agency. Sensory augmentation in virtual environments modulated activations of important brain regions associated with action observation and action execution. The quality of the visual feedback was modulated, and brain areas were identified in which the amount of activation was positively or negatively correlated with the visual feedback. When subjects moved the right hand and saw an unexpected response, the left virtual avatar hand moving, neural activation increased in the motor cortex ipsilateral to the moving hand. This visual modulation might provide a helpful rehabilitation therapy for people with paralysis of a limb through visual augmentation of skills. A model was developed to study the effects of sensorimotor experience in virtual environments, yielding findings on the effect of such experience upon brain activity and related behavioral measures. The research model represents a significant contribution to neuroscience research and translational engineering practice. A model of neural activations correlated with kinematics and behavior can profoundly influence the delivery of rehabilitative services in the coming years, by giving clinicians a framework for engaging patients in a sensorimotor environment that can optimally facilitate neural reorganization.

    Establishing a Framework for the development of Multimodal Virtual Reality Interfaces with Applicability in Education and Clinical Practice

    The development of Virtual Reality (VR) and Augmented Reality (AR) content with multiple sources of both input and output has led to countless contributions in a great many fields, among them medicine and education. Nevertheless, the actual process of integrating the existing VR/AR media and subsequently setting it to purpose remains a highly scattered and esoteric undertaking. Moreover, seldom do the architectures that derive from such ventures comprise haptic feedback in their implementation, which in turn deprives users of one of the paramount aspects of human interaction, their sense of touch. Determined to circumvent these issues, the present dissertation proposes a centralized albeit modularized framework that enables the conception of multimodal VR/AR applications in a novel and straightforward manner. To accomplish this, the framework makes use of a stereoscopic VR Head Mounted Display (HMD) from Oculus Rift©, a hand tracking controller from Leap Motion©, a custom-made VR mount that allows for the assemblage of the two preceding peripherals, and a wearable device of our own design. The latter is a glove comprising two core modules: one that conveys haptic feedback to its wearer, and another that deals with the non-intrusive acquisition, processing and registering of his/her Electrocardiogram (ECG), Electromyogram (EMG) and Electrodermal Activity (EDA). The software elements of the aforementioned features were all interfaced through Unity3D©, a powerful game engine whose popularity in academic and scientific endeavors is ever increasing. Upon completion of our system, it was time to substantiate our initial claim with thoroughly developed experiences that would attest to its worth. 
With this premise in mind, we devised a comprehensive repository of interfaces, amid which three merit special consideration: Brain Connectivity Leap (BCL), Ode to Passive Haptic Learning (PHL) and a Surgical Simulator.

    Soft Gloves: A Review on Recent Developments in Actuation, Sensing, Control and Applications

    Interest in soft gloves, both robotic and haptic, has grown enormously over the past decade, due to their inherent compliance, which makes them particularly suitable for direct interaction with the human hand. Robotic soft gloves have been developed for hand rehabilitation, for assistance with activities of daily living (ADLs), or sometimes for both. Haptic soft gloves may be applied in virtual reality (VR) applications, to give sensory feedback in combination with prostheses, or to control robots. This paper presents an updated review of the state of the art of soft gloves, with a particular focus on actuation, sensing, and control, combined with a detailed analysis of the devices according to their application field. The review is organized on two levels: a prospective review highlights the main trends in soft glove development and applications, while an analytical review performs an in-depth analysis of the technical solutions developed and implemented in the reviewed scientific research. Minor additional evaluations complete the analysis, such as a synthetic investigation of the main results of the clinical studies and trials involving soft gloves referred to in the literature.

    New generation of wearable goniometers for motion capture systems

    Background: Monitoring joint angles through wearable systems enables human posture and gesture to be reconstructed as a support for physical rehabilitation, both in clinics and at the patient's home. A new generation of wearable goniometers based on knitted piezoresistive fabric (KPF) technology is presented. Methods: KPF single- and double-layer devices were designed and characterized under stretching and bending to work as strain sensors and goniometers. The theoretical working principle and the derived electromechanical model, previously proved for carbon elastomer sensors, were generalized to KPF. The devices were used to correlate angles and piezoresistive fabric behaviour, and to highlight the differences in performance between the single-layer and the double-layer sensors. A fast calibration procedure is also proposed. Results: The proposed device was tested in both static and dynamic conditions, in comparison with standard electrogoniometers and inertial measurement units respectively. The KPF goniometer's capability in angle detection was experimentally proved, and a discussion of the device's measurement errors is provided. The paper concludes with an analysis of sensor accuracy and of hysteresis reduction in particular configurations. Conclusions: Double-layer KPF goniometers showed promising performance in angle measurement, in both quasi-static and dynamic working modes, for velocities typical of human movement. A further approach, combining multiple sensors to increase accuracy via sensor fusion, has also been presented.
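The fast calibration the abstract mentions is not specified in detail here. As a minimal sketch, assuming the resistance difference of a double-layer sensor varies linearly with bend angle (an assumption, not a claim from the paper), a two-pose calibration could look like this; all names and values are illustrative:

```python
# Hypothetical two-point calibration for a double-layer fabric goniometer.
# Assumption (not from the paper): the layer resistance difference varies
# linearly with bend angle, so readings at two known poses suffice.

def calibrate(delta_r_0, delta_r_90):
    """Return a function mapping resistance difference (ohm) to angle (deg),
    given readings taken at 0 and 90 degrees of flexion."""
    gain = 90.0 / (delta_r_90 - delta_r_0)  # deg per ohm
    return lambda delta_r: gain * (delta_r - delta_r_0)

# Illustrative readings: 100 ohm at full extension, 140 ohm at 90 degrees.
angle_of = calibrate(delta_r_0=100.0, delta_r_90=140.0)
print(angle_of(120.0))  # midpoint reading -> 45.0
```

A real procedure would also need to address the hysteresis the paper discusses; a pure linear map cannot capture that.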

    Personalizing functional Magnetic Resonance Protocols for Studying Neural Substrates of Motor Deficits in Parkinson’s Disease

    Parkinson’s disease (PD) is a progressive neurodegenerative movement disorder characterized by a large number of motor and non-motor deficits, which significantly contribute to reduced quality of life. Despite the definition of its broad spectrum of clinical characteristics, the mechanisms triggering the illness, the nature of its progression and the character of therapeutic effects still remain unknown. The enormous advances in magnetic resonance imaging (MRI) in recent decades have significantly shaped research attempts to uncover the functional and structural abnormalities in PD, and have helped to develop and monitor various treatment strategies, of which dopamine replacement, mainly in the form of levodopa, has been the gold standard since the late seventies and eighties. Motor task-related functional MRI (fMRI) has been extensively used to assess the pathological state of the motor circuitry in PD. Several studies have employed motor paradigms and fMRI to examine the functional brain responses of participants to levodopa treatment; interestingly, they provided conflicting results. A wide spectrum of symptoms, variability and asymmetry of disease presentation, several treatment approaches, and their divergent outcomes make PD enormously heterogeneous. In this work we hypothesized that failure to consider this heterogeneity may have caused the discrepant results in the aforementioned studies. We show that not accounting for the disease variability might indeed compromise the results and invalidate the consequent interpretations. Accordingly, we propose and formalize a statistical approach to account for intra- and inter-subject variability. This might help to minimize this bias in future motor fMRI studies investigating functional brain dysfunction, and contribute to the understanding of the still unknown pathophysiological mechanisms underlying PD.
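The abstract does not spell out the proposed statistical approach. As an illustrative stand-in only (not the authors' formalism), one common way to keep between-subject scale differences from masquerading as group effects is to normalize each subject's task response within-subject before pooling; subject identifiers and values below are invented:

```python
# Illustrative within-subject normalization before group pooling.
# This is a generic sketch, not the method formalized in the thesis.
import statistics

def within_subject_zscores(responses_by_subject):
    """responses_by_subject: {subject_id: [response per run]}.
    Z-scores each subject's runs against that subject's own mean/SD,
    then pools the normalized values across subjects."""
    pooled = []
    for subj, runs in responses_by_subject.items():
        mu = statistics.mean(runs)
        sd = statistics.stdev(runs)
        pooled.extend((r - mu) / sd for r in runs)
    return pooled

# Two hypothetical subjects whose raw responses differ tenfold in scale:
data = {"s1": [1.0, 2.0, 3.0], "s2": [10.0, 20.0, 30.0]}
print(within_subject_zscores(data))  # [-1.0, 0.0, 1.0, -1.0, 0.0, 1.0]
```

After normalization the two subjects contribute identically, illustrating why unmodeled inter-subject variability can distort a pooled analysis.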

    An integrated system for quantitatively characterizing different handgrips and identifying their cortical substrates

    Motor recovery of hand function in stroke patients requires months of regular rehabilitation therapy, and is often not measured in a quantitative manner. The first goal of this project was to design a system that can quantitatively track hand movements and, thereby, changes in hand movements over time. The second goal was to acquire hand and finger movement data during functional imaging (in our case, magnetoencephalography (MEG)) for characterizing the cortical plasticity associated with training. To achieve these goals, for each hand, finger flexion and extension were measured with a data glove, and wrist rotation was calculated using an accelerometer. To accomplish the first goal, we designed and implemented Matlab algorithms for the acquisition of behavioral data on different handgrips, specifically power and precision grips. We compiled a set of 52 objects (26 man-made and 26 natural), displayed one at a time on a computer screen, and the subject was asked to form the appropriate handgrip for picking up the object presented. To accomplish the second goal, we used the setup described above during an MEG scanning session. The timescales of the signals from the glove, the accelerometer, and the MEG were synchronized, and the data were analyzed using Brainstorm. We validated the proper functionality of the system by demonstrating that the glove and accelerometer data during handgrip formation correspond to the appropriate neural responses.
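The synchronization step described above typically requires bringing streams sampled at different rates onto one timebase. The authors worked in Matlab; the following Python sketch of linear-interpolation resampling is illustrative only, with invented names and rates:

```python
# Hedged sketch of timebase alignment: resample one stream (e.g. glove
# flexion) onto another stream's clock by linear interpolation.

def resample(timestamps, values, target_times):
    """Linearly interpolate (timestamps, values) at each target time.
    Assumes timestamps are strictly increasing and target_times fall
    within their range."""
    out = []
    j = 0
    for t in target_times:
        # advance to the segment [timestamps[j], timestamps[j+1]] containing t
        while j + 1 < len(timestamps) - 1 and timestamps[j + 1] < t:
            j += 1
        t0, t1 = timestamps[j], timestamps[j + 1]
        v0, v1 = values[j], values[j + 1]
        w = (t - t0) / (t1 - t0)
        out.append(v0 + w * (v1 - v0))
    return out

glove_t = [0.0, 0.1, 0.2, 0.3]        # e.g. 10 Hz flexion samples
glove_v = [0.0, 10.0, 20.0, 30.0]
target_t = [0.0, 0.05, 0.1, 0.15, 0.2]  # e.g. a faster acquisition clock
print(resample(glove_t, glove_v, target_t))  # ~ [0.0, 5.0, 10.0, 15.0, 20.0]
```

In practice a shared hardware trigger recorded in both systems is used to anchor the two clocks before resampling.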

    A virtual hand assessment system for efficient outcome measures of hand rehabilitation

    Previously held under moratorium from 1st December 2016 until 1st December 2021.
    Hand rehabilitation is an extremely complex and critical process in the medical rehabilitation field, mainly due to the highly articulated functionality of the hand. Recent research has focused on employing new technologies, such as robotics and system control, to improve the precision and efficiency of the standard clinical methods used in hand rehabilitation. However, the designs of these devices have been either oriented toward a particular hand injury or heavily dependent on subjective assessment techniques to evaluate progress. These limitations reduce the efficiency of hand rehabilitation devices, providing less effective results in restoring the lost functionalities of dysfunctional hands. In this project, a novel and efficient hand assessment system is produced that can objectively measure the restoration outcome and dynamically evaluate its performance. The proposed system uses a sensorized data glove to measure the ranges of motion of the hand joints, and a Virtual Reality system to provide an illustrative and safe visual assistance environment that can self-adjust to the subject's performance. The system implements an original finger performance measurement method for analysing the various hand functionalities, achieved by extracting multiple features of the digits' motions, such as speed, consistency of finger movements, and stability during hold positions. Furthermore, an advanced data glove calibration method was developed and implemented in order to accurately manipulate the virtual hand model and calculate the hand's kinematic movements in compliance with its biomechanical structure. The experimental studies were performed on a controlled group of 10 healthy subjects (aged 25 to 42 years). 
The results showed intra-subject reliability between trials (average cross-correlation ρ = 0.7) and inter-subject repeatability across the subjects' performance (p > 0.01 for the sessions with real objects, with few departures in some of the virtual reality sessions). In addition, the finger performance values were found to be very efficient in detecting the multiple elements of the fingers' performance, including the load effect on the forearm. Moreover, the electromyography measurements in the virtual reality sessions showed high sensitivity in detecting the tremor effect (mean power frequency difference on the right extensor digitorum muscle of 176 Hz). Also, the finger performance values for the virtual reality sessions have the same average distance as the real-life sessions (RSQ = 0.07). The system, besides offering an efficient and quantitative evaluation of hand performance, was proven compatible with different hand rehabilitation techniques, where it can outline the primarily affected parts of the hand dysfunction. It can also be easily adjusted to comply with the subject's specifications and clinical hand assessment procedures, to autonomously detect classification task events and analyse them with high reliability. The developed system is also adaptable to disciplines other than hand rehabilitation, such as ergonomic studies, hand robot control, brain-computer interfaces and other fields involving hand control.
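The thesis defines its own finger performance measures; the exact formulas are not given in this abstract. As plain-vanilla stand-ins only, features in the same spirit (speed, movement consistency, hold stability) can be extracted from a sampled joint-angle trace like so; the definitions below are assumptions for illustration:

```python
# Illustrative finger-performance features over a joint-angle time series.
# These are generic stand-ins, not the thesis author's definitions.
import statistics

def finger_features(angles, dt):
    """angles: joint-angle samples (deg); dt: sample period (s)."""
    speeds = [abs(b - a) / dt for a, b in zip(angles, angles[1:])]
    return {
        "mean_speed": statistics.mean(speeds),        # deg/s
        "consistency": statistics.stdev(speeds),      # lower = smoother motion
        "hold_stability": max(angles) - min(angles),  # angular range of a hold
    }

# A perfectly steady 40 deg/s flexion sampled at 4 Hz:
print(finger_features([10.0, 20.0, 30.0, 40.0], dt=0.25))
# -> {'mean_speed': 40.0, 'consistency': 0.0, 'hold_stability': 30.0}
```

In use, "hold_stability" would be computed over a segment where the subject is asked to hold a posture, and the speed features over movement segments.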

    Augmented Reality

    Augmented Reality (AR) is a natural development from Virtual Reality (VR), which was developed several decades earlier. AR complements VR in many ways. Because the user can see both real and virtual objects simultaneously, AR is far more intuitive, though it is not completely free of human factors and other restrictions. AR also demands less time and effort in building applications, because it is not required to construct the entire virtual scene and environment. In this book, several new and emerging application areas of AR are presented, divided into three sections. The first section contains applications in outdoor and mobile AR, such as construction, restoration, security and surveillance. The second section deals with AR in medicine, biology, and the human body. The third and final section contains a number of new and useful applications in daily living and learning.

    The selection and evaluation of a sensory technology for interaction in a warehouse environment

    In recent years, Human-Computer Interaction (HCI) has become a significant part of modern life, as it has improved human performance in the completion of daily tasks using computerised systems. The increase in the variety of bio-sensing and wearable technologies on the market has propelled designers towards designing more efficient, effective and fully natural User Interfaces (UI), such as the Brain-Computer Interface (BCI) and the Muscle-Computer Interface (MCI). BCI and MCI have been used for various purposes, such as controlling wheelchairs, piloting drones, providing alphanumeric input to a system and improving sports performance. Various challenges are experienced by workers in a warehouse environment. Because they often have to carry objects (referred to as hands-full), it is difficult for them to interact with traditional devices. Noise undeniably exists in some industrial environments and is known as a major factor causing communication problems. This has reduced the popularity of verbal interfaces to computer applications, such as Warehouse Management Systems. Another factor that affects the performance of workers is action slips caused by a lack of concentration during, for example, routine picking activities. These can have a negative impact on job performance and cause a worker to execute a task incorrectly in a warehouse environment. This research project investigated the current challenges workers experience in a warehouse environment and the technologies utilised in this environment. The latest automation and identification systems and technologies are identified and discussed, specifically those which have addressed known problems. Sensory technologies were identified that enable interaction between a human and a computerised warehouse environment. Biological and natural behaviours of humans which are applicable to interaction with a computerised environment were described and discussed. 
The interactive behaviours included vision, audition, speech production and physiological movement, and other natural human behaviours, such as paying attention, action slips and the action of counting items, were investigated. A number of modern sensory technologies, devices and techniques for HCI were identified, with the aim of selecting and evaluating an appropriate sensory technology for MCI. MCI technologies enable a computer system to recognise hand and other gestures of a user, creating a means of direct interaction between a user and a computer, as they are able to detect specific features extracted from a specific biological or physiological activity. Thereafter, Machine Learning (ML) is applied in order to train a computer system to detect these features and convert them into a computer interface. An application of biomedical signals (bio-signals) in HCI using a MYO Armband for MCI is presented. An MCI prototype (MCIp) was developed and implemented to allow a user to provide input to an HCI in both hands-free and hands-full situations. The MCIp was designed and developed to recognise the hand-finger gestures of a person when both hands are free or when holding an object, such as a cardboard box. The MCIp applies an Artificial Neural Network (ANN) to classify features extracted from the surface electromyography signals acquired by the MYO Armband around the forearm muscles. By employing the ANN, the MCIp classified gestures to an accuracy level of 34.87% in the hands-free situation. The MCIp furthermore enabled users to provide numeric input to the system hands-full with an accuracy of 59.7%, after a training session of only 10 seconds per gesture. The results were obtained using eight participants. Similar experimentation with the MYO Armband had not been found to be reported in the literature at the time of submission of this document. 
Based on this novel experimentation, the main contribution of this research study is the suggestion that a MYO Armband, as a commercially available muscle-sensing device, has potential as an MCI for recognising finger gestures both hands-free and hands-full. An accurate MCI can increase the efficiency and effectiveness of an HCI tool when applied to different applications in a warehouse, where noise and hands-full activities pose a challenge. Future work to improve its accuracy is proposed.
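Pipelines like the one described usually window the raw multi-channel sEMG signal and feed per-window features into the classifier. The sketch below shows two standard features (root-mean-square and mean absolute value) for one channel; the window length, values and names are assumptions for illustration, not details taken from this study:

```python
# Minimal sEMG feature extraction, the typical front end ahead of an ANN.
# Features from all channels (8 on a MYO Armband) would be concatenated
# into the network's input vector; this sketch handles one channel.
import math

def emg_features(window):
    """window: list of raw sEMG samples from one channel.
    Returns (RMS, MAV) for the window."""
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    mav = sum(abs(x) for x in window) / len(window)
    return rms, mav

# A toy 4-sample window of oscillating activity:
print(emg_features([3.0, -4.0, 3.0, -4.0]))  # ~ (3.5355, 3.5)
```

RMS tracks signal power (muscle contraction intensity), while MAV is a cheaper amplitude proxy; both are common inputs for gesture classifiers.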
