
    Sensorimotor experience in virtual environments

    The goal of rehabilitation is to reduce impairment and provide functional improvements that result in quality participation in the activities of life. Plasticity and motor learning principles provide inspiration for therapeutic interventions, including movement repetition in a virtual reality environment. The objective of this research was to investigate function-specific measurements (kinematic, behavioral) and neural correlates of the motor experience of hand gesture activities in virtual environments (VE) that stimulate sensory experience, using a hand agent model. The fMRI-compatible Virtual Environment Sign Language Instruction (VESLI) system was designed and developed to provide a number of rehabilitation and measurement features, to identify optimal learning conditions for individuals, and to track changes in performance over time. The therapies and measurements incorporated into VESLI target and track specific impairments underlying dysfunction. The goal of improved measurement is to develop targeted interventions embedded in higher-level tasks and to accurately track specific gains, in order to understand responses to treatment and the impact a response may have on higher-level function such as participation in life. To further clarify the biological model of motor experience, and to understand the added value and role of virtual sensory stimulation and feedback (which includes seeing one's own hand movement), functional brain mapping was conducted with simultaneous kinematic analysis in healthy controls and in stroke subjects. It is believed that an understanding of these neural activations will make possible rehabilitation strategies that exploit the principles of plasticity and motor learning. The present research assessed the practice conditions that successfully promote gesture learning in the individual. For the first time, functional imaging experiments mapped the neural correlates of human interaction with complex virtual reality hand avatars moving synchronously with the subject's own hands. Findings indicate that healthy control subjects learned intransitive gestures in virtual environments using first- and third-person avatars, picture and text definitions, and while viewing visual feedback of their own hands, virtual hand avatars, or, in the control condition, hidden hands. Moreover, exercise in a virtual environment with a first-person hand avatar recruited insular cortex activation over time, which may indicate that this activation is associated with a sense of agency. Sensory augmentation in virtual environments modulated activation of important brain regions associated with action observation and action execution. The quality of the visual feedback was manipulated, and brain areas were identified in which the amount of activation correlated positively or negatively with the visual feedback. When subjects moved the right hand and saw an unexpected response (the left virtual avatar hand moved), neural activation increased in the motor cortex ipsilateral to the moving hand. This visual modulation might provide a helpful rehabilitation therapy for people with limb paralysis through visual augmentation of skill. A model was thus developed to study the effects of sensorimotor experience in virtual environments on brain activity and related behavioral measures.
    The research model represents a significant contribution to neuroscience research and translational engineering practice. A model of neural activations correlated with kinematics and behavior can profoundly influence the delivery of rehabilitative services in the coming years by giving clinicians a framework for engaging patients in a sensorimotor environment that can optimally facilitate neural reorganization.
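
    As a minimal sketch of the mirrored-feedback manipulation described above, in which movement of the right hand drives a left virtual hand, tracked joint positions can be reflected across the body midline. The function and variable names below are our own illustrative assumptions, not the VESLI implementation.

        import numpy as np

        def mirror_hand_pose(joint_positions: np.ndarray, midline_x: float = 0.0) -> np.ndarray:
            """Reflect 3-D hand joint positions across the sagittal (x = midline) plane.

            joint_positions: (N, 3) array of joint coordinates in meters.
            Returns the mirrored pose, so movement of the right hand drives
            a left-hand avatar (the 'unexpected response' condition).
            """
            mirrored = joint_positions.copy()
            mirrored[:, 0] = 2.0 * midline_x - mirrored[:, 0]
            return mirrored

        # Example: a single tracked fingertip 12 cm to the right of the midline
        fingertip = np.array([[0.12, 0.95, 0.30]])
        print(mirror_hand_pose(fingertip))  # -> [[-0.12  0.95  0.30]]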

    Neuronal correlates of continuous manual tracking under varying visual movement feedback in a virtual reality environment

    To accurately guide one's actions online, the brain predicts sensory action feedback ahead of time based on internal models, which can be updated by sensory prediction errors. The underlying operations can be experimentally investigated in sensorimotor adaptation tasks, in which moving under perturbed sensory action feedback requires internal model updates. Here we altered healthy participants' visual hand movement feedback in a virtual reality setup while assessing brain activity with functional magnetic resonance imaging (fMRI). Participants tracked a continually moving virtual target object with a photorealistic, three-dimensional (3D) virtual hand controlled online via a data glove. During the continuous tracking task, the virtual hand's movements (i.e., the visual movement feedback) were periodically delayed, which participants had to compensate for to maintain accurate tracking. This realistic task design allowed us to simultaneously investigate processes likely operating at several levels of the brain's motor control hierarchy. fMRI revealed that the length of the visual feedback delay was parametrically reflected by activity in the inferior parietal cortex and posterior temporal cortex. Unpredicted changes in visuomotor mapping (at transitions from synchronous to delayed visual feedback periods, or vice versa) activated biological motion-sensitive regions in the lateral occipitotemporal cortex (LOTC). Activity in the posterior parietal cortex (PPC), focused on the contralateral anterior intraparietal sulcus (aIPS), correlated with tracking error, and this correlation was stronger in participants with higher tracking performance. Our results are in line with recent proposals of a widespread cortical motor control hierarchy, in which temporoparietal regions seem to evaluate visuomotor congruence and thus possibly ground a self-attribution of movements, the LOTC likely processes early visual prediction errors, and the aIPS computes action goal errors and possibly corresponding motor corrections.
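
    As a rough illustration of the delayed-feedback manipulation, the sketch below buffers incoming data-glove samples and replays them after a configurable lag. The frame rate, delay value, and class names are assumptions for illustration, not the authors' code.

        from collections import deque

        class DelayedFeedback:
            """Replay hand-pose samples after a fixed lag (in frames).

            At 60 Hz, delay_frames=12 corresponds to a 200 ms visual delay.
            With delay_frames=0 the feedback is synchronous.
            """
            def __init__(self, delay_frames: int):
                self.buffer = deque()
                self.delay_frames = delay_frames

            def step(self, pose):
                self.buffer.append(pose)
                if len(self.buffer) > self.delay_frames:
                    return self.buffer.popleft()  # delayed pose drives the virtual hand
                return self.buffer[0]             # warm-up: hold the earliest sample

        # Toggle delay_frames between 0 and a nonzero value per trial period
        # to alternate synchronous and delayed feedback phases.
        feedback = DelayedFeedback(delay_frames=12)
        for t in range(5):
            shown_pose = feedback.step({"t": t})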

    3D printing and immersive visualization for improved perception of ancient artifacts

    This article investigates the use of 3D immersive virtual environments and 3D prints for interaction with past material culture, compared with traditional observation without manipulation. Our work is motivated by studies in heritage, museum, and cognitive sciences indicating the importance of object manipulation for understanding present and ancient artifacts. While immersive virtual environments and 3D prints have started to be incorporated in heritage research and museum displays as a way to provide improved manipulation experiences, little is known about how these new technologies affect the perception of our past. This article provides the first results obtained with three experiments designed to investigate the benefits and tradeoffs of using these technologies. Our results indicate that traditional museum displays limit the experience with past material culture, and reveal that our sample of participants favored tactile and immersive 3D virtual experiences with artifacts over visual, non-manipulative experiences with authentic objects. This paper is part of a larger study on how people perceive ancient artifacts, which was partially funded by the University of California Humanities Network and the Center for the Humanities at the University of California, Merced. This is the author accepted manuscript. The final version is available from MIT Press via http://dx.doi.org/10.1162/PRES_a_0022

    A comparative analysis of haptic and EEG devices for evaluation and training of post-stroke patients within a virtual environment

    Virtual rehabilitation benefits from the use of interfaces other than the mouse and keyboard, but these interfaces also have disadvantages: haptic peripherals can use the subject's hand to provide position information or joint angles and allow direct training of specific movements, but can also place unneeded strain on the limbs; brain-machine interfaces (BMI) can provide direct connections from the user to external hardware or software, but are currently inaccurate for the full diversity of user movements in daily life and require invasive surgery to implement. A compromise between these two extremes is a BMI that can be adapted to specific users, can function with a wide range of hardware and software, and is both noninvasive and convenient to wear for extended periods of time. A suitable BMI using electroencephalography (EEG) input, the Emotiv EPOC™ by Emotiv Systems, was evaluated using multiple input specializations and tested with an external robotic arm to determine whether it was suitable for controlling peripherals. Users were given a preset periodicity to follow in order to evaluate their ability to translate specific facial movements into commands, as well as their responsiveness in changing the robot arm's direction. Within two weeks of training, they maintained or improved axial control of the robot arm and reduced their overall performance time. Although the EPOC™ requires further testing and development, its adaptability to multiple software programs, users, and peripherals allows it to serve both virtual rehabilitation and device control in the immediate future.
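
    The abstract does not specify how detections were mapped to arm commands; the sketch below only illustrates the general shape of such a mapping, from a discrete classifier output (e.g., a detected facial movement) to a direction command, with a confidence gate against spurious triggers. All names here are hypothetical stand-ins, not the Emotiv SDK or the robot's API.

        from enum import Enum

        class Command(Enum):
            LEFT = "left"
            RIGHT = "right"
            NEUTRAL = "neutral"

        # Hypothetical mapping from detected facial movements to arm directions.
        FACIAL_TO_COMMAND = {
            "smile": Command.RIGHT,
            "clench": Command.LEFT,
        }

        def dispatch(detection: str, confidence: float, threshold: float = 0.7) -> Command:
            """Translate one classifier detection into an arm command.

            Detections below the confidence threshold are treated as neutral,
            which keeps spurious facial activity from moving the arm.
            """
            if confidence < threshold:
                return Command.NEUTRAL
            return FACIAL_TO_COMMAND.get(detection, Command.NEUTRAL)

        print(dispatch("smile", 0.85))  # Command.RIGHT
        print(dispatch("smile", 0.40))  # Command.NEUTRAL (below threshold)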

    Establishing a Framework for the development of Multimodal Virtual Reality Interfaces with Applicability in Education and Clinical Practice

    The development of Virtual Reality (VR) and Augmented Reality (AR) content with multiple sources of both input and output has led to countless contributions in a great many fields, among them medicine and education. Nevertheless, the actual process of integrating existing VR/AR media and setting it to purpose remains a highly scattered and esoteric undertaking. Moreover, the architectures that derive from such ventures seldom include haptic feedback in their implementation, which deprives users of one of the paramount aspects of human interaction, their sense of touch. Determined to circumvent these issues, the present dissertation proposes a centralized albeit modularized framework that enables the conception of multimodal VR/AR applications in a novel and straightforward manner. To accomplish this, the framework makes use of a stereoscopic VR head-mounted display (HMD) from Oculus Rift©, a hand-tracking controller from Leap Motion©, a custom-made VR mount that allows the assemblage of the two preceding peripherals, and a wearable device of our own design. The latter is a glove that encompasses two core modules: one that conveys haptic feedback to its wearer, and another that handles the non-intrusive acquisition, processing, and registering of the wearer's electrocardiogram (ECG), electromyogram (EMG), and electrodermal activity (EDA). The software elements of the aforementioned features were all interfaced through Unity3D©, a powerful game engine whose popularity in academic and scientific endeavors is ever increasing. Upon completion of our system, we substantiated our initial claim with thoroughly developed experiences that attest to its worth. With this premise in mind, we devised a comprehensive repository of interfaces, among which three merit special consideration: Brain Connectivity Leap (BCL), Ode to Passive Haptic Learning (PHL), and a Surgical Simulator.
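
    One way to read the "centralized albeit modularized" claim is as a message bus that decouples device modules (HMD, hand tracker, glove) from application logic. The sketch below is our guess at that shape, written in Python rather than the Unity3D/C# the authors used; the topic names and payloads are assumptions.

        from collections import defaultdict
        from typing import Callable

        class EventBus:
            """Minimal publish/subscribe hub decoupling device modules from app logic."""
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic: str, handler: Callable):
                self._subscribers[topic].append(handler)

            def publish(self, topic: str, payload):
                for handler in self._subscribers[topic]:
                    handler(payload)

        bus = EventBus()
        # The glove module publishes biosignal samples and contact events;
        # a logger and the haptic controller consume them independently.
        bus.subscribe("glove/eda", lambda s: print(f"log EDA sample: {s}"))
        bus.subscribe("glove/contact", lambda e: print(f"trigger vibration at finger: {e}"))
        bus.publish("glove/eda", 4.2)
        bus.publish("glove/contact", "index")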

    Cortical beta oscillations reflect the contextual gating of visual action feedback

    In sensorimotor integration, the brain needs to decide how its predictions should accommodate novel evidence by 'gating' sensory data depending on the current context. Here, we examined the oscillatory correlates of this process by recording magnetoencephalography (MEG) data during a new task requiring action under intersensory conflict. We used virtual reality to decouple visual (virtual) and proprioceptive (real) hand postures during a task in which the phase of grasping movements tracked a target (in either modality). Thus, we rendered visual information either task-relevant or a (to-be-ignored) distractor. Under visuo-proprioceptive incongruence, occipital beta power decreased (relative to congruence) when vision was task-relevant but increased when it had to be ignored. Dynamic causal modelling (DCM) revealed that this interaction was best explained by diametrical, task-dependent changes in visual gain. These novel results suggest a crucial role for beta oscillations in the contextual gating (i.e., gain or precision control) of visual vs proprioceptive action feedback, depending on concurrent behavioral demands.
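
    For readers unfamiliar with the measure, band-limited power such as the beta power above is commonly obtained by bandpass-filtering the sensor signal to roughly 13-30 Hz and taking the squared analytic amplitude. The snippet below shows that standard computation on synthetic data; it is not the authors' MEG pipeline, and the band edges and filter order are conventional choices, not values from the paper.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        fs = 1000.0                                # sampling rate in Hz
        t = np.arange(0, 2.0, 1 / fs)
        # Synthetic sensor trace: a 20 Hz (beta-band) oscillation plus noise
        signal = np.sin(2 * np.pi * 20 * t) + 0.5 * np.random.randn(t.size)

        # Bandpass to the beta range (13-30 Hz), then take the analytic amplitude.
        b, a = butter(4, [13 / (fs / 2), 30 / (fs / 2)], btype="band")
        beta = filtfilt(b, a, signal)
        beta_power = np.abs(hilbert(beta)) ** 2    # instantaneous beta power

        print(f"mean beta power: {beta_power.mean():.3f}")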

    An Instrumented Glove for Restoring Sensorimotor Function of the Hand through Augmented Sensory Feedback

    The loss of sensitivity in the upper limb due to neurological injuries severely limits the ability to manipulate objects, hindering personal independence. Non-invasive augmented sensory feedback techniques are used to promote neural plasticity and thereby restore the grasping function. This work presents a wearable device for restoring sensorimotor hand function based on a Discrete Event-driven Sensory Control policy. It consists of an instrumented glove that, relying on piezoelectric sensors, delivers short-lasting vibrotactile stimuli synchronously with the relevant mechanical events of manipulation (i.e., contact and release). We first performed a feasibility study on 20 healthy participants, which showed overall good device performance, with a touch-event detection accuracy of 96.2% and a response delay of 22 ms. We then pilot-tested the device on two participants with limited sensorimotor function. When using the device, they improved their hand motor coordination on standard assessment tests (i.e., the pick-and-place and pick-and-lift tests). In particular, they exhibited more coordinated temporal correlation between grip-force and load-force profiles and enhanced performance when transferring objects, quantitatively demonstrating the effectiveness of the device.
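
    Detecting contact and release from a piezoelectric sensor is typically a matter of thresholding the signal transient with a refractory window. The sketch below shows one plausible version of such event detection; the threshold, refractory period, and sampling rate are our assumptions, not parameters from the paper.

        def detect_touch_events(samples, fs=1000.0, threshold=0.5, refractory_s=0.05):
            """Return sample indices of contact/release transients in a piezo signal.

            Piezoelectric sensors respond to *changes* in pressure, so both
            contact and release appear as short spikes of either polarity.
            A refractory window suppresses multiple triggers from a single
            mechanical event.
            """
            refractory = int(refractory_s * fs)
            events, last = [], -refractory
            for i, v in enumerate(samples):
                if abs(v) > threshold and i - last >= refractory:
                    events.append(i)  # fire the short vibrotactile pulse here
                    last = i
            return events

        # Synthetic trace: contact spike at 100 ms, release spike at 400 ms
        trace = [0.0] * 1000
        trace[100], trace[400] = 0.9, -0.8
        print(detect_touch_events(trace))  # [100, 400]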

    Soft Gloves: A Review on Recent Developments in Actuation, Sensing, Control and Applications

    Interest in soft gloves, both robotic and haptic, has grown enormously over the past decade due to their inherent compliance, which makes them particularly suitable for direct interaction with the human hand. Robotic soft gloves have been developed for hand rehabilitation, for assistance with activities of daily living (ADLs), or sometimes for both. Haptic soft gloves may be applied in virtual reality (VR) applications, to give sensory feedback in combination with prostheses, or to control robots. This paper presents an updated review of the state of the art of soft gloves, with a particular focus on actuation, sensing, and control, combined with a detailed analysis of the devices according to their application field. The review is organized on two levels: a prospective review highlights the main trends in soft glove development and applications, and an analytical review performs an in-depth analysis of the technical solutions developed and implemented in the reviewed scientific research. Minor evaluations complement the analysis, such as a synthetic investigation of the main results of the clinical studies and trials reported in the literature that involve soft gloves.