
    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthetic user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through the continuous use of the device. To achieve such a result, different aspects must be considered for making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees’ quality of life using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has been recently introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree of Freedom (DoF) system and to fit all users’ needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. 
However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Second, I developed a vibrotactile system to implement haptic feedback, restoring proprioception and creating a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore tactile sensation and connect the user with the external world. This closed-loop control between EMG and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that strongly impacts amputees’ daily lives. For each of these three activities: (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees was collected, in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects’ ability to use the prosthesis by means of the F1 score (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several tested methods for Pattern Recognition, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1 score (99%, robustness), whereas the minimum number of electrodes needed for its functioning was determined to be 4 in the conducted offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency).
Finally, the online implementation allowed subjects to simultaneously control the Hannes prosthesis DoFs in a bioinspired, human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). Such results demonstrated an improvement in the controllability of the system, with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the implemented closed-loop approach improves the robustness and efficiency of prosthetic control. The bidirectional communication between the user and the prosthesis is capable of restoring the lost sensory functionality, with promising implications for direct translation into clinical practice.
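
The offline comparison above hinges on the F1 score computed over movement classes. A minimal sketch of a macro-averaged F1 score for an EMG gesture decoder follows; the class labels and decoder outputs are invented for illustration, and this is not the thesis code:

```python
# Macro-averaged F1 score for a multi-class EMG gesture classifier,
# the offline metric used to compare pattern-recognition algorithms.

def f1_per_class(y_true, y_pred, label):
    """F1 for one class, treating it as positive and all others as negative."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores."""
    labels = sorted(set(y_true))
    return sum(f1_per_class(y_true, y_pred, l) for l in labels) / len(labels)

# Hypothetical decoder output over a window of EMG frames:
truth = ["rest", "open", "close", "rotate", "open", "close"]
preds = ["rest", "open", "close", "rotate", "open", "open"]
print(round(macro_f1(truth, preds), 3))  # → 0.867
```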

    Internet of Things for beyond-the-laboratory prosthetics research

    Research on upper-limb prostheses is typically laboratory-based. Evidence indicates that research has not yet led to prostheses that meet user needs. Inefficient communication loops between users, clinicians and manufacturers limit the amount of quantitative and qualitative data that researchers can use in refining their innovations. This paper offers a first demonstration of an alternative paradigm by which remote, beyond-the-laboratory prosthesis research according to user needs is feasible. Specifically, the proposed Internet of Things setting allows remote data collection, real-time visualization and prosthesis reprogramming through Wi-Fi and a commercial cloud portal. Via a dashboard, the user can adjust the configuration of the device and append contextual information to the prosthetic data. We evaluated this demonstrator in real-time experiments with three able-bodied participants. Results demonstrate the potential of contextual data collection and system updates through the internet, which may provide real-life data for algorithm training and reduce the complexity of send-home trials. This article is part of the theme issue ‘Advanced neurotechnologies: translating innovation for health and well-being’.
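
The kind of payload such a setting exchanges can be illustrated with a short sketch: one sample of prosthesis data bundled with user-appended context and serialized as JSON for a cloud portal. All field names here are hypothetical, not the paper's actual schema:

```python
# Illustrative sketch only: packaging prosthesis telemetry with
# user-entered contextual tags as a JSON payload.
import json
import time

def make_telemetry(emg_rms, grip, context_tags):
    """Bundle one sample of prosthesis data with contextual metadata."""
    return {
        "timestamp": time.time(),  # acquisition time (epoch seconds)
        "emg_rms": emg_rms,        # per-channel RMS values
        "grip": grip,              # current grip configuration
        "context": context_tags,   # user-appended context, e.g. activity
    }

msg = make_telemetry([0.12, 0.08], "power", {"activity": "cooking"})
payload = json.dumps(msg)  # ready to send over Wi-Fi to a cloud endpoint
```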

    Grasp Pre-shape Selection by Synthetic Training: Eye-in-hand Shared Control on the Hannes Prosthesis

    We consider the task of object grasping with a prosthetic hand capable of multiple grasp types. In this setting, communicating the intended grasp type often requires a high user cognitive load, which can be reduced by adopting shared-autonomy frameworks. Among these, so-called eye-in-hand systems automatically control the hand pre-shaping before the grasp, based on visual input coming from a camera on the wrist. In this paper, we present an eye-in-hand learning-based approach for hand pre-shape classification from RGB sequences. Unlike previous work, we design the system to support grasping each considered object part with a different grasp type. In order to overcome the lack of data of this kind and reduce the need for tedious data collection sessions for training the system, we devise a pipeline for rendering synthetic visual sequences of hand trajectories. We develop a sensorized setup to acquire real human grasping sequences for benchmarking and show that, when compared on practical use cases, models trained with our synthetic dataset achieve better generalization performance than models trained on real data. We finally integrate our model on the Hannes prosthetic hand and show its practical effectiveness. We make publicly available the code and dataset to reproduce the presented results.
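
One simple way to turn per-frame classifier outputs over a sequence into a single pre-shape command is majority voting. This is an illustrative sketch of that aggregation step only, not the paper's learning-based pipeline, and the grasp-type labels are invented:

```python
# Hypothetical illustration: aggregating per-frame grasp-type predictions
# over an RGB sequence into one pre-shape command by majority vote.
from collections import Counter

def aggregate_preshape(frame_predictions):
    """frame_predictions: list of grasp-type labels, one per video frame.
    Returns the most frequent label across the sequence."""
    return Counter(frame_predictions).most_common(1)[0][0]

frames = ["power", "pinch", "power", "power", "lateral"]
print(aggregate_preshape(frames))  # → power
```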

    An Asynchronous P300-Based Brain-Computer Interface Web Browser for Severely Disabled People

    This paper presents an electroencephalographic (EEG) P300-based brain–computer interface (BCI) Internet browser. The system uses the “odd-ball” row-column paradigm for generating the P300 evoked potentials on the scalp of the user, which are immediately processed and translated into web browser commands. There were previous approaches for controlling a BCI web browser. However, to the best of our knowledge, none of them focused on an assistive context, failing to test their applications with a suitable number of end users. In addition, all of them were synchronous applications, where it was necessary to introduce a “read-mode” command in order to avoid continuous command selection. Thus, the aim of this study is twofold: 1) to test our web browser with a population of multiple sclerosis (MS) patients in order to assess the usefulness of our proposal in meeting their daily communication needs; and 2) to overcome the aforementioned limitation by adding a threshold that discerns between control and non-control states, allowing the user to calmly read the web page without undesirable selections. The browser was tested with sixteen MS patients and five healthy volunteers. Both quantitative and qualitative metrics were obtained. MS participants reached an average accuracy of 84.14%, whereas 95.75% was achieved by control subjects. Results show that MS patients can successfully control the BCI web browser, improving their personal autonomy.
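
The control/non-control thresholding idea can be sketched in a few lines: the classifier scores every candidate command each stimulation round, and a selection is issued only when the best score clears a threshold, so the user can read without triggering commands. The scores, command names, and threshold value below are invented for illustration:

```python
# Sketch of asynchronous selection with a control-state threshold.

def select_command(scores, threshold=2.5):
    """scores: dict mapping command -> classifier score for one round.
    Returns the chosen command, or None for the non-control (reading) state."""
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

print(select_command({"back": 0.4, "scroll": 3.1, "link1": 1.2}))  # → scroll
print(select_command({"back": 0.4, "scroll": 1.1, "link1": 1.2}))  # → None
```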

    A brain-computer interface with vibrotactile biofeedback for haptic information

    Background. It has been suggested that Brain-Computer Interfaces (BCI) may one day be suitable for controlling a neuroprosthesis. For closed-loop operation of BCI, a tactile feedback channel that is compatible with neuroprosthetic applications is desired. Operation of an EEG-based BCI using only vibrotactile feedback, a commonly used method to convey haptic senses of contact and pressure, is demonstrated with a high level of accuracy. Methods. A Mu-rhythm based BCI using a motor imagery paradigm was used to control the position of a virtual cursor. The cursor position was shown visually as well as transmitted haptically by modulating the intensity of a vibrotactile stimulus to the upper limb. A total of six subjects operated the BCI in a two-stage targeting task, receiving only vibrotactile biofeedback of performance. The location of the vibration was also systematically varied between the left and right arms to investigate location-dependent effects on performance. Results and Conclusion. Subjects are able to control the BCI using only vibrotactile feedback with an average accuracy of 56% and as high as 72%. These accuracies are significantly higher than the 15% predicted by random chance if the subject had no voluntary control of their Mu-rhythm. The results of this study demonstrate that vibrotactile feedback is an effective biofeedback modality to operate a BCI using motor imagery. In addition, the study shows that placement of the vibrotactile stimulation on the biceps ipsilateral or contralateral to the motor imagery introduces a significant bias in the BCI accuracy. This bias is consistent with a drop in performance generated by stimulation of the contralateral limb. Users demonstrated the capability to overcome this bias with training.

    Prototypical Arm Motions from Human Demonstration for Upper-Limb Prosthetic Device Control

    Controlling a complex upper limb prosthesis, akin to a healthy arm, is still an open challenge due to the inadequate number of inputs available to amputees. Designs have therefore largely focused on a limited number of controllable degrees of freedom, developing complex hand and grasp functionality rather than the wrist. This thesis investigates joint coordination based on human demonstrations that aims to vastly simplify the controls of wrist, elbow-wrist, and shoulder-elbow-wrist devices. The wide range of motions performed by the human arm during daily tasks makes it desirable to find representative subsets to reduce the dimensionality of these movements for a variety of applications, including the design and control of robotic and prosthetic devices. Here I present the results of an extensive human subjects study and two methods that were used to obtain representative categories of arm use that span naturalistic motions during activities of daily living. First, I sought to identify sets of prototypical upper-limb motions that are functions of a single variable, allowing, for instance, an entire prosthetic or robotic arm to be controlled with a single input from a user, along with a means to select between motions for different tasks. Second, I decouple the orientation from the location of the hand and analyze the hand location in three ways and the orientation in three reference frames. Both of these analyses are an application of data-driven approaches that reduce the wide range of hand and arm use to a smaller representative set. Together these provide insight into our arm usage in daily life and inform an implementation in prosthetic or robotic devices without the need for additional hardware. To demonstrate the control efficacy of prototypical arm motions in upper-limb prosthetic devices, I developed an immersive virtual reality environment where able-bodied participants tested out different devices and controls.
I coined the term "trajectory control" for this prototypical arm motion control and found that, as device complexity increased from a 3-DOF wrist to a 4-DOF elbow-wrist and a 7-DOF shoulder-elbow-wrist device, it enabled users to complete tasks faster with a more intuitive interface and without additional body compensation, while featuring better movement cosmesis than standard controls.
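
The single-input idea behind trajectory control can be sketched as interpolation along a stored prototypical trajectory: one scalar from the user drives all joints simultaneously. The joint values below are hypothetical, and this is not the thesis implementation:

```python
# Sketch of single-variable "trajectory control": a demonstrated multi-joint
# motion is stored as waypoints, and one input (phase in [0, 1]) drives all
# joints along it simultaneously.

def interpolate_pose(waypoints, phase):
    """waypoints: list of joint-angle tuples sampled along the motion;
    phase: scalar in [0, 1] selecting progress along the trajectory."""
    phase = min(max(phase, 0.0), 1.0)
    pos = phase * (len(waypoints) - 1)
    i = int(pos)
    if i >= len(waypoints) - 1:
        return waypoints[-1]
    frac = pos - i
    # Linear interpolation between the two neighbouring waypoints
    return tuple(a + frac * (b - a) for a, b in zip(waypoints[i], waypoints[i + 1]))

# Hypothetical 3-DOF wrist trajectory: (flexion, deviation, rotation) degrees
traj = [(0, 0, 0), (20, 5, 30), (45, 10, 90)]
print(interpolate_pose(traj, 0.25))  # → (10.0, 2.5, 15.0)
```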

    A Human-Centric Metaverse Enabled by Brain-Computer Interface: A Survey

    The growing interest in the Metaverse has generated momentum for members of academia and industry to innovate toward realizing the Metaverse world. The Metaverse is a unique, continuous, and shared virtual world where humans embody a digital form within an online platform. Through a digital avatar, Metaverse users should have a perceptual presence within the environment and can interact with and control the virtual world around them. Thus, a human-centric design is a crucial element of the Metaverse. The human users are not only the central entity but also the source of multi-sensory data that can be used to enrich the Metaverse ecosystem. In this survey, we study the potential applications of Brain-Computer Interface (BCI) technologies that can enhance the experience of Metaverse users. By directly communicating with the human brain, the most complex organ in the human body, BCI technologies hold the potential for the most intuitive human-machine system, operating at the speed of thought. BCI technologies can enable various innovative applications for the Metaverse through this neural pathway, such as user cognitive state monitoring, digital avatar control, virtual interactions, and imagined speech communications. This survey first outlines the fundamental background of the Metaverse and BCI technologies. We then discuss the current challenges of the Metaverse that can potentially be addressed by BCI, such as motion sickness when users experience virtual environments or the negative emotional states of users in immersive virtual applications. After that, we propose and discuss a new research direction called Human Digital Twin, in which digital twins can create an intelligent and interactable avatar from the user's brain signals. We also present the challenges and potential solutions in synchronization and communication between virtual and physical entities in the Metaverse.

    Polarity Sensitivity as a Potential Correlate of Neural Degeneration in Cochlear Implant Users

    Cochlear implant (CI) performance varies dramatically between subjects. Although the causes of this variability remain unclear, the electrode-neuron interface is thought to play an important role. Here we evaluate the contribution of two parameters of this interface to the perception of CI listeners: the electrode-to-modiolar wall distance (EMD), estimated from cone-beam computed tomography (CT) scans, and a measure of neural health. Since there is no objective way to quantify neural health in CI users, we measure stimulus polarity sensitivity, which is assumed to be related to neural degeneration, and investigate whether it also correlates with subjects' performance in speech recognition and spectro-temporal modulation detection tasks. Detection thresholds were measured in fifteen CI users (sixteen ears) for partial-tripolar triphasic pulses having an anodic or a cathodic central phase. The polarity effect was defined as the difference in threshold between cathodic and anodic stimuli. Our results show that both the EMD and the polarity effect correlate with detection thresholds, both across and within subjects, although the within-subject correlations were weak. Furthermore, the mean polarity effect, averaged across all electrodes for each subject, was negatively correlated with performance on a spectro-temporal modulation detection task. In other words, lower cathodic thresholds were associated with better spectro-temporal modulation detection performance, which is also consistent with polarity sensitivity being a marker of neural degeneration. Implications for the design of future subject-specific fitting strategies are discussed.
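
The two quantities at the core of the analysis, the per-electrode polarity effect (cathodic minus anodic threshold) and its subject-level correlation with performance, can be sketched as follows. All thresholds and scores below are invented toy data, not the study's measurements:

```python
# Sketch: polarity effect per electrode, and its Pearson correlation with
# a behavioural score across subjects.
from statistics import mean

def polarity_effect(cathodic, anodic):
    """Per-electrode threshold difference; lists are matched by electrode."""
    return [c - a for c, a in zip(cathodic, anodic)]

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return cov / var

# Hypothetical subjects: mean polarity effect vs. modulation-detection score
effects = [mean(polarity_effect(c, a)) for c, a in [
    ([41, 43], [40, 40]), ([39, 40], [40, 41]), ([44, 45], [40, 41])]]
scores = [0.6, 0.9, 0.5]
print(round(pearson_r(effects, scores), 2))  # → -0.99 (negative, as reported)
```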