
    Biomechatronics: Harmonizing Mechatronic Systems with Human Beings

    This eBook provides a comprehensive treatise on modern biomechatronic systems centred around human applications. Particular emphasis is given to exoskeleton designs for assistance and training, with advanced interfaces for human-machine interaction. Some of these designs are validated with experimental results, which the reader will find informative as building blocks for designing such systems. This eBook is ideally suited to researchers in the biomechatronics area working on bio-feedback applications, or to those involved in high-end research on man-machine interfaces. It may also serve as a textbook for biomechatronic design at the postgraduate level.

    The role of sound in robot-assisted hand function training post-stroke

    90% of all stroke survivors suffer from a hand paresis, which remains chronic in 30-40% of all cases. Currently, there is increasing research interest in neurology and technology in the effectiveness of robot-assisted therapies; robotic training is considered especially promising for patients suffering from severe limitations. Commonly, rehabilitation robots consist of a mechanical part and a virtual training environment with a graphical user interface, audio-visual feedback, sound, and music. So far, the effects of the sound and music embedded within these scenarios have not been evaluated, in particular with regard to their possible influence on motivation, motor execution, motor learning, and the whole recovery process. This thesis investigates the role of sound in robot-assisted hand function training post-stroke. The main goals of this work are 1) to explore the potential of sound and music for robotic hand function training post-stroke, 2) to develop specified sound and music applications for this context, 3) to examine whether stroke patients benefit from these specified applications, 4) to gain a better understanding of sound- and music-induced mechanisms with therapeutic potential for robotic therapy, and 5) to inform emerging treatment approaches about effective applications of sound or music in robotic post-stroke motor training.

    Virtual reality-based upper extremity stroke rehabilitation system

    Some studies suggest that using Virtual Reality technologies as an assistive technology, in combination with conventional therapies, can improve outcomes in post-stroke rehabilitation. Despite the wealth of ongoing research aimed at building a virtual reality-based system for upper extremity rehabilitation, there remains a strong need for a training platform that provides whole-arm rehabilitation. To be practical, such a system should ideally be low cost (affordable for a typical individual or household) and require minimal therapist supervision. This research outlines some of the applications of virtual reality that have undergone clinical trials with patients suffering from upper extremity functional motor deficits. Furthermore, this thesis presents the design, development, implementation, and feasibility testing of a Virtual Reality-based Upper Extremity Stroke Rehabilitation System. Motion sensing technology has been used to capture real-time movement data of the upper extremity, and a virtual reality glove has been used to track the flexion/extension of the fingers. A virtual room has been designed with an avatar of the human arm to allow a variety of training tasks to be accomplished. An interface has been established to feed the real-time data from the hardware into a virtual scene running on a PC. Three training scenes depicting real-world scenarios have been designed and used to analyse the motion patterns of users executing tasks in the virtual environment simulation. A usability study, in which healthy volunteers performed the training tasks, has been undertaken to evaluate ease of use, ease of learning, and motivation in the virtual environment. Moreover, at a cost of approximately £2,725, the system could provide home-based rehabilitation of the whole arm, augmenting conventional therapy. Statistical analysis of the data and evaluation studies using self-report methodologies suggest the system is feasible for post-stroke rehabilitation in a home environment.
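    As a sketch of how such an interface might couple the sensor streams to the virtual scene, the following hypothetical Python fragment maps a tracked wrist position and glove flexion readings onto an avatar pose and checks completion of a reach-and-grasp training task. All names, units, and thresholds here are illustrative assumptions, not the thesis implementation:

```python
import math

def update_avatar(wrist_xyz, finger_flexion):
    """Map raw sensor readings to an avatar pose.

    wrist_xyz      : (x, y, z) position from the motion sensor, metres
    finger_flexion : five flexion values in [0, 1] from the data glove
    """
    return {
        "wrist": wrist_xyz,
        # clamp glove readings into the valid flexion range
        "fingers": [min(max(f, 0.0), 1.0) for f in finger_flexion],
    }

def target_reached(pose, target_xyz, grasp_threshold=0.7, radius=0.05):
    """A training task succeeds when the wrist is within `radius` metres
    of the target and all fingers are flexed past `grasp_threshold`."""
    dist = math.dist(pose["wrist"], target_xyz)
    return dist < radius and all(f > grasp_threshold for f in pose["fingers"])

pose = update_avatar((0.30, 0.10, 0.52), [0.9, 0.8, 0.85, 0.9, 0.75])
print(target_reached(pose, (0.32, 0.11, 0.50)))  # True: close and grasping
```

    A real system would run this mapping in a loop at the sensor sampling rate and render the resulting pose in the virtual room.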

    Human-Machine Interfaces using Distributed Sensing and Stimulation Systems

    As technology moves towards more natural human-machine interfaces (e.g. bionic limbs, teleoperation, virtual reality), it is necessary to develop a sensory feedback system in order to foster embodiment and achieve better immersion in the control system. Contemporary feedback interfaces presented in research use few sensors and stimulation units to feed back at most two discrete variables (e.g. grasping force and aperture), whereas the human sense of touch relies on a distributed network of mechanoreceptors providing a wide bandwidth of information. To provide this type of feedback, it is necessary to develop a distributed sensing system that can extract a wide range of information during the interaction between the robot and the environment, as well as a distributed feedback interface to deliver that information to the user. This thesis proposes the development of a distributed sensing system (e-skin) to acquire tactile sensation, a first integration of the distributed sensing system on a robotic hand, the development of a sensory feedback system that comprises the distributed sensing system and a distributed stimulation system, and finally the implementation of deep learning methods for the classification of tactile data. Its core focus is the development and testing of a sensory feedback system based on the latest distributed sensing and stimulation techniques. To this end, the thesis comprises two introductory chapters that describe the state of the art in the field, the objectives, the methodology used, and the contributions, as well as six studies that tackle the development of human-machine interfaces.
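    To illustrate the kind of information a distributed tactile array makes available, the sketch below extracts simple spatial features (total force, contact area, pressure-weighted centroid) from one e-skin frame, the sort of pre-processing that might precede classification of tactile data. This is an invented example, not code from the thesis:

```python
import numpy as np

def tactile_features(frame):
    """frame: 2-D array of taxel pressures for one e-skin snapshot.
    Returns (total force, contact area, centroid row, centroid col)."""
    frame = np.asarray(frame, dtype=float)
    total = frame.sum()
    area = int((frame > 0).sum())          # number of active taxels
    if total == 0:
        return 0.0, 0, None, None          # no contact in this frame
    rows, cols = np.indices(frame.shape)
    cr = float((rows * frame).sum() / total)   # pressure-weighted centroid
    cc = float((cols * frame).sum() / total)
    return total, area, cr, cc

frame = [[0, 0, 0, 0],
         [0, 2, 4, 0],
         [0, 1, 3, 0],
         [0, 0, 0, 0]]
total, area, cr, cc = tactile_features(frame)
print(total, area, cr, cc)  # 10.0 4 1.4 1.7
```

    A single force sensor would report only `total`; the distributed array additionally localises and shapes the contact, which is the extra bandwidth the abstract refers to.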

    Machine learning-based dexterous control of hand prostheses

    Upper-limb myoelectric prostheses are controlled using muscle activity recorded on the skin surface by electromyography (EMG). Intuitive prosthetic control can be achieved by deploying statistical and machine learning (ML) tools to decipher the user's movement intent from EMG signals. This thesis proposes various means of advancing the capabilities of non-invasive, ML-based control of myoelectric hand prostheses. Two main directions are explored, namely classification-based hand grip selection and proportional finger position control using regression methods. Several practical aspects are considered with the aim of maximising the clinical impact of the proposed methodologies, which are evaluated with offline analyses as well as real-time experiments involving both able-bodied and transradial amputee participants. It is generally accepted that the EMG signal may not always be a reliable source of control information for prostheses, mainly due to its stochastic and non-stationary properties. One particular issue associated with the use of surface EMG signals for upper-extremity myoelectric control is the limb position effect, that is, the lack of decoding generalisation under novel arm postures. To address this challenge, it is proposed to make concurrent use of EMG sensors and inertial measurement units (IMUs). It is demonstrated that this can lead to a significant improvement in both classification accuracy (CA) and real-time prosthetic control performance. Additionally, the relationship between surface EMG and inertial measurements is investigated, and it is found that these modalities are partially related because they reflect different manifestations of the same underlying phenomenon, namely muscular activity. In the field of upper-limb myoelectric control, the linear discriminant analysis (LDA) classifier has arguably been the most popular choice for movement intent decoding.
    This is mainly attributable to its ease of implementation, low computational requirements, and acceptable decoding performance. Nevertheless, this method makes a strong fundamental assumption, namely that data observations from different classes share a common covariance structure. Although this assumption is often violated in practice, the performance of the method has been found comparable to that of more sophisticated algorithms. In this thesis, it is proposed to remove this assumption by making use of general class-conditional Gaussian models and appropriate regularisation to avoid overfitting. By performing an exhaustive analysis on benchmark datasets, it is demonstrated that the proposed approach based on regularised discriminant analysis (RDA) can offer an impressive increase in decoding accuracy. By combining RDA classification with a novel confidence-based rejection policy that aims to minimise the rate of unintended hand motions, it is shown to be feasible to attain robust myoelectric grip control of a prosthetic hand using a single pair of surface EMG-IMU sensors. Most present-day commercial prosthetic hands offer the mechanical ability to support individual digit control; however, classification-based methods can only produce pre-defined grip patterns, which results in prosthesis under-actuation. Although classification-based grip control provides a great advantage over conventional strategies, it is far from intuitive and natural to the user. A potential way of approaching the level of dexterity enjoyed by the human hand is via continuous and individual control of multiple joints. To this end, an exhaustive analysis is performed on the feasibility of reconstructing multidimensional hand joint angles from surface EMG signals.
    A supervised method based on the eigenvalue formulation of multiple linear regression (MLR) is then proposed to simultaneously reduce the dimensionality of the input and output variables, and its performance is compared to that of typically used unsupervised methods, which may produce suboptimal results in this context. An experimental paradigm is finally designed to evaluate the efficacy of the proposed finger position control scheme during real-time prosthesis use. This thesis provides insight into the capacity of a range of computational methods for non-invasive myoelectric control. It contributes towards developing intuitive interfaces for dexterous control of multi-articulated prosthetic hands by transradial amputees.
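    The RDA-plus-rejection idea described above can be sketched as follows: each class keeps its own Gaussian covariance, blended with the pooled within-class covariance and shrunk towards a scaled identity to avoid overfitting, and any prediction whose maximum posterior falls below a confidence threshold is rejected as "no motion". The regularisation scheme, hyperparameter values, and synthetic data below are illustrative, not the thesis's exact formulation:

```python
import numpy as np

class RDAClassifier:
    def __init__(self, lam=0.5, gamma=0.1):
        self.lam, self.gamma = lam, gamma  # regularisation strengths

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        n, d = X.shape
        covs = {c: np.cov(X[y == c], rowvar=False) for c in self.classes_}
        # pooled within-class covariance
        pooled = sum((np.sum(y == c) - 1) * covs[c]
                     for c in self.classes_) / (n - len(self.classes_))
        self.means_, self.covs_, self.priors_ = {}, {}, {}
        for c in self.classes_:
            # blend class covariance with the pooled one (lam), then
            # shrink towards a scaled identity (gamma)
            S = (1 - self.lam) * covs[c] + self.lam * pooled
            S = (1 - self.gamma) * S + self.gamma * (np.trace(S) / d) * np.eye(d)
            self.means_[c] = X[y == c].mean(axis=0)
            self.covs_[c] = S
            self.priors_[c] = np.mean(y == c)
        return self

    def posteriors(self, x):
        # Gaussian class-conditional log densities plus log priors
        scores = []
        for c in self.classes_:
            diff = x - self.means_[c]
            maha = diff @ np.linalg.solve(self.covs_[c], diff)
            _, logdet = np.linalg.slogdet(self.covs_[c])
            scores.append(np.log(self.priors_[c]) - 0.5 * (logdet + maha))
        s = np.array(scores)
        p = np.exp(s - s.max())            # stable softmax over classes
        return p / p.sum()

    def predict(self, x, reject_below=0.9):
        """Class label, or None when the decision is too uncertain."""
        p = self.posteriors(x)
        return self.classes_[p.argmax()] if p.max() >= reject_below else None

# Illustrative use on two synthetic, well-separated "grip" classes:
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
clf = RDAClassifier().fit(X, y)
print(clf.predict(np.array([0.0, 0.0])))  # confident: class 0
print(clf.predict(np.array([2.5, 2.5])))  # ambiguous midpoint: rejected
```

    In a prosthetic context, a rejected sample simply leaves the hand in its current state, which is the sense in which rejection minimises unintended motions.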

    Restoring Fine Motor Prosthetic Hand Control via Peripheral Neural Technology

    Losing a limb can drastically alter a person's way of life and, in some cases, brings great financial and emotional burdens. In particular, an upper-limb amputation means losing the ability to perform many daily activities that are straightforward with intact hands. Prosthesis technology has advanced significantly in the past decade to replicate the mechanical complexity of the human hand. However, current commercial user-to-prosthesis interfaces fail to provide users with full intuitive control over the many functionalities advanced prosthetic hands can offer. Research into new interfaces for better motor control has been on the rise, focusing on tapping directly into the peripheral nervous system. The aim of this work is to characterize and validate the properties of a novel peripheral interface, the Regenerative Peripheral Nerve Interface (RPNI), to improve fine motor skills for prosthetic hand control. The first study characterizes the use of RPNI signals for continuous hand control in non-human primates. In two rhesus macaques, continuous finger movement was reconstructed offline with an average correlation of ρ = 0.87 and a root mean squared error (RMSE) of 0.12 between actual and predicted position. During real-time control, neural control was slightly slower than physical hand control but maintained an average target hit success rate of 96.7%. The second study presents the viability of the RPNI in humans who have undergone upper-limb amputation. Three participants with transradial amputations, P1, P2 and P3, underwent surgical implantation of nine, three, and four RPNIs, respectively, for the treatment of neuroma pain. Ultrasound demonstrated strong contractions of P1's and P2's median RPNIs during flexion of the phantom thumb, and of P1's ulnar RPNIs during small-finger flexion.
    In P1, the median RPNI and ulnar RPNIs produced electromyography (EMG) signals with signal-to-noise ratios (SNR) of 4.62 and 3.80, respectively, averaged across three recording sessions. In P2, the median RPNI and ulnar RPNI had average SNRs of 107 and 35.9, respectively, while P3's median RPNI and ulnar RPNIs had average SNRs of 22.3 and 19.4, respectively. The final study characterizes the capability of RPNI signals to predict continuous finger position in human subjects. Two participants, P2 and P3, successfully hit targets during a center-out target task with 92.4 ± 2.3% accuracy while controlling RPNI-driven one-degree-of-freedom (DOF) finger movements; non-RPNI-driven finger movement had comparable accuracy and performance. Without recalibrating parameter coefficients, no decreasing trend in motor performance was seen for one-DOF finger control across 300 days for P2 and 40 days for P3, suggesting that RPNIs can generate robust control signals from day to day. Lastly, using RPNI-driven control, P2 and P3 successfully manipulated a two-DOF virtual and physical thumb with 96.4 ± 2.5% accuracy. These three studies demonstrate that: (1) RPNIs provided robust continuous control of one-DOF hand movements in non-human primates, an important step towards human translation; (2) RPNIs were safely implanted in three participants, showing evidence of contraction and generation of EMG; and (3) in two participants, RPNIs provided continuous control of one-DOF finger movements and two-DOF thumb movements. The results presented in this dissertation suggest RPNIs may be a viable option to advance peripheral nerve interfaces into clinical reality and enhance neuroprosthetic technology for people with limb loss.
    PhD, Biomedical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
    https://deepblue.lib.umich.edu/bitstream/2027.42/149816/1/philipv_1.pd
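    The two offline decoding metrics quoted above, the correlation ρ and the RMSE between actual and predicted position traces, can be computed as follows. The sample traces here are invented for illustration only:

```python
import math

def correlation(a, b):
    """Pearson correlation between two equal-length position traces."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def rmse(a, b):
    """Root mean squared error between the two traces."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# made-up normalized finger-position traces (0 = extended, 1 = flexed)
actual    = [0.0, 0.20, 0.50, 0.80, 1.00, 0.70, 0.30]
predicted = [0.1, 0.25, 0.45, 0.75, 0.90, 0.75, 0.35]
print(round(correlation(actual, predicted), 3))
print(round(rmse(actual, predicted), 3))
```

    A decoder that tracks the true trajectory closely yields ρ near 1 and a small RMSE, which is how figures such as ρ = 0.87 and RMSE = 0.12 should be read.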

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.

    Low-Cost Objective Measurement of Prehension Skills

    This thesis explores the feasibility of using low-cost, portable motion capture tools for the quantitative assessment of sequential 'reach-to-grasp' and repetitive 'finger-tapping' movements in neurologically intact and neurologically impaired populations, in both clinical and non-clinical settings. The research extends the capabilities of an existing optoelectronic postural sway assessment tool (PSAT) into a more general Boxed Infrared Gross Kinematic Assessment Tool (BIGKAT) for evaluating prehensile control of hand movements outside the laboratory environment. The contributions of this work include the validation of BIGKAT against a high-end motion capture system (Optotrak) for accuracy and precision in tracking kinematic data. BIGKAT was subsequently applied to kinematically resolve prehensile movements, where concurrent recordings with Optotrak demonstrated similar statistically significant results for five kinematic measures: two spatial measures (Maximum Grip Aperture – MGA; Peak Velocity – PV) and three temporal measures (Movement Time – MT; Time to MGA – TMGA; Time to PV – TPV). Regression analysis further established a strong relationship between BIGKAT and Optotrak, with nearly unity slope and low y-intercept values, indicating that BIGKAT performs reliably and produces results statistically comparable to Optotrak. BIGKAT was also applied to quantitatively assess bradykinesia in Parkinson's patients during finger-tapping movements, demonstrating significant differences between PD patients and healthy controls in key kinematic measures and paving the way for potential clinical applications. The study also characterized kinematic differences in prehensile control under different sensory environments using a Virtual Reality head-mounted display and a finger tracking system (the Leap Motion), emphasizing the importance of sensory information during hand movements.
    This highlighted the role of vision of the hand and of haptic feedback during the initial and final phases of the prehensile movement trajectory. The research also explored markerless pose estimation using deep learning tools, specifically DeepLabCut (DLC), for reach-to-grasp tracking. Despite the limitations COVID-19 placed on data collection, the study showed promise in scaling reaching and grasping components, but highlighted the need for more diverse datasets to resolve kinematic differences accurately. To facilitate the assessment of prehension activities, an Event Detection Tool (EDT) was developed, providing temporal measures of reaction time, reaching time, transport time, and movement time during object grasping and manipulation. Though the initial pilot data were limited, the EDT holds potential for providing insight into disease progression and movement disorder severity. Overall, this work contributes to the advancement of low-cost, portable solutions for quantitatively assessing upper-limb movements, demonstrating the potential for wider clinical use and guiding future research in the field of human movement analysis.
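    To make the five kinematic measures concrete, the sketch below derives MGA, PV, MT, TMGA, and TPV from sampled wrist-speed and grip-aperture traces. The traces and the sampling interval are invented for the example; this is not the BIGKAT implementation:

```python
def kinematic_measures(speed, aperture, dt):
    """speed: wrist speed per sample (m/s); aperture: thumb-index
    distance per sample (m); dt: sampling interval (s)."""
    mga = max(aperture)                 # Maximum Grip Aperture
    pv = max(speed)                     # Peak Velocity
    mt = (len(speed) - 1) * dt          # Movement Time (onset to end)
    tmga = aperture.index(mga) * dt     # Time to MGA
    tpv = speed.index(pv) * dt          # Time to PV
    return {"MGA": mga, "PV": pv, "MT": mt, "TMGA": tmga, "TPV": tpv}

# made-up reach-to-grasp trial sampled at 10 Hz
speed    = [0.0, 0.3, 0.8, 1.1, 0.9, 0.4, 0.0]          # m/s
aperture = [0.02, 0.04, 0.07, 0.09, 0.10, 0.06, 0.03]   # m
m = kinematic_measures(speed, aperture, dt=0.1)
print(m)
```

    Comparing such measures between a low-cost tracker and a reference system like Optotrak, trial by trial, is what the regression validation described above amounts to.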