
    Multimodal human hand motion sensing and analysis - a review


    The role of morphology of the thumb in anthropomorphic grasping: a review

    The unique musculoskeletal structure of the human hand gives it greater dexterity in grasping and manipulating a repertoire of objects than that of non-human primates. It is widely accepted that the orientation and position of the thumb play an important role in this characteristic behavior. There have been numerous attempts to develop anthropomorphic robotic hands, with varying levels of success. Nevertheless, although these hands can grasp objects successfully, their manipulation ability still needs improvement. An appropriate model of the thumb is important for manipulating objects against the fingers and maintaining grasp stability. Modeling these complex interactions about the mechanical axes of the joints, and deciding how to incorporate such joints in robotic thumbs, is a challenging task. This article reviews the biomechanics of the human thumb and existing robotic thumb designs to identify opportunities for future anthropomorphic robotic hands.

    Distributed Sensing and Stimulation Systems Towards Sense of Touch Restoration in Prosthetics

    Modern prostheses aim at restoring the functional and aesthetic characteristics of the lost limb. To foster prosthesis embodiment and functionality, it is necessary to restore both volitional control and sensory feedback. Contemporary feedback interfaces presented in research use few sensors and stimulation units and feed back at most two discrete variables (e.g. grasping force and aperture), whereas the human sense of touch relies on a distributed network of mechanoreceptors providing high-fidelity spatial information. To provide this type of feedback in prosthetics, it is necessary to sense tactile information from artificial skin placed on the prosthesis and transmit tactile feedback above the amputation in order to map the interaction between the prosthesis and the environment. This thesis proposes the integration of distributed sensing systems (e-skin) to acquire tactile sensation, and of non-invasive multichannel electrotactile feedback and virtual reality to deliver high-bandwidth information to the user. Its core focus is the development and testing of a closed-loop sensory feedback human-machine interface, based on the latest distributed sensing and stimulation techniques, for restoring the sense of touch in prosthetics. To this end, the thesis comprises two introductory chapters describing the state of the art in the field, the objectives, the methodology used and the contributions, as well as three studies spanning the stimulation-system and sensing-system levels.
    The first study presents the development of a closed-loop compensatory tracking system to evaluate the usability and effectiveness of electrotactile sensory feedback in enabling real-time closed-loop control in prosthetics. It examines and compares the subjects’ adaptive performance and tolerance to random latencies while performing a dynamic control task (i.e. position control) and simultaneously receiving either visual or electrotactile feedback communicating the momentary tracking error. It also reports the minimum time delay that causes an abrupt impairment of the users’ performance. The experimental results show that electrotactile feedback is less prone to performance degradation with longer delays: performance with visual feedback drops faster than with electrotactile feedback as delays increase. Since some delays are inevitable, this is a good indication of the effectiveness of electrotactile feedback in enabling closed-loop control in prosthetics. The second study describes the development of a novel non-invasive, compact multichannel interface for electrotactile feedback, containing a 24-pad electrode matrix with a fully programmable stimulation unit, and investigates the ability of able-bodied subjects to localize electrotactile stimuli delivered through the matrix. Furthermore, it introduces a novel dual-parameter modulation (interleaved frequency and intensity) and compares it to conventional stimulation (same frequency for all pads). In addition, and for the first time, it compares electrotactile stimulation to mechanical stimulation. It also describes the integration of a virtual prosthesis with the developed system, which maps the tactile data collected in real time and feeds it back simultaneously to the user to achieve a better user experience and better object manipulation. The experimental results demonstrate that the proposed interleaved coding substantially improves spatial localization compared to same-frequency stimulation, and show that same-frequency stimulation is equivalent to mechanical stimulation, whereas performance with dual-parameter modulation is significantly better.
    The third study presents the realization of a novel, flexible, screen-printed e-skin based on P(VDF-TrFE) piezoelectric polymers, designed to cover the fingertips and palm of a prosthetic hand (in particular the Michelangelo hand by Ottobock) and an assistive sensorized glove for stroke patients. It also develops a new validation methodology to examine the sensors’ behavior under load. The characterization results show agreement between the expected (modeled) electrical response of each sensor and the measured mechanical (normal) force at the skin surface, which in turn proves that the combination of fabrication and assembly processes was successful. This paves the way to a practical, simplified and reproducible characterization protocol for e-skin patches. In conclusion, by adopting innovative methodologies in sensing and stimulation systems, this thesis advances the overall development of closed-loop sensory feedback human-machine interfaces for restoring the sense of touch in prosthetics. Moreover, this research could lead to high-bandwidth, high-fidelity transmission of tactile information for modern dexterous prostheses, improving the end-user experience and facilitating their acceptance in daily life.
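The dual-parameter (interleaved frequency and intensity) coding described above can be sketched as a simple mapping from pad index and contact force to stimulation parameters. This is an illustrative sketch only: the 24-pad count comes from the abstract, but the frequency range, intensity range and linear force-to-intensity mapping are assumptions, not values from the thesis.

```python
# Hypothetical sketch of interleaved dual-parameter electrotactile coding:
# each electrode pad is driven at its own carrier frequency (spatial cue),
# while pulse intensity encodes the measured contact force.
# All numeric ranges below are illustrative assumptions.

N_PADS = 24                                      # electrode matrix size from the abstract
FREQS_HZ = [20 + 5 * i for i in range(N_PADS)]   # one distinct frequency per pad

def encode(pad, force_newton, f_max=10.0, i_min=0.5, i_max=3.0):
    """Map a pad index and contact force [N] to (frequency_Hz, intensity_mA)."""
    force = max(0.0, min(force_newton, f_max))   # clamp force to the sensor range
    intensity = i_min + (i_max - i_min) * force / f_max  # linear force-to-intensity map
    return FREQS_HZ[pad], round(intensity, 3)
```

In conventional (same-frequency) stimulation, `FREQS_HZ` would instead hold one shared value for all pads; the interleaved scheme gives each pad a distinct carrier to aid localization.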

    Designing Prosthetic Hands With Embodied Intelligence: The KIT Prosthetic Hands

    Hand prostheses should provide functional replacements for lost hands, yet current prosthetic hands are often neither intuitive to control nor easy for amputees to use. Commercially available prostheses are usually controlled through EMG signals triggered by the user to perform grasping tasks. Such EMG-based control requires long training and depends heavily on the robustness of the EMG signals. Our goal is to develop prosthetic hands with semi-autonomous grasping abilities that allow more intuitive control by the user. In this paper, we present the development of prosthetic hands that enable such abilities as first results toward this goal. The developed prostheses provide intelligent mechatronics, including adaptive actuation, multi-modal sensing and on-board computing resources, to enable autonomous and intuitive control. The hands are scalable in size and based on an underactuated mechanism that allows grasps to adapt to the shape of arbitrary objects. They integrate a multi-modal sensor system including a camera and, in the newest version, a distance sensor and an IMU. A resource-aware embedded system for in-hand processing of sensory data and control is included in the palm of each hand. We describe the design of the new version of the hands: the female hand prosthesis with a weight of 377 g, a grasping force of 40.5 N and a closing time of 0.73 s. We evaluate the mechatronics of the hand and its grasping abilities based on the YCB Gripper Assessment Protocol as well as a task-oriented protocol for assessing hand performance in activities of daily living. Further, we show by example the suitability of the multi-modal sensor system for sensory-based, semi-autonomous grasping in daily-life activities. The evaluation demonstrates the merit of the hand concept and of its sensor and in-hand computing systems.

    Fused mechanomyography and inertial measurement for human-robot interface

    Human-Machine Interfaces (HMI) are the technology through which we interact with the ever-increasing quantity of smart devices surrounding us. The fundamental goal of an HMI is to facilitate robot control by uniting a human operator, as the supervisor, with a machine, as the task executor. Sensors, actuators, and onboard intelligence have not reached the point where robotic manipulators may function with complete autonomy, and therefore some form of HMI is still necessary in unstructured environments. These may include environments where direct human action is undesirable or infeasible, and situations where a robot must assist and/or interface with people. Contemporary literature has introduced concepts such as body-worn mechanical devices, instrumented gloves, inertial or electromagnetic motion tracking sensors on the arms, head, or legs, electroencephalographic (EEG) brain activity sensors, electromyographic (EMG) muscular activity sensors and camera-based (vision) interfaces to recognize hand gestures and/or track arm motions for assessment of operator intent and generation of robotic control signals. While these developments offer a wealth of future potential, their utility has been largely restricted to laboratory demonstrations in controlled environments due to issues such as lack of portability and robustness, and an inability to extract operator intent for both arm and hand motion. Wearable physiological sensors hold particular promise for capturing human intent and commands. EMG-based gesture recognition systems in particular have received significant attention in recent literature. As wearable, pervasive devices, they offer benefits over camera or physical input systems in that they neither inhibit the user physically nor constrain the user to a location where the sensors are deployed. Despite these benefits, EMG alone has yet to demonstrate the capacity to recognize both gross movement (e.g. arm motion) and finer grasping (e.g. hand movement).
    As such, many researchers have proposed fusing muscle activity (EMG) and motion tracking (e.g. inertial measurement) to combine arm motion and grasp intent as HMI input for manipulator control. However, such work has arguably reached a plateau, since EMG suffers from interference from environmental factors that degrade the signal over time, demands an electrical connection with the skin, and has not demonstrated the capacity to function outside controlled environments for long periods of time. This thesis proposes a new form of gesture-based interface utilising a novel combination of inertial measurement units (IMUs) and mechanomyography (MMG) sensors. The modular system permits numerous configurations of IMUs to derive body kinematics in real time and uses this to convert arm movements into control signals. Additionally, bands containing six mechanomyography sensors were used to observe the muscular contractions in the forearm generated by specific hand motions. This combination of continuous and discrete control signals allows a large variety of smart devices to be controlled. Several methods of pattern recognition were implemented to provide accurate decoding of the mechanomyographic information, including Linear Discriminant Analysis and Support Vector Machines. Based on these techniques, accuracies of 94.5% and 94.6% respectively were achieved for 12-gesture classification. In real-time tests, accuracies of 95.6% were achieved in 5-gesture classification. It has previously been noted that MMG sensors are susceptible to motion-induced interference; this thesis established that arm pose also changes the measured signal. It therefore introduces a new method of fusing IMU and MMG data to provide a classification that is robust to both of these sources of interference. Additionally, an improvement in orientation estimation and a new orientation estimation algorithm are proposed.
    These improvements to the robustness of the system provide the first solution able to reliably track both motion and muscle activity for extended periods of time for HMI outside a clinical environment. Applications in robot teleoperation in both real-world and virtual environments were explored. With multiple degrees of freedom, robot teleoperation provides an ideal test platform for HMI devices, since it requires a combination of continuous and discrete control signals. The field of prosthetics also represents a unique challenge for HMI applications. Ideally, the sensor suite should be capable of detecting the muscular activity in the residual limb that naturally indicates intent to perform a specific hand pose, and of triggering this pose in the prosthetic device. Dynamic environmental conditions within a socket, such as skin impedance, have delayed the translation of gesture-control systems into prosthetic devices; mechanomyography sensors, however, are unaffected by such issues. There is huge potential for a system like this to be used as a controller as ubiquitous computing systems become more prevalent and the desire for a simple, universal interface increases. Such systems have the potential to significantly improve the quality of life of prosthetic users and others.
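The gesture-classification step described above can be illustrated with a minimal classifier. The sketch below uses a nearest-class-centroid rule as a simple stand-in for the Linear Discriminant Analysis and Support Vector Machine classifiers evaluated in the thesis; the two-channel feature vectors (e.g. per-channel RMS over a signal window from the forearm MMG band) are invented for demonstration.

```python
# Minimal nearest-centroid gesture classifier: a stand-in sketch for the
# LDA / SVM decoders named in the abstract. Each sample is a feature
# vector extracted from the MMG channels; labels are gesture names.

def fit_centroids(samples, labels):
    """Compute the mean feature vector (centroid) for each gesture class."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def classify(x, centroids):
    """Assign x to the class whose centroid is nearest (squared Euclidean)."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda y: dist2(centroids[y]))
```

For example, training on two "rest" and two "fist" feature vectors and then classifying a new window picks the gesture with the closest class mean; real use would involve six MMG channels and richer features.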

    Novel Bidirectional Body - Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthetic user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through the continuous use of the device. To achieve such a result, different aspects must be considered for making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees’ quality of life using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has been recently introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree of Freedom (DoF) system and to fit all users’ needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. 
    However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce a more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system to implement haptic feedback, restoring proprioception and creating a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore tactile sensation, connecting the user with the external world. This closed loop between EMG control and vibration feedback is essential to implementing a Bidirectional Body - Machine Interface that can strongly impact amputees’ daily lives. For each of these three activities: (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object’s stiffness, I performed a study in which data from healthy subjects and amputees was collected, in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects’ ability to use the prosthesis by means of the F1Score parameter (offline) and the Target Achievement Control test, TAC (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several tested methods for Pattern Recognition, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1Score (99%, robustness), and the minimum number of electrodes needed for its functioning was determined in the offline analyses to be four. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency).
    Finally, the online implementation allowed the subject to simultaneously control the DoFs of the Hannes prosthesis in a bioinspired, human-like way. In addition, I performed further tests with the same NLR-based control, endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). These results demonstrated an improvement in the controllability of the system, with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the implemented closed-loop approach improves the robustness and efficiency of prosthetic control. The bidirectional communication between the user and the prosthesis can restore lost sensory functionality, with promising implications for direct translation into clinical practice.
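A minimal sketch of a Non-Linear Logistic Regression decoder of the kind named above: ordinary logistic regression trained on a fixed quadratic expansion of the input features. The quadratic feature map, learning rate, epoch count and toy training data are illustrative assumptions; the thesis' actual EMG features, electrode count and 300 Hz embedded implementation are not reproduced here.

```python
import math

# Sketch of Non-Linear Logistic Regression (NLR): logistic regression on
# a quadratic feature expansion, trained by stochastic gradient descent.
# A binary decoder is shown; multi-class decoding would use one such
# model per class. All constants are illustrative assumptions.

def expand(x):
    """Quadratic feature map: [1, x_i, x_i * x_j] for all i <= j."""
    out = [1.0] + list(x)
    for i in range(len(x)):
        for j in range(i, len(x)):
            out.append(x[i] * x[j])
    return out

def train(samples, labels, lr=0.5, epochs=500):
    """Fit weights by SGD on the logistic (cross-entropy) loss."""
    w = [0.0] * len(expand(samples[0]))
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = expand(x)
            s = sum(wi * zi for wi, zi in zip(w, z))
            p = 1.0 / (1.0 + math.exp(-s))        # sigmoid activation
            g = p - y                              # gradient of the log-loss
            w = [wi - lr * g * zi for wi, zi in zip(w, z)]
    return w

def predict(x, w):
    """Class decision at the 0.5 probability threshold (score > 0)."""
    return 1 if sum(wi * zi for wi, zi in zip(w, expand(x))) > 0 else 0
```

The non-linear expansion is what lets the decoder separate classes that are not linearly separable in the raw EMG feature space, which plain logistic regression cannot do.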

    Synergy-Based Human Grasp Representations and Semi-Autonomous Control of Prosthetic Hands

    Secure and stable grasping with humanoid robot hands is a major challenge. This dissertation therefore derives grasping strategies for robot hands from the observation of human grasping, focusing on the grasping process as a whole. This comprises, on the one hand, the hand and finger trajectories during the grasp and, on the other, the contact points and the force profile between hand and object from first contact to a statically stable grip. Non-linear postural synergies and force synergies of human grasps are presented, which allow the generation of human-like grasp poses and grasp forces. Furthermore, synergy primitives are developed as an adaptable representation of human grasping movements. The grasping strategies learned from humans are applied to the control of robotic prosthetic hands: within a semi-autonomous control scheme, human-like grasping movements are proposed as appropriate to the situation and supervised by the prosthesis user.

    A virtual hand assessment system for efficient outcome measures of hand rehabilitation

    Previously held under moratorium from 1st December 2016 until 1st December 2021. Hand rehabilitation is an extremely complex and critical process in the medical rehabilitation field, mainly because of the highly articulated functionality of the hand. Recent research has focused on employing new technologies, such as robotics and system control, to improve the precision and efficiency of the standard clinical methods used in hand rehabilitation. However, the designs of these devices have either been oriented toward a particular hand injury or have depended heavily on subjective assessment techniques to evaluate progress. These limitations reduce the efficiency of hand rehabilitation devices, providing less effective results for restoring the lost functionalities of dysfunctional hands. In this project, a novel and efficient hand assessment system is produced that can objectively measure the restoration outcome and dynamically evaluate its performance. The proposed system uses a sensorized data glove to measure the multiple ranges of motion of the hand joints, and a virtual reality system to provide an illustrative and safe visual assistance environment that can self-adjust to the subject’s performance. The system implements an original finger performance measurement method for analysing the various hand functionalities. This is achieved by extracting multiple features of the hand digits’ motions, such as speed, consistency of finger movements and stability during hold positions. Furthermore, an advanced data glove calibration method was developed and implemented in order to accurately drive the virtual hand model and calculate the hand’s kinematic movements in compliance with its biomechanical structure. The experimental studies were performed on a controlled group of 10 healthy subjects (aged 25 to 42 years).
    The results showed intra-subject reliability between trials (average cross-correlation ρ = 0.7) and inter-subject repeatability across the subjects’ performance (p > 0.01 for the sessions with real objects, with a few departures in some of the virtual-reality sessions). In addition, the finger performance values proved very efficient in detecting the multiple elements of finger performance, including the load effect on the forearm. Moreover, the electromyography measurements in the virtual-reality sessions showed high sensitivity in detecting the tremor effect (mean power frequency difference on the right extensor digitorum muscle of 176 Hz). The finger performance values for the virtual-reality sessions also have the same average distance as the real-life sessions (RSQ = 0.07). Besides offering an efficient and quantitative evaluation of hand performance, the system proved compatible with different hand rehabilitation techniques, where it can outline the primarily affected parts of the dysfunctional hand. It can also easily be adjusted to comply with the subject’s specifications and clinical hand assessment procedures, autonomously detecting the classification task events and analysing them with high reliability. The developed system is also adaptable to disciplines other than hand rehabilitation, such as ergonomic studies, hand robot control, brain-computer interfaces and other fields involving hand control.
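The intra-subject reliability figure quoted above (average cross-correlation ρ = 0.7) is, in essence, a normalized zero-lag cross-correlation between the trajectory signals of two trials, i.e. a Pearson correlation. The sketch below shows that computation; the signals in the test values are invented for demonstration.

```python
import math

# Normalized zero-lag cross-correlation (Pearson correlation) between two
# equal-length trial signals, as used for intra-subject reliability.
# Returns a value in [-1, 1]; 1 means the trials co-vary perfectly.

def pearson(a, b):
    """Pearson correlation coefficient of signals a and b."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)
```

Averaging this coefficient over all trial pairs for one subject gives the per-subject reliability score reported in the abstract.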

    Kinematic Analysis of Multi-Fingered, Anthropomorphic Robotic Hands

    The ability to perform stable grasping and fine manipulation with a multi-fingered robot hand, with the required precision and dexterity, plays an increasingly important role in applications such as service robots, rehabilitation, humanoid robots, entertainment robots and industry. A number of multi-fingered robotic hands have been developed by various researchers in the past. The distinct advantages of a multi-fingered robot hand that is structurally similar to the human hand motivate the need for an anthropomorphic robot hand. Such a hand provides a promising base for supplanting the human hand in the execution of tedious, complicated and dangerous tasks, especially in settings such as manufacturing, space and undersea operations. It can also be used in the orthopaedic rehabilitation of humans, improving the quality of life of people with orthopaedic and neurological disabilities. The developments so far are mostly driven by application requirements. Industrial grippers face a number of bottlenecks regarding the stability of grasping objects of irregular geometry and complex manipulation operations, whereas a multi-fingered robot hand can be made to mimic the movements of a human hand. The present research work attempts to conceptualize and design a multi-fingered, anthropomorphic robot hand by structurally imitating the human hand. In the beginning, a brief overview of the history and types of robotic hands and the applications of multi-fingered hands in various fields is presented. A review of the literature on different aspects of multi-fingered hands, such as structure, control, optimization and grasping, is made. Some of the more important and relevant works are discussed in detail, and a brief analysis is made of the outcomes and shortfalls with respect to multi-fingered hands.
    Based on this analysis of the literature, the research work aims at developing an improved anthropomorphic robot hand model in which, apart from the four fingers and a thumb, the palm-arch effect of the human hand is also considered to increase dexterity. A robotic hand with five anthropomorphic fingers, including the thumb and the palm-arch effect, having 25 degrees of freedom in all, is investigated in the present work. Each individual finger is considered an open kinematic chain, and each finger segment is considered a link of the manipulator. The wrist of the hand is considered a fixed point. Kinematic analyses of the model are carried out for both forward and inverse kinematics. The trajectories of the tip positions of the thumb and the fingers with respect to the local coordinate system are determined and plotted. This gives the extreme positions of the fingertips, obtained from the forward kinematic solution with the help of MATLAB. Similarly, by varying all the joint angles of the thumb and fingers within their respective ranges, the reachable workspace of the hand model is obtained. An Adaptive Neuro-Fuzzy Inference System (ANFIS) is used to solve the inverse kinematic problem of the fingers. Since the multi-fingered hand grasps an object mainly through its fingertips, and manipulation of the object is facilitated by the dexterity of the fingers, the grasp is considered to be a force-closure grasp. Grasping theory and the different types of contact between fingertip and object are presented, and the conditions for a stable and equilibrium grasp are discussed in detail. The proposed hand model is simulated grasping five differently shaped objects of equal base dimension and height. The forces applied at the fingertips during grasping are calculated. The hand model is also analysed using ANSYS to evaluate the stresses developed at various points in the thumb and fingers.
    This analysis was made for two different hand materials, i.e. aluminium alloy and structural steel. The solution obtained from the forward kinematic analysis of the hand determines the maximum size of differently shaped objects, while the solution to the inverse kinematic problem indicates the configurations of the thumb and fingers inside the workspace of the hand. The solutions are predicted such that all joint angles are within their respective ranges. The results of the stress analysis of the hand model show that the structure of the fingers, and of the hand as a whole, is capable of handling the selected objects. The robot hand under investigation can be realized and can be a very useful tool for many critical areas such as fine manipulation of objects, combating orthopaedic or neurological impediments, service robotics and entertainment robotics. The dissertation concludes with a summary of the contribution and the scope for further work.
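The forward-kinematics step described above (computing the fingertip position from the joint angles of an open kinematic chain) can be sketched for a planar serial finger: the tip position is the sum of the link vectors rotated by the accumulated joint angles. The three-link model and the segment lengths in the usage example are illustrative assumptions; the thesis hand uses a 25-DoF spatial model solved in MATLAB.

```python
import math

# Planar forward kinematics of a serial finger: each finger segment is a
# link, each joint angle rotates relative to the previous link, and the
# fingertip position follows by summing the rotated link vectors.
# Link lengths and joint angles in the test are illustrative only.

def fingertip(lengths, angles):
    """Return the (x, y) fingertip position of an open planar chain."""
    x = y = 0.0
    theta = 0.0
    for L, a in zip(lengths, angles):
        theta += a                  # joint angles accumulate along the chain
        x += L * math.cos(theta)
        y += L * math.sin(theta)
    return x, y
```

Sweeping each joint angle over its anatomical range and collecting the resulting tip positions traces out the reachable workspace, which is how the abstract describes obtaining the workspace of the hand model.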