35 research outputs found

    On the development of a cybernetic prosthetic hand

    The human hand is the end organ of the upper limb, serving the important function of prehension as well as being an important organ for sensation and communication. It is a marvellous example of how a complex mechanism can be implemented, capable of realizing very complex and useful tasks through a highly effective combination of mechanisms, sensing, actuation and control functions. This thesis presents the road towards the realization of a cybernetic hand. After a detailed analysis of the model, the human hand, a deep review of the state of the art of artificial hands is carried out. In particular, the performance of prosthetic hands used in clinical practice is compared with that of research prototypes, both for prosthetic and for robotic applications. By following a biomechatronic approach, i.e. by comparing the characteristics of these hands with the natural model, the human hand, the limitations of current artificial devices are highlighted, thus outlining the design goals for a new cybernetic device. Three hand prototypes with a high number of degrees of freedom have been realized and tested: the first uses microactuators embedded inside the structure of the fingers, while the second and third exploit the concept of microactuation to increase the dexterity of the hand while keeping its control simple. In particular, a framework for the definition and realization of closed-loop electromyographic control of these devices has been presented and implemented. The results were promising, indicating that in the future there could be two different approaches to the realization of artificial devices: on one side, EMG-controlled hands with compliant fingers but only one active degree of freedom; on the other, higher-performance artificial hands directly interfaced with the peripheral nervous system, thus establishing bi-directional communication with the human brain
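The closed-loop electromyographic control mentioned above can be sketched in miniature: a rectified and smoothed EMG envelope serves as the grasp-intent signal, and a proportional loop on a force-sensor reading closes the loop. This is an illustrative sketch only, not the thesis's implementation; the signals, gains and single-degree-of-freedom assumption are invented for the example.

```python
import numpy as np

def emg_envelope(emg, fs=1000.0, window_ms=150):
    """Rectify the raw EMG and smooth it with a moving-average window."""
    rectified = np.abs(emg)
    n = max(1, int(fs * window_ms / 1000.0))
    kernel = np.ones(n) / n
    return np.convolve(rectified, kernel, mode="same")

def closed_loop_grip(envelope, measured_force, k_emg=2.0, k_p=0.5):
    """One step of a proportional closed loop: the EMG envelope sets the
    desired grip force; the force-sensor reading closes the loop."""
    desired = k_emg * envelope          # intent decoded from muscle activity
    error = desired - measured_force    # feedback from hand force sensors
    return k_p * error                  # actuator command (e.g. motor current)

# Simulated one-second recording with a burst of muscle activity in the middle
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
emg = rng.normal(0, 0.1, t.size) * (1 + 4 * ((t > 0.3) & (t < 0.7)))
env = emg_envelope(emg)
cmd = closed_loop_grip(env[500], measured_force=0.0)
```

In a real prosthesis the command would drive the motor of the single active degree of freedom, with force sensors in the compliant fingers providing the feedback term.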

    Sensors for Robotic Hands: A Survey of State of the Art

    Recent decades have seen significant progress in the field of artificial hands. Most surveys attempting to capture the latest developments in this field have focused on the actuation and control systems of these devices. In this paper, our goal is to provide a comprehensive survey of the sensors for artificial hands. In order to present the evolution of the field, we cover five-year periods starting at the turn of the millennium. For each period, we present the robot hands with a focus on their sensor systems, dividing them into categories such as prosthetics, research devices, and industrial end-effectors. We also cover the sensors developed for robot-hand use in each era. Finally, the period between 2010 and 2015 introduces the reader to the state of the art and also hints at future directions in sensor development for artificial hands

    Fused mechanomyography and inertial measurement for human-robot interface

    Human-Machine Interfaces (HMI) are the technology through which we interact with the ever-increasing quantity of smart devices surrounding us. The fundamental goal of an HMI is to facilitate robot control by uniting a human operator, as the supervisor, with a machine, as the task executor. Sensors, actuators, and onboard intelligence have not reached the point where robotic manipulators may function with complete autonomy, and therefore some form of HMI is still necessary in unstructured environments. These may include environments where direct human action is undesirable or infeasible, and situations where a robot must assist and/or interface with people. Contemporary literature has introduced concepts such as body-worn mechanical devices, instrumented gloves, inertial or electromagnetic motion-tracking sensors on the arms, head, or legs, electroencephalographic (EEG) brain-activity sensors, electromyographic (EMG) muscular-activity sensors, and camera-based (vision) interfaces to recognize hand gestures and/or track arm motions for assessment of operator intent and generation of robotic control signals. While these developments offer a wealth of future potential, their utility has been largely restricted to laboratory demonstrations in controlled environments due to issues such as a lack of portability and robustness and an inability to extract operator intent for both arm and hand motion. Wearable physiological sensors hold particular promise for the capture of human intent/command. EMG-based gesture-recognition systems in particular have received significant attention in recent literature. As wearable pervasive devices, they offer benefits over camera or physical input systems in that they neither inhibit the user physically nor constrain the user to a location where the sensors are deployed. Despite these benefits, EMG alone has yet to demonstrate the capacity to recognize both gross movement (e.g. arm motion) and finer grasping (e.g. hand movement).
As such, many researchers have proposed fusing muscle activity (EMG) and motion tracking (e.g. inertial measurement) to combine arm motion and grasp intent as HMI input for manipulator control. However, such work has arguably reached a plateau, since EMG suffers from interference from environmental factors which cause signal degradation over time, demands an electrical connection with the skin, and has not demonstrated the capacity to function outside controlled environments for long periods of time. This thesis proposes a new form of gesture-based interface utilising a novel combination of inertial measurement units (IMUs) and mechanomyography sensors (MMGs). The modular system permits numerous configurations of IMUs to derive body kinematics in real time and uses this to convert arm movements into control signals. Additionally, bands containing six mechanomyography sensors were used to observe muscular contractions in the forearm which are generated by specific hand motions. This combination of continuous and discrete control signals allows a large variety of smart devices to be controlled. Several methods of pattern recognition were implemented to provide accurate decoding of the mechanomyographic information, including Linear Discriminant Analysis and Support Vector Machines. Based on these techniques, accuracies of 94.5% and 94.6% respectively were achieved for 12-gesture classification. In real-time tests, accuracies of 95.6% were achieved in 5-gesture classification. It has previously been noted that MMG sensors are susceptible to motion-induced interference; this thesis also established that arm pose changes the measured signal. The thesis introduces a new method of fusing IMU and MMG data to provide a classification that is robust to both of these sources of interference. Additionally, an improvement in orientation estimation and a new orientation-estimation algorithm are proposed.
These improvements to the robustness of the system provide the first solution able to reliably track both motion and muscle activity for extended periods of time for HMI outside a clinical environment. Applications in robot teleoperation in both real-world and virtual environments were explored. With multiple degrees of freedom, robot teleoperation provides an ideal test platform for HMI devices, since it requires a combination of continuous and discrete control signals. The field of prosthetics also represents a unique challenge for HMI applications. In an ideal situation, the sensor suite should be capable of detecting the muscular activity in the residual limb which is naturally indicative of intent to perform a specific hand pose, and of triggering this pose in the prosthetic device. Dynamic environmental conditions within a socket, such as skin impedance, have delayed the translation of gesture-control systems into prosthetic devices; mechanomyography sensors, however, are unaffected by such issues. There is huge potential for a system like this to be utilised as a controller as ubiquitous computing systems become more prevalent, and as the desire for a simple, universal interface increases. Such systems have the potential to impact significantly on the quality of life of prosthetic users and others
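The classification stage described above (multichannel muscle-activity windows reduced to time-domain features and decoded with Linear Discriminant Analysis and Support Vector Machines) can be sketched roughly as follows. The feature choice, channel count and synthetic data are assumptions for illustration, not the thesis's exact pipeline, which additionally fuses IMU data for robustness.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def features(window):
    """Simple per-channel time-domain features: mean absolute value,
    waveform length, and variance (rows = samples, columns = channels)."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    var = np.var(window, axis=0)
    return np.concatenate([mav, wl, var])

# Synthetic stand-in for six-channel MMG windows from three gestures,
# differing only in contraction intensity
rng = np.random.default_rng(42)
X, y = [], []
for gesture in range(3):
    for _ in range(60):
        scale = 0.2 + 0.4 * gesture
        window = rng.normal(0, scale, (200, 6))
        X.append(features(window))
        y.append(gesture)
X, y = np.array(X), np.array(y)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3,
                                      random_state=0, stratify=y)
lda = LinearDiscriminantAnalysis().fit(Xtr, ytr)
svm = SVC(kernel="rbf").fit(Xtr, ytr)
print(lda.score(Xte, yte), svm.score(Xte, yte))
```

Real MMG classes overlap far more than this toy data, which is why the thesis needed careful feature design and IMU fusion to reach the reported accuracies.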

    The Development of a Brain Controlled Robotic Prosthetic Hand

    An anthropomorphic, brain-controlled, underactuated prosthetic hand has been designed and developed for upper-extremity amputees. The hand's function is based on micro-servo actuation and the use of coupling links between parts of the fingers. The control of the prosthetic hand is what differentiates this project from others: the intent is to increase the sense of belonging between prosthesis and amputee by controlling the designed device with the amputee's brain. The platform has been designed to use multiple force sensors to improve control. The project is a feasibility study and will be used to test whether a multi-functional and intuitive prosthetic hand is attainable. The control of the hand will be driven through a neural interface and managed by a microcontroller board. This paper focuses on the mechanical design of the hand and the processes used to control it using signals emitted from the brain. The hand has been developed as a foundation for future research into brain-controlled prosthetics at the University of Waikato

    Study and development of sensorimotor interfaces for robotic human augmentation

    This thesis presents my research contribution to robotics and haptics in the context of human augmentation. In particular, this document is concerned with bodily or sensorimotor augmentation, that is, the augmentation of humans by supernumerary robotic limbs (SRLs). The field of sensorimotor augmentation is new in robotics, and thanks to its combination with neuroscience, great leaps forward have already been made in the past 10 years. All of the research work I produced during my Ph.D. focused on the development and study of a fundamental technology for human augmentation by robotics: the sensorimotor interface. This new concept denotes a wearable device with two main purposes: the first is to extract the input generated by the movement of the user's body, and the second is to provide the somatosensory system of the user with haptic feedback. This thesis starts with an exploratory study of integration between robotic and haptic devices, intending to combine state-of-the-art devices. This allowed us to realize that we still need to understand how to improve the interface that will allow us to feel agency when using an augmentative robot. At this point, the path of this thesis forks into two alternative ways that were adopted to improve the interaction between the human and the robot. The first path tackles two aspects concerning the haptic feedback of sensorimotor interfaces: the choice of positioning and the effectiveness of discrete haptic feedback. Along the second path, we attempted to lighten a supernumerary finger, focusing on agility of use and the lightness of the device. One of the main findings of this thesis is that haptic feedback is considered helpful by stroke patients, but this does not mitigate the fact that the cumbersomeness of the devices is a deterrent to their use.
Preliminary results presented here show that both paths chosen to improve sensorimotor augmentation worked: the presence of haptic feedback improves the performance of sensorimotor interfaces, the co-positioning of haptic feedback and the input taken from the human body can improve the effectiveness of these interfaces, and creating a lightweight version of an SRL is a viable solution for recovering the grasping function

    Pattern recognition-based real-time myoelectric control for anthropomorphic robotic systems : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Mechatronics at Massey University, Manawatū, New Zealand

    All copyrighted figures have been removed but may be accessed via the sources cited in their respective captions. Advanced human-computer interaction (HCI) or human-machine interaction (HMI) aims to help humans interact with computers intelligently. Biosignal-based technology is one of the most promising approaches to developing intelligent HCI systems. As a means of convenient and non-invasive biosignal-based intelligent control, myoelectric control identifies human movement intentions from electromyogram (EMG) signals recorded on muscles to realise intelligent control of robotic systems. Although the history of myoelectric control research spans more than half a century, commercial myoelectric-controlled devices are still mostly based on early threshold-based methods. The emerging pattern recognition-based myoelectric control has remained an active research topic in laboratories because of insufficient reliability and robustness. This research focuses on pattern recognition-based myoelectric control. Up to now, most of the effort in pattern recognition-based myoelectric control research has been invested in improving EMG pattern-classification accuracy. However, high classification accuracy does not directly lead to high controllability and usability for EMG-driven systems. This suggests that a complete system composed of the relevant modules, including EMG acquisition, pattern recognition-based gesture discrimination, and output equipment with its controller, is desirable and helpful as a development and validation platform able to closely emulate real-world situations and thus promote research in myoelectric control. This research aims to investigate feasible and effective EMG signal-processing and pattern-recognition methods to extract the useful information contained in EMG signals, in order to establish an intelligent, compact and economical biosignal-based robotic control system.
The research work includes an in-depth study of existing pattern recognition-based methodologies, investigation of effective EMG signal capturing and data processing, EMG-based control system development, and anthropomorphic robotic hand design. The contributions of this research lie mainly in the following three aspects. First, precision electronic surface EMG (sEMG) acquisition methods able to collect high-quality sEMG signals were developed. The first method was designed in a single-ended signalling manner, using monolithic instrumentation amplifiers to determine and evaluate the analog sEMG signal-processing chain architecture and circuit parameters. This method then evolved into a fully differential analog sEMG detection and collection method that uses common commercial electronic components to implement all analog sEMG amplification and filtering stages in a fully differential way. The proposed fully differential method offers a higher signal-to-noise ratio in noisy environments than the single-ended method by making full use of the inherent common-mode noise rejection of balanced signalling. To the best of my knowledge, the literature study has not found similar methods that implement the entire analog sEMG amplification and filtering chain in a fully differential way using common commercial electronic components. Second, a reliable EMG pattern recognition-based real-time gesture-discrimination approach was investigated and developed. The necessary functional modules for real-time gesture discrimination were identified and implemented using appropriate algorithms. Special attention was paid to the investigation and comparison of representative features and classifiers for improving accuracy and robustness, and a novel EMG feature set was proposed to improve the performance of EMG pattern recognition.
Third, an anthropomorphic robotic hand construction methodology was designed for validating myoelectric control on a physical platform similar to real-world situations. The natural anatomical structure of the human hand was imitated to kinematically model the robotic hand. The proposed robotic hand is a highly underactuated mechanism, featuring 14 degrees of freedom and three degrees of actuation. This research carried out an in-depth investigation into EMG data acquisition and EMG signal pattern recognition. A series of experiments was conducted in EMG signal processing and system development. The final myoelectric-controlled robotic hand system and the system testing confirmed the effectiveness of the proposed methods for surface EMG acquisition and human hand gesture discrimination. To verify and demonstrate the proposed myoelectric control system, real-time tests were conducted on the anthropomorphic prototype robotic hand. Currently, the system is able to identify five patterns in real time: hand open, hand close, wrist flexion, wrist extension and the rest state. With more motion patterns added, the system has the potential to identify more hand movements. The research has generated several journal and international conference publications
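The thesis proposes its own novel EMG feature set, which is not reproduced here; as a point of reference, the sketch below computes the classic Hudgins time-domain features (mean absolute value, zero crossings, slope-sign changes, waveform length) on which pattern recognition-based myoelectric control commonly builds. The threshold value is an illustrative assumption.

```python
import numpy as np

def hudgins_features(x, eps=0.01):
    """Classic time-domain EMG features for one analysis window
    (after Hudgins et al.): mean absolute value (MAV), zero crossings (ZC),
    slope-sign changes (SSC) and waveform length (WL)."""
    mav = np.mean(np.abs(x))
    dx = np.diff(x)
    wl = np.sum(np.abs(dx))
    # zero crossings, with an amplitude threshold to suppress noise
    zc = np.sum((x[:-1] * x[1:] < 0) & (np.abs(x[:-1] - x[1:]) > eps))
    # slope-sign changes, with the same threshold on the slopes
    ssc = np.sum((dx[:-1] * dx[1:] < 0) &
                 ((np.abs(dx[:-1]) > eps) | (np.abs(dx[1:]) > eps)))
    return np.array([mav, zc, ssc, wl])

# Toy "window": four cycles of a sinusoid standing in for an EMG segment
window = np.sin(np.linspace(0, 8 * np.pi, 256)) * 0.5
f = hudgins_features(window)
```

One such feature vector per channel per sliding window, concatenated across channels, is the typical input to the gesture classifier.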

    New relations in “post” era

    Get PDF
    As technology is developed, there become various changes in the world. And human is interacting very close to machines and technology more than any time. Philosophers have been criticising the modern concept of humanism to adjust to this new era. Artists are also one of the groups who react to it sensitively. In digital culture, humans become diverse forms and contexts, and this change demands the new definition of the human being. The existing boundaries of human and non-human such as, machine, nature are getting blurred and making new forms of relationships. These attempts can be found both in the words of philosophers and the practices of artists. In three chapters, the relationship between human and machine is discussed, based on various theoretical background and artistic practices. The first chapter contains two art projects of the author and deals with the new perspective of a machine and human subjectivity. In addition, chapter 2 of this thesis investigates how this new relationship affects the audiences and digital artworks in the museum. It focuses on relocated audiences who are positioned through media art. In the last chapter, by referencing non-human protagonists from pop culture who confront to the futuristic world, it explores the expecting problems and suggests to broaden the understanding of human and other species, to inhabit well in this “post” era

    ReHand - a portable assistive rehabilitation hand exoskeleton

    This dissertation presents the synthesis of a novel underactuated exoskeleton (ReHand2), conceived and designed for task-oriented rehabilitation and/or for empowering the human hand. The first part of this dissertation describes the current context of robotic rehabilitation, with a focus on pathologies that affect hand capability; the chapter concludes with the presentation of ReHand2. The second chapter describes human hand biomechanics, starting from the definition of human hand anatomy, passing through anthropometric data, to a taxonomy of hand grasps and finger constraints from both static and dynamic points of view. In addition, some information about hand capability is given. The third chapter analyzes the current state of the art in hand exoskeletons for rehabilitation and empowerment tasks. In particular, the chapter presents exoskeleton technologies, from mechanisms to sensors, passing through transmissions and actuators. Finally, the current state of the art in terms of prototypes and commercial products is presented. The fourth chapter introduces the concept of underactuation, with the basic explanation and the classical notation typically used in the prosthetic field. The chapter also describes the differential elements most used in prosthetics, followed by a static analysis. Moreover, typical transmission trees at the inter-finger level, as well as intra-finger underactuation, are explained. The fifth chapter presents the prototype called ReHand, summarizing the device description and explaining its working principle. It also describes the kinetostatic analysis of both the inter- and intra-finger modules. In the last section, preliminary results obtained with the exoskeleton are shown and discussed, with attention paid to the prototype's problems that led to the second version of the device. The sixth chapter describes the evolution of ReHand, covering its kinematic and dynamic behaviour.
In particular, the notation used to analyze and optimize the geometry of the entire device is introduced for the mathematical description. The model is also implemented in the Matlab Simulink environment. Finally, the chapter presents the new features. The seventh chapter describes the test bench and the methodologies used to evaluate the device's static and dynamic performance; it presents and discusses the experimental results and compares them with simulated ones. The last chapter draws conclusions about the ReHand project and proposes future developments, in particular the idea of testing the device in relevant environments. In addition, some preliminary considerations about the thumb and the wrist are introduced, exploiting the possibility of modifying the entire layout of the device, for instance by changing the actuator location
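The differential elements discussed in the fourth chapter can be illustrated with the simplest case, an idealized floating-bar (seesaw) differential. The formula follows directly from force and moment balance; the specific numbers are invented for the example.

```python
def seesaw_differential(f_in, r1, r2):
    """Idealized floating-bar (seesaw) differential used in underactuated
    hands: one actuator tendon pulls at the pivot of a floating bar, and two
    output tendons attach at moment arms r1 and r2. Force balance gives
    f_in = f1 + f2; moment balance about the pivot gives f1*r1 = f2*r2
    (friction and bar inertia neglected)."""
    f1 = f_in * r2 / (r1 + r2)
    f2 = f_in * r1 / (r1 + r2)
    return f1, f2

# Equal moment arms split the actuator force evenly between two fingers;
# unequal arms bias the grasp force toward one finger.
print(seesaw_differential(10.0, 1.0, 1.0))  # (5.0, 5.0)
print(seesaw_differential(10.0, 2.0, 1.0))
```

The key property for grasping is that the outputs stay in this force ratio even when one finger is blocked by the object and the other keeps closing, which is what lets a single actuator produce a shape-adaptive grasp.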

    Electromyography Based Human-Robot Interfaces for the Control of Artificial Hands and Wearable Devices

    The design of robotic systems is currently turning to human-inspired solutions as a road to replicating human ability and flexibility in performing motor tasks. Especially for control and teleoperation purposes, the human-in-the-loop approach is a key element within the framework known as the Human-Robot Interface. This thesis reports the research activity carried out in designing Human-Robot Interfaces based on the detection of human motion intentions from surface electromyography. The main goal was to investigate intuitive and natural control solutions for the teleoperation of both robotic hands during grasping tasks and wearable devices during elbow-assistance applications. The design solutions are based on human motor-control principles and on the interpretation of surface electromyography, which are reviewed with emphasis on the concept of synergies. The electromyography-based control strategies for robotic hand grasping and wearable-device assistance are also reviewed. The contribution of this research to the control of artificial hands relies on the integration of different levels of the synergistic organization of motor control, and on the combination of proportional control and machine-learning approaches under the guideline of user-centred intuitiveness in the Human-Robot Interface design specifications. On the side of wearable devices, the control of a novel upper-limb assistive device based on the Twisted String Actuation concept is addressed. The contribution regards assistance of the elbow during load-lifting tasks, exploring a simplification in the use of surface electromyography within the design of the Human-Robot Interface. The aim is to work around the complex subject-dependent algorithm calibrations required by joint-torque estimation methods
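The synergy concept reviewed above is often made concrete by extracting postural synergies as principal components of recorded grasp postures. The sketch below illustrates that idea on synthetic data; the 15-DoF hand, the data and the two-synergy truncation are assumptions for illustration, not this thesis's method.

```python
import numpy as np

def postural_synergies(joint_angles, n_synergies=2):
    """Extract postural synergies as principal components of a matrix of
    recorded hand postures (rows = grasps, columns = joint angles)."""
    mean = joint_angles.mean(axis=0)
    centered = joint_angles - mean
    # SVD of the centered data: principal directions are the rows of vt
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = (s ** 2) / np.sum(s ** 2)
    return mean, vt[:n_synergies], explained[:n_synergies]

def synthesize_posture(mean, synergies, activations):
    """Reconstruct a hand posture from a few synergy activations --
    the low-dimensional command a proportional myoelectric signal could drive."""
    return mean + activations @ synergies

# Synthetic grasp data: 50 postures of a hypothetical 15-DoF hand dominated
# by a single open/close synergy plus measurement noise
rng = np.random.default_rng(1)
base = rng.uniform(0, 1, 15)                     # joint pattern of the synergy
openness = rng.uniform(-1, 1, (50, 1))           # how open each grasp is
postures = 0.5 + openness * base + rng.normal(0, 0.02, (50, 15))

mean, syn, expl = postural_synergies(postures)
posture = synthesize_posture(mean, syn, np.array([0.5, 0.0]))
```

A proportional myoelectric signal could then drive the first synergy activation directly, commanding a coordinated opening or closing of all joints with a single control input.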