GUI system for Elders/Patients in Intensive Care
Elderly people suffering from specific diseases may need special care, since a stroke can occur even during a normal daily routine. Likewise, patients of any age who are unable to walk need personal care, which means either staying in hospital or having a caregiver such as a nurse with them; this is costly in terms of money and manpower, as a person is needed 24x7. To help in this respect, we propose a vision-based system that takes input from the patient and relays the information to a designated person, who may not currently be in the patient's room. This reduces the need for manpower, and continuous monitoring is no longer required. The system uses the MS Kinect for gesture detection for better accuracy, and it can easily be installed at home or in a hospital. The system provides a GUI for simple usage and gives visual and audio feedback to the user. It works with natural hand interaction, requires no training before use, and does not require the user to wear a glove or color strip.
Comment: In proceedings of the 4th IEEE International Technology Management Conference, Chicago, IL, USA, 12-15 June, 201
Spatial Programming for Industrial Robots through Task Demonstration
We present an intuitive system for programming industrial robots by demonstration, using markerless gesture recognition and mobile augmented reality. The approach covers gesture-based task definition and adaptation through human demonstration, as well as task evaluation through augmented reality. A 3D motion tracking system and a handheld device form the basis of the presented spatial programming system. In this publication, we present a prototype for programming an assembly sequence consisting of several pick-and-place tasks. A scene reconstruction provides pose estimation of known objects with the help of the handheld's 2D camera. Thus, the programmer can define the program through natural bare-hand manipulation of these objects, with direct visual feedback in the augmented reality application. The program can be adapted by gestures and subsequently transmitted to an arbitrary industrial robot controller using a unified interface. Finally, we discuss an application of the presented spatial programming approach to robot-based welding tasks.
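Once demonstrated, an assembly sequence of pick-and-place tasks could be stored as a simple list of task records before being sent to a controller. The sketch below is purely illustrative; the paper's unified controller interface and data model are not specified in the abstract, so all names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class PickPlaceTask:
    """Hypothetical record for one demonstrated pick-and-place step."""
    object_id: str
    pick_pose: tuple   # (x, y, z, qx, qy, qz, qw) in the world frame
    place_pose: tuple  # same convention as pick_pose

@dataclass
class AssemblySequence:
    """An ordered list of demonstrated tasks, appended as the user works."""
    tasks: list = field(default_factory=list)

    def add(self, task: PickPlaceTask):
        self.tasks.append(task)
```

A gesture-based adaptation step would then amount to editing the pose tuples of an existing record before transmission.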
A vision-based approach for human hand tracking and gesture recognition.
Hand gesture interfaces have become an active topic in human-computer interaction (HCI). Using hand gestures in a human-computer interface enables human operators to interact with computer environments in a natural and intuitive manner. In particular, bare-hand interpretation techniques free users from the cumbersome devices typically required for communication with computers, thus offering ease and naturalness in HCI. Meanwhile, virtual assembly (VA) applies virtual reality (VR) techniques to mechanical assembly. It constructs computer tools to help product engineers plan, evaluate, optimize, and verify the assembly of mechanical systems without the need for physical objects. However, traditional devices such as keyboards and mice are no longer adequate, due to their inefficiency in handling three-dimensional (3D) tasks, and special VR devices, such as data gloves, have been mandatory in VA. This thesis proposes a novel gesture-based interface for VA. It develops a hybrid approach that combines an appearance-based hand localization technique with a skin tone filter to support gesture recognition and hand tracking in 3D space. With this interface, bare hands become a convenient substitute for special VR devices. Experimental results demonstrate the flexibility and robustness that the proposed method brings to HCI.
Dept. of Computer Science. Paper copy at Leddy Library: Theses & Major Papers - Basement, West Bldg. / Call Number: Thesis2004 .L8. Source: Masters Abstracts International, Volume: 43-03, page: 0883. Adviser: Xiaobu Yuan. Thesis (M.Sc.)--University of Windsor (Canada), 2004
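A skin tone filter of the kind mentioned above typically thresholds pixel colors to produce a candidate hand mask. The thesis's exact filter is not given in the abstract; the sketch below uses one common RGB-space heuristic (R > 95, G > 40, B > 20, R > G > B, sufficient channel spread) purely as an illustration.

```python
import numpy as np

def skin_tone_mask(rgb):
    """Return a boolean mask of likely skin pixels.

    rgb: H x W x 3 uint8 array. The thresholds are an illustrative
    heuristic, not the thesis's actual filter.
    """
    rgb = rgb.astype(np.int16)  # widen so arithmetic cannot overflow uint8
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    spread = rgb.max(axis=-1) - rgb.min(axis=-1)
    return (r > 95) & (g > 40) & (b > 20) & (spread > 15) & (r > g) & (g > b)
```

The largest connected component of such a mask would then seed the appearance-based hand localizer for tracking.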
The Future of Humanoid Robots
This book provides state-of-the-art scientific and engineering research findings and developments in the field of humanoid robotics and its applications. Humanoids are expected to change the way we interact with machines and to blend seamlessly into an environment already designed for humans. The book contains chapters that aim to discover the future abilities of humanoid robots by presenting a variety of integrated research in various scientific and engineering fields, such as locomotion, perception, adaptive behavior, human-robot interaction, neuroscience and machine learning. The book is designed to be accessible and practical, with an emphasis on information useful to those working in robotics, cognitive science, artificial intelligence, computational methods and other fields of science directly or indirectly related to the development and use of future humanoid robots. The editor has extensive R&D experience, patents, and publications in the area of humanoid robotics, and this experience is reflected in the content of the book.
Review on EMG Acquisition and Classification Techniques: Towards Zero Retraining in the Influence of User and Arm Position Independence
The surface electromyogram (EMG) is widely studied and applied in machine control. Recent methods of classifying hand gestures have reported classification rates of over 95%. However, the majority of these studies were performed on a single user and focused solely on gesture classification. Such studies are restrictive in a practical sense, addressing only one of gesture classification, multi-user compatibility, or rotation independence. The variations in EMG signals caused by these conditions challenge the practical application of EMG devices, often requiring repetitive retraining per application. To the best of our knowledge, there is no comprehensive review of EMG classification under the combined influence of user independence, rotation, and hand exchange. Therefore, in this paper we review work related to the practical issues of EMG, with a focus on electrode placement and on recent acquisition and computing techniques that reduce training. First, we provide an overview of existing electrode placement schemes. Second, we compare the techniques and results of single-subject settings against multi-subject, multi-position settings. In conclusion, the study of EMG classification in this direction is relatively new; however, the results are encouraging and strongly indicate that EMG classification across a broad range of people, with tolerance to arm orientation, is possible and can pave the way for more flexible EMG devices.
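The classification pipelines surveyed in reviews like this one commonly start from simple time-domain features extracted per channel window. As an illustration (not a method from this paper), the classic four-feature time-domain set often attributed to Hudgins can be computed as:

```python
import numpy as np

def td_features(window):
    """Classic time-domain EMG features for one channel window:
    mean absolute value, waveform length, zero crossings, and
    slope sign changes. Illustrative only."""
    w = np.asarray(window, dtype=float)
    d = np.diff(w)
    mav = np.mean(np.abs(w))              # mean absolute value
    wl = np.sum(np.abs(d))                # waveform length
    zc = np.sum(w[:-1] * w[1:] < 0)       # zero crossings
    ssc = np.sum(d[:-1] * d[1:] < 0)      # slope sign changes
    return np.array([mav, wl, zc, ssc])
```

Feature vectors from all channels are concatenated and passed to a classifier; the user- and position-dependence discussed above shows up as a shift in exactly these feature distributions.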
Real-Time Hand Gesture Recognition Using Temporal Muscle Activation Maps of Multi-Channel sEMG Signals
Accurate and real-time hand gesture recognition is essential for controlling advanced hand prostheses. Surface electromyography (sEMG) signals obtained from the forearm are widely used for this purpose. Here, we introduce a novel hand gesture representation called Temporal Muscle Activation (TMA) maps, which capture information about the activation patterns of muscles in the forearm. Based on these maps, we propose an algorithm that recognizes hand gestures in real time using a Convolutional Neural Network. The algorithm was tested on 8 healthy subjects with sEMG signals acquired from 8 electrodes placed along the circumference of the forearm. The average classification accuracy of the proposed method was 94%, which is comparable to state-of-the-art methods. The average computation time for a prediction was 5.5 ms, making the algorithm well suited to real-time gesture recognition applications.
Comment: Paper accepted to the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) 202
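A muscle-activation map of the general kind described above can be pictured as per-channel activation envelopes stacked into a channels-by-frames image, which a CNN then classifies. The paper's exact TMA construction may differ; this sketch uses a moving RMS envelope as a stand-in.

```python
import numpy as np

def tma_map(emg, win=32, step=8):
    """Sketch of an activation map: moving RMS of each sEMG channel,
    stacked into a (channels x frames) image for a CNN input.

    emg: channels x samples array (e.g., 8 forearm electrodes).
    win, step: window length and hop in samples (illustrative values).
    """
    emg = np.asarray(emg, dtype=float)
    n = 1 + (emg.shape[1] - win) // step
    frames = [np.sqrt(np.mean(emg[:, i * step:i * step + win] ** 2, axis=1))
              for i in range(n)]
    return np.stack(frames, axis=1)
```

With 8 channels, each gesture window becomes a small 2D "image", so the real-time budget reported above (5.5 ms per prediction) is plausible for a compact CNN.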
AN INVESTIGATION OF ELECTROMYOGRAPHIC (EMG) CONTROL OF DEXTROUS HAND PROSTHESES FOR TRANSRADIAL AMPUTEES
In reference to IEEE copyrighted material which is used with permission in this thesis, the IEEE does not endorse any of Plymouth University's products or services.
There are many amputees around the world who have lost a limb through conflict, disease or an accident. Upper-limb prostheses controlled using surface Electromyography (sEMG) offer a solution to help the amputees; however, their functionality is limited by the small number of movements they can perform and their slow reaction times. Pattern recognition (PR)-based EMG control has been proposed to improve the functional performance of prostheses. It is a very promising approach, offering intuitive control, fast reaction times and the ability to control a large number of degrees of freedom (DOF). However, prostheses controlled with PR systems are not available for everyday use by amputees, because there are many major challenges and practical problems that need to be addressed before clinical implementation is possible. These include lack of individual finger control, an impractically large number of EMG electrodes, and the lack of deployment protocols for EMG electrode site selection and movement optimisation. Moreover, the inability of PR systems to handle multiple forces is a further practical problem that needs to be addressed.
The main aim of this project is to investigate the research challenges mentioned above via non-invasive EMG signal acquisition, and to propose practical solutions to help amputees. In a series of experiments, the PR systems presented here were tested with EMG signals acquired from seven transradial amputees, which is unique to this project. Previous studies have been conducted using non-amputees.
In this work, the challenges described are addressed and a new protocol is proposed that delivers a fast clinical deployment of multi-functional upper limb prostheses controlled by PR systems. Controlling finger movement is a step towards the restoration of lost human capabilities, and is psychologically important, as well as physically.
A central thread running through this work is the assertion that no two amputees are the same, each suffering different injuries and retaining differing nerve and muscle structures. This work is very much about individualised healthcare, and aims to provide the best possible solution for each affected individual on a case-by-case basis. Therefore, the approach has been to optimise the solution (in terms of function and reliability) for each individual, as opposed to developing a generic solution, where performance is optimised against a test population.
This work is unique, in that it contributes to improving the quality of life for each individual amputee by optimising function and reliability. The main four contributions of the thesis are as follows:
1- Individual finger control was achieved with high accuracy for a large number of finger movements, using six optimally placed sEMG channels. This was validated on EMG signals for ten non-amputee and six amputee subjects. Thumb movements were classified successfully with high accuracy for the first time. The outcome of this investigation will help to add more movements to the prosthesis, and reduce hardware and computational complexity.
2- A new subject-specific protocol for sEMG site selection and reliable movement subset optimisation, based on the amputee’s needs, has been proposed and validated on seven amputees. This protocol will help clinicians to perform an efficient and fast deployment of prostheses, by finding the optimal number and locations of EMG channels. It will also find a reliable subset of movements that can be achieved with high performance.
3- The relationship between the force of contraction and the statistics of EMG signals has been investigated, utilising an experimental design where visual feedback from a Myoelectric Control Interface (MCI) helped the participants to produce the correct level of force. Kurtosis values were found to decrease monotonically when the contraction level increased, thus indicating that kurtosis can be used to distinguish different forces of contractions.
4- The real practical problem of the degradation of classification performance as a result of the variation of force levels during daily use of the prosthesis has been investigated, and solved by proposing a training approach and the use of a robust feature extraction method, based on the spectrum. The recommendations of this investigation improve the practical robustness of prostheses controlled with PR systems and progress a step further towards clinical implementation and improving the quality of life of amputees.
The project showed that PR systems achieved a reliable performance for a large number of amputees, taking into account real life issues such as individual finger control for high dexterity, the effect of force level variation, and optimisation of the movements and EMG channels for each individual amputee. The findings of this thesis showed that PR systems need to be appropriately tuned before use, for example by training with multiple forces to reduce the effect of force variation and improve practical robustness, and by finding the optimal EMG channels for each amputee to improve the PR system's performance. The outcome of this research enables the implementation of PR systems in real prostheses that can be used by amputees.
Ministry of Higher Education and Scientific Research and Baghdad University- Baghdad/Ira
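The kurtosis statistic used in contribution 3 above, the fourth standardized moment of the EMG window, can be computed directly; the thesis's finding is that this value decreases monotonically as contraction force increases. A minimal sketch:

```python
import numpy as np

def kurtosis(x):
    """Fourth standardized moment of a signal window (Pearson kurtosis;
    Fisher's convention would subtract 3, but either works for
    comparing contraction levels)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2
```

Intuitively, low-force EMG is sparse and spiky (heavy-tailed, high kurtosis), while at higher forces more motor units overlap and the signal becomes more Gaussian-like, which is consistent with the monotonic decrease reported above.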
Intuitive Human-Machine Interfaces for Non-Anthropomorphic Robotic Hands
As robots become more prevalent in our everyday lives, both in our workplaces and in our homes, it becomes increasingly likely that people who are not experts in robotics will be asked to interface with robotic devices. It is therefore important to develop robotic controls that are intuitive and easy for novices to use. Robotic hands, in particular, are very useful, but their high dimensionality makes creating intuitive human-machine interfaces for them complex. In this dissertation, we study the control of non-anthropomorphic robotic hands by non-roboticists in two contexts: collaborative manipulation and assistive robotics.
In the field of collaborative manipulation, the human and the robot work side by side as independent agents. Teleoperation allows the human to assist the robot when autonomous grasping is not able to deal sufficiently well with corner cases or cannot operate fast enough. Using the teleoperator’s hand as an input device can provide an intuitive control method, but finding a mapping between a human hand and a non-anthropomorphic robot hand can be difficult, due to the hands’ dissimilar kinematics. In this dissertation, we seek to create a mapping between the human hand and a fully actuated, non-anthropomorphic robot hand that is intuitive enough to enable effective real-time teleoperation, even for novice users.
We propose a low-dimensional and continuous teleoperation subspace which can be used as an intermediary for mapping between different hand pose spaces. We first propose the general concept of the subspace, its properties and the variables needed to map from the human hand to a robot hand. We then propose three ways to populate the teleoperation subspace mapping. Two of our mappings use a dataglove to harvest information about the user's hand. We define the mapping between joint space and teleoperation subspace with an empirical definition, which requires a person to define hand motions in an intuitive, hand-specific way, and with an algorithmic definition, which is kinematically independent, and uses objects to define the subspace. Our third mapping for the teleoperation subspace uses forearm electromyography (EMG) as a control input.
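The subspace idea above can be sketched as two maps meeting in a shared low-dimensional space: one from human hand joints into the subspace, one from the subspace out to robot joints. Linear maps are used here only for illustration; the dissertation's empirical and algorithmic definitions need not be linear, and all matrix names are placeholders.

```python
import numpy as np

def map_pose(human_joints, A_h, b_h, A_r, b_r):
    """Sketch of subspace teleoperation: project a human hand pose into a
    low-dimensional subspace, then lift it into the robot's joint space.

    A_h, b_h: placeholder human-joints -> subspace map.
    A_r, b_r: placeholder subspace -> robot-joints map.
    """
    z = A_h @ np.asarray(human_joints, dtype=float) + b_h  # into subspace
    return A_r @ z + b_r                                   # out to robot joints
```

Because both hands only ever meet in the subspace, the human and robot kinematics never have to correspond joint-for-joint, which is the point of using it as an intermediary for non-anthropomorphic hands.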
Assistive orthotics is another area of robotics where human-machine interfaces are critical, since in this field the robot is attached to the hand of the human user. Here, the goal is for the robot to assist the human with movements they could not otherwise achieve. Orthotics can improve the quality of life of people who do not have full use of their hands. Human-machine interfaces for assistive hand orthotics that use EMG signals from the affected forearm as input are intuitive, and repeated use can strengthen the muscles of the user's affected arm. In this dissertation, we seek to create an EMG-based control for an orthotic device used by people who have had a stroke. We would like our control to enable functional motions when used in conjunction with an orthosis, and to be robust to changes in the input signal.
We propose a control for a wearable hand orthosis which uses an easy-to-don, commodity forearm EMG band. We develop a supervised algorithm to detect a user's intent to open and close their hand, and pair this algorithm with a training protocol which makes our intent detection robust to changes in the input signal. We show that this algorithm, when used in conjunction with an orthosis over several weeks, can improve distal function in users. Additionally, we propose two semi-supervised intent detection algorithms designed to keep our control robust to changes in the input data while reducing the length and frequency of our training protocol.
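A supervised open/close intent detector of the kind described above maps a feature vector from the EMG band (e.g., per-channel mean absolute value) to a discrete intent label. The dissertation's actual classifier is not specified in the abstract; a nearest-centroid rule stands in for it here as a minimal sketch.

```python
import numpy as np

class IntentDetector:
    """Minimal sketch of supervised intent detection from forearm EMG
    features. Nearest-centroid is a placeholder for the real classifier."""

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.labels_ = np.unique(y)
        # One centroid per intent class (e.g., "relax", "open").
        self.centroids_ = np.stack([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.labels_[np.argmin(d, axis=1)]
```

The training protocol mentioned above would correspond to periodically refitting (or, in the semi-supervised variants, partially updating) such a model as the input signal drifts across donning sessions.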