
    Formulation of a new gradient descent MARG orientation algorithm: case study on robot teleoperation

    We introduce a novel magnetic angular rate gravity (MARG) sensor fusion algorithm for inertial measurement. The new algorithm improves the popular gradient descent (‘Madgwick’) algorithm, increasing accuracy and robustness while preserving computational efficiency. Analytic and experimental results demonstrate faster convergence for multiple variations of the algorithm through changing magnetic inclination. Furthermore, decoupling of magnetic field variance from roll and pitch estimation is proven for enhanced robustness. The algorithm is validated in a human-machine interface (HMI) case study. The case study involves hardware implementation for wearable robot teleoperation both in Virtual Reality (VR) and in real time on a 14 degree-of-freedom (DoF) humanoid robot. The experiment fuses inertial (movement) and mechanomyography (MMG) muscle sensing to control robot arm movement and grasp simultaneously, demonstrating algorithm efficacy and capacity to interface with other physiological sensors. To our knowledge, this is the first such formulation and the first fusion of inertial measurement and MMG in HMI. We believe the new algorithm holds the potential to impact a very wide range of inertial measurement applications where full orientation is necessary. Physiological sensor synthesis and the hardware interface further provide a foundation for robotic teleoperation systems with the robustness necessary for use in the field.
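    The gradient descent orientation filter that this work builds on can be sketched as follows. This is a minimal, illustrative IMU-only variant of the Madgwick update (gravity reference only, no magnetometer), not the authors' improved MARG formulation; the gain `beta` and sample period `dt` are assumed values for illustration.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def madgwick_imu_step(q, gyro, accel, beta=0.1, dt=0.01):
    """One gradient-descent fusion step (IMU variant, gravity only).

    q     : current orientation quaternion [w, x, y, z]
    gyro  : angular rate in rad/s
    accel : measured specific force (any consistent units)
    """
    a = accel / np.linalg.norm(accel)          # normalise the measurement
    w, x, y, z = q
    # Objective: gravity reference [0, 0, 1] rotated into the sensor
    # frame should match the normalised accelerometer reading.
    f = np.array([
        2*(x*z - w*y)       - a[0],
        2*(w*x + y*z)       - a[1],
        2*(0.5 - x*x - y*y) - a[2],
    ])
    J = np.array([                             # Jacobian of f w.r.t. q
        [-2*y,  2*z, -2*w, 2*x],
        [ 2*x,  2*w,  2*z, 2*y],
        [ 0.0, -4*x, -4*y, 0.0],
    ])
    grad = J.T @ f
    grad /= np.linalg.norm(grad)               # normalised gradient step
    # Rate of change: gyro integration minus gradient-descent correction
    q_dot = 0.5 * quat_mult(q, np.r_[0.0, gyro]) - beta * grad
    q = q + q_dot * dt
    return q / np.linalg.norm(q)               # keep unit norm
```

    Starting from a tilted orientation with the sensor at rest, repeated calls converge toward the true attitude; `beta` trades convergence speed against gyro trust.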

    Review of Wearable Devices and Data Collection Considerations for Connected Health

    Wearable sensor technology has gradually extended its usability into a wide range of well-known applications. Wearable sensors can typically assess and quantify the wearer’s physiology and are commonly employed for human activity detection and quantified self-assessment. Wearable sensors are increasingly utilised to monitor patient health, rapidly assist with disease diagnosis, and help predict and often improve patient outcomes. Clinicians use various self-report questionnaires and well-known tests to report patient symptoms and assess their functional ability. These assessments are time-consuming and costly and depend on subjective patient recall. Moreover, measurements may not accurately reflect the patient’s functional ability whilst at home. Wearable sensors can be used to detect and quantify specific movements in different applications. The volume of data collected by wearable sensors during long-term assessment of ambulatory movement can become immense. This paper discusses current techniques used to track and record various human body movements, as well as techniques used to measure activity and sleep from long-term data collected by wearable technology devices.
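    As one illustration of measuring activity from long-term wearable data, a common generic approach bins the deviation of the acceleration magnitude from 1 g into epoch-level "counts". This sketch is an assumed, simplified scheme, not a technique from any specific reviewed device; the sample rate `fs` and epoch length `epoch_s` are hypothetical parameters.

```python
import numpy as np

def activity_counts(accel, fs=50.0, epoch_s=10.0):
    """Epoch-level activity metric from a tri-axial accelerometer stream.

    accel   : (N, 3) array of acceleration samples in g
    fs      : sample rate in Hz
    epoch_s : epoch (bin) length in seconds

    Returns one 'count' per epoch: the mean absolute deviation of the
    acceleration magnitude from 1 g (static gravity), a simple proxy
    for movement intensity.
    """
    mag = np.linalg.norm(accel, axis=1)        # per-sample magnitude (g)
    dev = np.abs(mag - 1.0)                    # remove the static gravity component
    n = int(fs * epoch_s)                      # samples per epoch
    n_epochs = len(dev) // n                   # drop the trailing partial epoch
    return dev[:n_epochs * n].reshape(n_epochs, n).mean(axis=1)
```

    A still wearer yields counts near zero, while ambulatory movement produces clearly elevated counts, which downstream algorithms can threshold into activity or sleep states.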

    A fully connected deep learning approach to upper limb gesture recognition in a secure FES rehabilitation environment

    Stroke is one of the leading causes of death and disability in the world. The rehabilitation of patients' limb functions has great medical value, for example through functional electrical stimulation (FES) therapy, but lacks effective rehabilitation evaluation. In this paper, six gestures of upper limb rehabilitation were monitored and collected using microelectromechanical systems sensors, where data stability was guaranteed using data preprocessing methods, that is, deweighting, interpolation, and feature extraction. A fully connected neural network is proposed, investigating the effects of different numbers of hidden layers and determining its activation functions and optimizers. Experiments show that a three-hidden-layer model with a softmax function and an adaptive gradient descent optimizer can reach an average gesture recognition rate of 97.19%. A stop mechanism triggered by recognition of dangerous gestures ensures the safety of the system, and lightweight hash-based cryptography ensures its security. Comparisons with classification models, for example, k-nearest neighbour, logistic regression, and other stochastic gradient descent algorithms, were conducted to verify the outperformance in recognition of upper limb gesture data. This study also provides an approach to creating health profiles based on large-scale rehabilitation data, and hence diagnosis of the effects of FES rehabilitation.
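    A fully connected network of the kind described (three hidden layers, softmax output, adaptive gradient descent) can be sketched in NumPy as below. The layer sizes, learning rate, and the reading of "adaptive gradient descent" as the Adagrad optimizer are assumptions for illustration, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)       # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class MLP:
    """Fully connected net: three ReLU hidden layers + softmax output."""

    def __init__(self, sizes=(12, 32, 32, 32, 6)):
        self.W = [rng.normal(0, 0.3, (a, b)) for a, b in zip(sizes, sizes[1:])]
        self.b = [np.zeros(b) for b in sizes[1:]]
        # Adagrad accumulators: running sums of squared gradients
        self.gW = [np.zeros_like(w) for w in self.W]
        self.gb = [np.zeros_like(b) for b in self.b]

    def forward(self, x):
        self.h = [x]                           # cache activations for backprop
        for W, b in zip(self.W[:-1], self.b[:-1]):
            self.h.append(np.maximum(0.0, self.h[-1] @ W + b))  # ReLU
        return softmax(self.h[-1] @ self.W[-1] + self.b[-1])

    def adagrad_step(self, x, y, lr=0.1, eps=1e-8):
        """One cross-entropy training step with the Adagrad update rule."""
        p = self.forward(x)
        n = len(x)
        delta = (p - np.eye(p.shape[1])[y]) / n          # dL/dlogits
        for i in reversed(range(len(self.W))):
            dW = self.h[i].T @ delta
            db = delta.sum(axis=0)
            if i > 0:                                    # backprop through ReLU
                delta = (delta @ self.W[i].T) * (self.h[i] > 0)
            # Adagrad: per-parameter learning rate shrinks with history
            self.gW[i] += dW ** 2
            self.gb[i] += db ** 2
            self.W[i] -= lr * dW / (np.sqrt(self.gW[i]) + eps)
            self.b[i] -= lr * db / (np.sqrt(self.gb[i]) + eps)
        return -np.log(p[np.arange(n), y] + 1e-12).mean()  # loss before step
```

    On separable synthetic six-class data the cross-entropy loss falls steadily over repeated full-batch steps, mirroring how the paper's model would be trained on preprocessed gesture features.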

    Gesture Based Character Recognition

    A gesture consists of rudimentary movements of a human body part that depict a meaningful action of an individual, and is of high significance for designing efficient human-computer interfaces. A method is proposed for recognising characters (English alphabets) from gestures performed with a pointer having a coloured tip (red, green, or blue). The colour tip is segmented from the background by converting from the RGB to the HSI colour model, and its motion is identified by the optical-flow method, which is also used to remove unwanted connecting lines formed between successive gestures. The movement of the tip is recorded by the Motion History Image (MHI) method. Once the complete gesture is captured, each character is extracted from the handwritten image using connected components, and features of the corresponding character are computed. Recognition is performed by a minimum-distance classifier (Modified Hausdorff Distance). An audio recording of each character is stored in the dataset so that, upon classification, the corresponding audio is played.
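    The final recognition step, a minimum-distance classifier based on the Modified Hausdorff Distance, can be sketched as follows. The toy stroke templates and the `classify` helper are hypothetical illustrations, not the paper's implementation; only the distance itself follows the standard Modified Hausdorff definition.

```python
import numpy as np

def modified_hausdorff(A, B):
    """Modified Hausdorff Distance between two 2-D point sets
    A (m, 2) and B (n, 2), e.g. feature points of extracted characters.
    MHD replaces the max in the classic Hausdorff distance with a mean,
    which is far less sensitive to outlier points.
    """
    # Pairwise Euclidean distances between every point of A and of B
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    d_ab = D.min(axis=1).mean()    # mean distance from A's points to B
    d_ba = D.min(axis=0).mean()    # mean distance from B's points to A
    return max(d_ab, d_ba)

def classify(query, templates):
    """Minimum-distance classifier: return the label of the nearest
    template point set under the Modified Hausdorff Distance."""
    return min(templates, key=lambda label: modified_hausdorff(query, templates[label]))
```

    A noisy copy of a stored stroke pattern is still assigned its own label, since small per-point perturbations barely move the mean nearest-neighbour distances.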

    Biosignal‐based human–machine interfaces for assistance and rehabilitation : a survey

    By definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey aims to review the large literature of the last two decades regarding biosignal-based HMIs for assistance and rehabilitation, to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were further screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were considered to classify the different biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application by considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade. In contrast, studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has experienced a considerable rise, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance.
However, they also increase HMIs’ complexity, so their usefulness should be carefully evaluated for the specific application.