
    Wearable Wristworn Gesture Recognition Using Echo State Network

    This paper presents a novel gesture sensing system for prosthetic limb control based on a pressure sensor array embedded in a wristband. Tendon movements that produce pressure changes around the wrist are detected by the pressure sensors. A microcontroller gathers the data from the sensors and transmits it to a computer. A user interface developed in LabVIEW presents the value of each sensor and displays the waveforms in real time. Because of non-uniform, subtle tendon movement, the data pattern of each gesture varies across users. To overcome this challenge, an Echo State Network (ESN), a supervised learning network, is applied to the data to calibrate for different users. The gesture recognition results show that the ESN performs well in multi-dimensional classification. On experimental data collected from six participants, the proposed system classifies five gestures with an accuracy of 87.3%.
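    A minimal sketch of the ESN approach described above, in Python with NumPy. The reservoir size, the toy sinusoid "gestures", and the ridge-regression readout are illustrative assumptions, not the paper's actual data or parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    class EchoStateNetwork:
        """Minimal ESN: fixed random reservoir, trained linear readout."""
        def __init__(self, n_in, n_res, n_out, spectral_radius=0.9, ridge=1e-6):
            self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
            W = rng.uniform(-0.5, 0.5, (n_res, n_res))
            # Rescale so the largest eigenvalue magnitude equals spectral_radius
            W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
            self.W = W
            self.ridge = ridge
            self.n_res, self.n_out = n_res, n_out

        def _final_state(self, seq):
            x = np.zeros(self.n_res)
            for u in seq:                      # leaky-free reservoir update
                x = np.tanh(self.W_in @ u + self.W @ x)
            return x                           # final state summarises the sequence

        def fit(self, sequences, labels):
            X = np.array([self._final_state(s) for s in sequences])
            Y = np.eye(self.n_out)[labels]     # one-hot targets
            # Ridge-regression readout: W_out = (X^T X + rI)^-1 X^T Y
            self.W_out = np.linalg.solve(
                X.T @ X + self.ridge * np.eye(self.n_res), X.T @ Y)

        def predict(self, sequences):
            X = np.array([self._final_state(s) for s in sequences])
            return np.argmax(X @ self.W_out, axis=1)

    # Toy data: two "gestures" as different sinusoid frequencies on 4 pressure channels
    def make_seq(label, t0=0.0):
        t = np.linspace(t0, t0 + 2, 50)
        f = 1.0 if label == 0 else 3.0
        return np.stack([np.sin(2 * np.pi * f * t + p) for p in range(4)], axis=1)

    labels = [0, 1] * 20
    seqs = [make_seq(l, t0=0.01 * i) for i, l in enumerate(labels)]
    esn = EchoStateNetwork(n_in=4, n_res=100, n_out=2)
    esn.fit(seqs[:30], labels[:30])
    pred = esn.predict(seqs[30:])
    print(pred)
    ```

    The key design point of an ESN is that only the readout weights are trained; the random reservoir is left untouched, which keeps calibration to a new user cheap.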

    Exploring the Potential of Wrist-Worn Gesture Sensing

    This thesis aims to explore the potential of wrist-worn gesture sensing. There has been a large amount of prior work on gesture recognition using different kinds of sensors. However, the gesture sets tested across different studies all differed, making them hard to compare. There has also not been enough work on understanding what types of gestures are suitable for wrist-worn devices. Our work addresses these two problems and makes two main contributions over previous work: the specification of a larger gesture set, generated by combining previous work and verified through an elicitation study; and an evaluation of the potential of gesture sensing with wrist-worn sensors. We developed a gesture recognition system, WristRec, a low-cost wrist-worn device that uses bend sensors for gesture recognition. WristRec is designed to measure tendon movement at the wrist while people perform gestures. We conducted a four-part study to verify the validity of the approach and the range of gestures that can be detected using a wrist-worn system. In the initial stage, we verified the feasibility of WristRec by using the Dynamic Time Warping (DTW) algorithm to classify a group of 5 gestures, the gesture set of the MYO armband. Next, we conducted an elicitation study to understand the trade-offs between hand, wrist, and arm gestures. The study helped us understand the types of gestures a wrist-worn system should be able to recognize. It also served as the basis for our gesture set and for our evaluation of the gesture sets used in previous research. To evaluate the overall potential of wrist-worn recognition, we explored the design of hardware to recognize gestures by contrasting an inertial measurement unit (IMU)-only recognizer (the Serendipity system of Wen et al.) with our system. We assessed accuracies on a consensus gesture set and on a 27-gesture referent set, both extracted from the results of our elicitation study. Finally, we discuss the implications of our work both for the comparative evaluation of systems and for the design of enhanced hardware sensing.
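    The DTW-based classification used in the feasibility stage can be sketched as follows. The one-dimensional sensor traces and the two gesture templates are hypothetical stand-ins for WristRec's bend-sensor data:

    ```python
    import numpy as np

    def dtw_distance(a, b):
        """Classic O(len(a) * len(b)) dynamic time warping distance."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                # Extend the cheapest of the three admissible alignments
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def classify(sample, templates):
        """Nearest-template classification: label of the closest template."""
        return min(templates, key=lambda lbl: dtw_distance(sample, templates[lbl]))

    # Hypothetical bend-sensor templates for two gestures
    templates = {"fist": np.array([0, 1, 2, 2, 1, 0], float),
                 "spread": np.array([2, 1, 0, 0, 1, 2], float)}
    sample = np.array([0, 0.9, 2.1, 2.0, 1.1, 0.1])   # noisy, time-warped "fist"
    print(classify(sample, templates))                 # → fist
    ```

    DTW's appeal here is that it tolerates differences in gesture speed: two executions of the same gesture at different rates still align with low cost.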

    Gesture Recognition Wristband Device with Optimised Piezoelectric Energy Harvesters

    Wearable devices can be used to monitor vital human physiological signs and to interact with computers. Due to the limited lifetime of batteries, these devices require novel energy harvesting solutions to ensure uninterrupted, autonomous operation. We therefore developed a wearable wristband device with piezoelectric transducers used for hybrid functionality: both energy harvesting and sensing. We demonstrate that gestures can be classified using the electricity generated by these piezoelectric transducers as a result of tendon movements around the wrist. In this paper, we also demonstrate how a multi-physics simulation model was used to maximise the amount of harvestable energy from these transducers.

    Sign Language Glove

    Communication between speakers and non-speakers of American Sign Language (ASL) can be problematic, inconvenient, and expensive. This project attempts to bridge the communication gap by designing a portable glove that captures the user’s ASL gestures and outputs the translated text on a smartphone. The glove is equipped with flex sensors, contact sensors, and a gyroscope to measure the flexion of the fingers, the contact between fingers, and the rotation of the hand. The glove’s Arduino UNO microcontroller analyzes the sensor readings to identify the gesture from a library of learned gestures, and a Bluetooth module transmits the result to a smartphone. With this device, ASL speakers may one day be able to communicate with others in an affordable and convenient way.
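    The matching step, identifying a gesture from a library of learned gestures, can be sketched as a nearest-neighbour lookup over sensor readings. The library values and the rejection threshold below are invented for illustration; the project's actual library and matching logic are not specified in the abstract:

    ```python
    import math

    # Hypothetical learned library: flex-sensor ADC readings for 5 fingers (0-1023)
    GESTURE_LIBRARY = {
        "A": [900, 880, 870, 860, 300],   # fingers curled, thumb out
        "B": [100, 120, 110, 130, 850],   # fingers straight, thumb folded
        "L": [120, 900, 890, 880, 150],   # index and thumb extended
    }

    def euclidean(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

    def recognize(reading, threshold=200.0):
        """Return the closest library gesture, or None if nothing is near enough."""
        best = min(GESTURE_LIBRARY, key=lambda g: euclidean(reading, GESTURE_LIBRARY[g]))
        return best if euclidean(reading, GESTURE_LIBRARY[best]) <= threshold else None

    print(recognize([880, 900, 860, 870, 310]))   # → A
    ```

    A distance threshold matters in practice: without one, any hand position, including rest, would be forced onto the nearest known sign.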

    Real-time human ambulation, activity, and physiological monitoring: taxonomy of issues, techniques, applications, challenges and limitations

    Automated methods of real-time, unobtrusive monitoring of human ambulation, activity, and wellness, together with data analysis using various algorithmic techniques, have been subjects of intense research. The general aim is to devise effective means of addressing the demands of assisted living, rehabilitation, and clinical observation and assessment through sensor-based monitoring. These research studies have produced a large body of literature. This paper presents a holistic articulation of the research and offers comprehensive insights along four main axes: distribution of existing studies; monitoring device frameworks and sensor types; data collection, processing, and analysis; and applications, limitations, and challenges. The aim is to present a systematic and comprehensive survey of the literature in the area in order to identify research gaps and prioritize future research directions.

    Optical myography system for hand gesture and posture recognition

    Advisor: Éric Fujiwara. Dissertation (Master's), Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica.
    In this work, an optical myography system is demonstrated as a promising alternative for monitoring the hand postures and gestures of the user. The technique is based on following the muscular activity responsible for hand motion with an external camera and relating the visual deformation observed on the forearm to the muscular contraction and relaxation of a given posture. Three sensor designs were proposed, studied, and evaluated. The first monitored muscular activity by analyzing the spatial frequency variation of a uniform stripe pattern stamped on the skin, whereas the second counted the visible skin pixels inside the region of interest. Both designs proved impracticable due to their low robustness and high demand for controlled experimental conditions. The third design retrieves the hand configuration by visually tracking the displacements of a series of color markers distributed over the forearm. Using a 24 fps, 640 × 480 pixel webcam, this design was validated for eight different postures, exploring mainly finger and thumb flexion/extension, plus thumb adduction/abduction. The experimental data are acquired offline and submitted to an image processing routine that extracts the color and spatial information of the markers in each frame; the extracted data are then used to track the same markers across all frames. To reduce the influence of the natural, inherent vibrations of the human body, a local reference frame is also adopted within the region of interest. Finally, the frame-by-frame data, along with the ground-truth posture, are fed into a sequential artificial neural network responsible for supervised calibration of the sensor and subsequent posture classification. The system's performance in classifying the eight postures was evaluated via 10-fold cross-validation, with the camera monitoring either the underside or the back of the forearm. The sensor presented ≈92.4% precision and ≈97.9% accuracy for the former, and ≈75.1% precision and ≈92.5% accuracy for the latter, comparable to other myographic techniques, demonstrating the feasibility of the project and opening prospects for human-robot interaction applications.
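    The 10-fold cross-validation protocol used in the evaluation can be sketched as follows, with synthetic marker-displacement features and a simple nearest-centroid classifier standing in for the thesis's sequential neural network:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic stand-in data: 6 marker-displacement features for 8 posture classes
    X = np.concatenate([rng.normal(c, 0.3, (20, 6)) for c in range(8)])
    y = np.repeat(np.arange(8), 20)

    def nearest_centroid_predict(Xtr, ytr, Xte):
        """Assign each test sample the class of its nearest training centroid."""
        centroids = np.array([Xtr[ytr == c].mean(axis=0) for c in np.unique(ytr)])
        d = np.linalg.norm(Xte[:, None, :] - centroids[None, :, :], axis=2)
        return np.argmin(d, axis=1)

    # 10-fold CV: shuffle, split into 10 folds, hold one fold out per round
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, 10)
    accs = []
    for k in range(10):
        test_idx = folds[k]
        train_idx = np.concatenate([folds[j] for j in range(10) if j != k])
        pred = nearest_centroid_predict(X[train_idx], y[train_idx], X[test_idx])
        accs.append((pred == y[test_idx]).mean())
    print(round(float(np.mean(accs)), 3))
    ```

    Averaging accuracy over all ten held-out folds, as above, is what yields the single precision/accuracy figures reported per camera placement.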

    Wearable pressure sensing for intelligent gesture recognition

    The development of wearable sensors has become a major area of interest due to their wide range of promising applications, including health monitoring, human motion detection, human-machine interfaces, electronic skin, and soft robotics. Pressure sensors in particular have attracted considerable attention in wearable applications. However, traditional pressure sensing systems use rigid sensors to detect human motion; lightweight, flexible pressure sensors are required to improve the comfort of these devices. Furthermore, in comparison with conventional sensing techniques without smart algorithms, machine learning-assisted wearable systems are capable of intelligently analysing data for classification or prediction, making the system ‘smarter’ for more demanding tasks. Combining flexible pressure sensors with machine learning is therefore a promising approach to human motion recognition. This thesis focuses on fabricating flexible pressure sensors and developing wearable applications that recognize human gestures. First, a comprehensive literature review was conducted, covering the current state of the art in pressure sensing techniques and machine learning algorithms. Second, a piezoelectric smart wristband was developed to distinguish finger typing movements. Three machine learning algorithms, K-Nearest Neighbour (KNN), Decision Tree (DT), and Support Vector Machine (SVM), were used to classify the movements of different fingers. The SVM outperformed the other classifiers, with overall accuracies of 98.67% and 100% when processing raw data and extracted features, respectively. Third, a piezoresistive wristband was fabricated based on a flake-sphere composite configuration in which reduced graphene oxide fragments are doped with polystyrene spheres to achieve both high sensitivity and flexibility. The flexible wristband measures the pressure distribution around the wrist for accurate and comfortable hand gesture classification.
    The intelligent wristband was able to classify 12 hand gestures with 96.33% accuracy for five participants using a machine learning algorithm. Moreover, to demonstrate the practical applications of the proposed method, a real-time system was developed to control a robotic hand according to the classification results. Finally, this thesis also demonstrates an intelligent piezoresistive sensor that recognizes different throat movements during pronunciation. The piezoresistive sensor was fabricated using two polydimethylsiloxane (PDMS) layers coated with silver nanowires and reduced graphene oxide films, with microstructures formed by the polystyrene spheres between the layers. The highly sensitive sensor was able to distinguish the throat vibrations of five different spoken words with an accuracy of 96% using an artificial neural network.
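    The three-classifier comparison described for the typing wristband can be sketched with scikit-learn. The synthetic features below are a stand-in for the piezoelectric typing data, so accuracies on this toy set will not match the reported results:

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Toy stand-in: 5 finger classes, 8 features (e.g. per-channel peak amplitude)
    X = np.concatenate([rng.normal(c, 0.5, (40, 8)) for c in range(5)])
    y = np.repeat(np.arange(5), 40)
    Xtr, Xte, ytr, yte = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)

    classifiers = {
        "KNN": KNeighborsClassifier(n_neighbors=5),
        "DT": DecisionTreeClassifier(random_state=0),
        "SVM": SVC(kernel="rbf", C=1.0),
    }
    scores = {}
    for name, clf in classifiers.items():
        clf.fit(Xtr, ytr)                 # train on the held-in split
        scores[name] = clf.score(Xte, yte)  # mean accuracy on the held-out split
    print(scores)
    ```

    Holding the train/test split fixed across the three models, as here, is what makes the accuracy comparison between KNN, DT, and SVM fair.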