
    Advanced information processing of MEMS motion sensors for gesture interaction


    MOCA: A Low-Power, Low-Cost Motion Capture System Based on Integrated Accelerometers

    Human-computer interaction (HCI) and virtual reality applications pose the challenge of enabling real-time interfaces for natural interaction. Gesture recognition based on body-mounted accelerometers has been proposed as a viable solution for translating patterns of movement associated with user commands, thus replacing point-and-click methods and other cumbersome input devices. On the other hand, cost and power constraints make the implementation of a natural and efficient interface suitable for consumer applications a critical task. Even though several gesture recognition solutions exist, their use in the HCI context has been poorly characterized. For this reason, in this paper we consider a low-cost, low-power wearable motion tracking system based on integrated accelerometers, called motion capture with accelerometers (MOCA), which we evaluated for navigation in virtual spaces. Recognition is based on a geometric algorithm that enables efficient and robust detection of rotational movements. Our objective is to demonstrate that such a low-cost, low-power implementation is suitable for HCI applications. To this purpose, we characterized the system from both a quantitative and a qualitative point of view. First, we performed static and dynamic assessments of movement recognition accuracy. Second, we evaluated the effectiveness of the user experience using a 3D game application as a test bed.
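The abstract does not spell out the geometric algorithm; a minimal sketch of one plausible approach, detecting a rotational movement from the tilt change of a 3-axis accelerometer (the function names and the 30° threshold are illustrative assumptions, not the paper's), might look like:

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate roll and pitch (radians) from a static 3-axis
    accelerometer reading, using gravity as the reference vector."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

def detect_rotation(samples, threshold=math.radians(30)):
    """Flag a rotational gesture when the tilt change across a window
    of (ax, ay, az) samples exceeds a threshold (hypothetical rule)."""
    r0, p0 = tilt_angles(*samples[0])
    r1, p1 = tilt_angles(*samples[-1])
    return abs(r1 - r0) > threshold or abs(p1 - p0) > threshold
```

A purely geometric test like this needs no training data, which is one reason such approaches suit low-power embedded targets.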

    Machine learning in 3D space gesture recognition

    The rapid increase in the development of robotic systems in controlled and uncontrolled environments calls for more natural interaction systems. One such interaction is gesture recognition. This paper presents a simple approach to gesture recognition in which hand movement in 3-dimensional space is used to write the letters of the English alphabet and produce the corresponding output on a screen or display device. The hardware components of the system are an MPU-6050 accelerometer, a microcontroller, and a Bluetooth module for wireless connection. For each letter of the alphabet, 20 data instances are recorded in raw form and then standardized using interpolation. The standardized data is fed as input to an SVM (Support Vector Machine) classifier to create a model, which is then used to classify future data instances in real time. Our method achieves a correct classification accuracy of 98.94% for English-alphabet hand gesture recognition. The primary objective of our approach is the development of a low-cost, low-power, and easily trained supervised gesture recognition system that identifies hand gesture movements efficiently and accurately. The experimental results are based on a single subject.
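The interpolation step is the key to feeding variable-length recordings into an SVM, which expects fixed-size feature vectors. A sketch of that standardization (the target length of 64 and the function name are assumptions; the paper does not state its resampling length) could be:

```python
import numpy as np

def standardize_length(samples, target_len=64):
    """Resample a variable-length recording of 3-axis accelerometer
    samples (shape [n, 3]) to a fixed length via linear interpolation,
    so every gesture instance yields an equal-sized feature vector."""
    samples = np.asarray(samples, dtype=float)
    src = np.linspace(0.0, 1.0, len(samples))       # original time base
    dst = np.linspace(0.0, 1.0, target_len)         # common time base
    resampled = np.column_stack(
        [np.interp(dst, src, samples[:, axis]) for axis in range(3)]
    )
    return resampled.ravel()                        # flatten for the classifier
```

The flattened vectors can then be stacked into a training matrix for an off-the-shelf SVM implementation such as scikit-learn's `SVC`.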

    Hand Motion Tracking System using Inertial Measurement Units and Infrared Cameras

    This dissertation presents a novel approach to develop a system for real-time tracking of the position and orientation of the human hand in three-dimensional space, using MEMS inertial measurement units (IMUs) and infrared cameras. This research focuses on the study and implementation of an algorithm to correct the gyroscope drift, which is a major problem in orientation tracking using commercial-grade IMUs. An algorithm to improve the orientation estimation is proposed. It consists of: 1.) Prediction of the bias offset error while the sensor is static, 2.) Estimation of a quaternion orientation from the unbiased angular velocity, 3.) Correction of the orientation quaternion utilizing the gravity vector and the magnetic North vector, and 4.) Adaptive quaternion interpolation, which determines the final quaternion estimate based upon the current conditions of the sensor. The results verified that the implementation of the orientation correction algorithm using the gravity vector and the magnetic North vector is able to reduce the amount of drift in orientation tracking and is compatible with position tracking using infrared cameras for real-time human hand motion tracking. Thirty human subjects participated in an experiment to validate the performance of the hand motion tracking system. The statistical analysis shows that the error of position tracking is, on average, 1.7 cm in the x-axis, 1.0 cm in the y-axis, and 3.5 cm in the z-axis. The Kruskal-Wallis tests show that the orientation correction algorithm using gravity vector and magnetic North vector can significantly reduce the errors in orientation tracking in comparison to fixed offset compensation. 
Statistical analyses show that the orientation correction algorithm using the gravity vector and the magnetic North vector and the on-board Kalman-based orientation filter produced orientation errors that were not significantly different in the Euler angles Phi, Theta, and Psi, with p-values of 0.632, 0.262, and 0.728, respectively. The proposed orientation correction algorithm represents a contribution to the emerging approaches for obtaining reliable orientation estimates from MEMS IMUs. The development of a hand motion tracking system using IMUs and infrared cameras in this dissertation enables future improvements in natural human-computer interaction within 3D virtual environments.
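Step 3 of the abstract's algorithm (correcting the orientation quaternion with the gravity vector) can be sketched as a complementary tilt correction: predict where gravity should appear in the sensor frame given the current quaternion, compare with the measured accelerometer direction, and rotate by a fraction of the discrepancy. This is a simplified illustration, not the dissertation's exact formulation, and it omits the magnetic-North (yaw) correction:

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of quaternions in [w, x, y, z] order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def rotate(q, v):
    """Rotate 3-vector v by unit quaternion q (sandwich product)."""
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, np.concatenate(([0.0], v))), qc)[1:]

def gravity_correction(q, accel, gain=1.0):
    """Tilt-correct sensor-to-world quaternion q: compare the gravity
    direction predicted from q with the measured accelerometer vector
    and apply a fraction (gain) of the corrective rotation."""
    a = np.asarray(accel, dtype=float)
    a = a / np.linalg.norm(a)
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])
    g_pred = rotate(qc, np.array([0.0, 0.0, 1.0]))  # predicted gravity, sensor frame
    axis = np.cross(a, g_pred)                      # correction axis
    n = np.linalg.norm(axis)
    if n < 1e-9:                                    # already aligned
        return q
    angle = gain * np.arcsin(np.clip(n, -1.0, 1.0))
    half = 0.5 * angle
    dq = np.concatenate(([np.cos(half)], np.sin(half) * axis / n))
    out = quat_mul(q, dq)
    return out / np.linalg.norm(out)                # keep the quaternion unit-length
```

With `gain < 1` this becomes the usual complementary-filter behavior: the accelerometer slowly pulls the gyroscope-integrated orientation back toward the true vertical, countering drift.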

    Using Hidden Markov Models to Segment and Classify Wrist Motions Related to Eating Activities

    Advances in body sensing and mobile health technology have created new opportunities for empowering people to take a more active role in managing their health. Measurements of dietary intake are commonly used for the study and treatment of obesity. However, the most widely used tools rely upon self-report and require considerable manual effort, leading to underreporting of consumption, non-compliance, and discontinued use over the long term. We are investigating the use of wrist-worn accelerometers and gyroscopes to automatically recognize eating gestures. In order to improve recognition accuracy, we studied the sequential dependency of actions during eating. In chapter 2 we first undertook the task of finding a set of wrist motion gestures that was small yet descriptive enough to model the actions performed by an eater during consumption of a meal. We found a set of four actions: rest, utensiling, bite, and drink; any alternative gesture is referred to as the "other" gesture. The stability of the gesture definitions was evaluated using an inter-rater reliability test. Later, in chapter 3, 25 meals were hand labeled and used to study the existence of sequential dependence among the gestures. To study this, three types of classifiers were built: 1) a K-nearest neighbor (KNN) classifier, which uses no sequential context, 2) a hidden Markov model (HMM), which captures the sequential context of sub-gesture motions, and 3) HMMs that model inter-gesture sequential dependencies. We built first-order through sixth-order HMMs to evaluate the usefulness of increasing amounts of sequential dependence for recognition. The first two were our baseline algorithms. We found that adding knowledge of the sequential dependence of gestures achieved an accuracy of 96.5%, an improvement of 20.7% and 12.2% over the KNN and the sub-gesture HMM, respectively.
    Lastly, in chapter 4, we automatically segmented the continuous wrist motion signal and assessed classification performance for each of the three classifiers. Again, knowledge of sequential dependence enhances the recognition of gestures in unsegmented data, achieving 90% accuracy and improving by 30.1% and 18.9% over the KNN and the sub-gesture HMM, respectively.
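The inter-gesture sequential dependence is exploited at decode time: given per-frame scores for each gesture (e.g. from the sub-gesture models) and gesture-to-gesture transition probabilities, a Viterbi pass picks the most likely label sequence. A generic first-order Viterbi decoder (a sketch; the dissertation's exact models, features, and probabilities are not reproduced here) can be written as:

```python
import numpy as np

# The abstract's gesture inventory; the matrices below use generic indices.
STATES = ["rest", "utensiling", "bite", "drink", "other"]

def viterbi(obs_loglik, log_trans, log_prior):
    """First-order Viterbi decoding.

    obs_loglik : [T, S] per-frame log-likelihood of each gesture
    log_trans  : [S, S] log P(next gesture | current gesture)
    log_prior  : [S]    log prior over the first gesture
    Returns the most likely state index sequence of length T.
    """
    T, S = obs_loglik.shape
    delta = log_prior + obs_loglik[0]               # best score ending in each state
    back = np.zeros((T, S), dtype=int)              # backpointers
    for t in range(1, T):
        scores = delta[:, None] + log_trans         # [from, to]
        back[t] = np.argmax(scores, axis=0)
        delta = scores[back[t], np.arange(S)] + obs_loglik[t]
    path = [int(np.argmax(delta))]
    for t in range(T - 1, 0, -1):                   # trace back the best path
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

The transition matrix is where the sequential knowledge lives: for example, a high P(bite | utensiling) encodes the natural ordering of eating actions that the KNN baseline cannot use.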

    MEM Based Hand Gesture Controlled Wireless Robot

    A MEMS-based gesture-controlled robot is a robot that can be driven by hand gestures rather than ordinary switches or a keypad. In the future there is a chance of making robots that can interact with humans in a natural manner; hence our interest is in hand-motion-based gesture interfaces. An innovative algorithm for gesture recognition is developed to identify the distinct action signs made through hand movement. A MEMS sensor was used to carry this out, along with an ultrasonic sensor for reliable operation. To fulfill our requirements, a program was written and executed on a microcontroller system. The experimental results prove that our gesture algorithm is very effective; it also enhances the natural mode of interaction and is built from a simple hardware circuit.
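The paper does not give its recognition rule, but the usual pattern for such robots is a threshold map from hand tilt to drive commands. A minimal illustrative sketch (the thresholds, axis conventions, and command names are assumptions):

```python
def gesture_to_command(ax, ay, threshold=0.4):
    """Map the tilt of a hand-mounted MEMS accelerometer (ax, ay in g,
    roughly -1..1) to a robot drive command. Purely illustrative:
    the paper's actual thresholds and gesture set are not published."""
    if ax > threshold:
        return "FORWARD"
    if ax < -threshold:
        return "BACKWARD"
    if ay > threshold:
        return "RIGHT"
    if ay < -threshold:
        return "LEFT"
    return "STOP"       # hand held level: no movement
```

On the real system the resulting command would be sent over a wireless link to the robot's motor driver, with the ultrasonic sensor vetoing forward motion near obstacles.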