Designing 3D scenarios and interaction tasks for immersive environments
Immersive reality, such as virtual and mixed reality, is today one of the most attractive research fields. Virtual Reality (VR) has huge potential in scientific and educational domains, as it provides users with real-time interaction and manipulation. A key challenge in immersive technologies is providing the user with a high level of immersive sensation. Wearable technologies play a key role in enhancing the immersive sensation and the degree of embodiment in virtual and mixed reality interaction tasks.
This project report presents an application study in which the user interacts with virtual objects, such as grabbing objects and opening or closing doors and drawers, while wearing a sensory cyberglove developed in our lab (Cyberglove-HT). Furthermore, it presents the development of a methodology for inertial measurement unit (IMU)-based gesture recognition.
The interaction tasks and 3D immersive scenarios were designed in Unity 3D. Additionally, we developed inertial sensor-based gesture recognition by employing a Long Short-Term Memory (LSTM) network. To distinguish the effect of wearable technologies on the user experience in immersive environments, we conducted an experimental study comparing the Cyberglove-HT to standard VR controllers (HTC Vive Controller). The quantitative and subjective results indicate that the Cyberglove-HT enhanced the immersive sensation and self-embodiment. A publication resulted from this work [1], which was developed in the framework of the R&D project Human Tracking and Perception in Dynamic Immersive Rooms (HTPDI)
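The abstract above mentions IMU-based gesture recognition with an LSTM network. As a rough illustration only (not the authors' implementation), a single LSTM cell stepping over a sequence of IMU samples can be sketched in plain NumPy; the channel count, hidden size, and random weights are all hypothetical:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step over one IMU sample x (e.g. 6 channels: 3-axis accel + gyro).
    W, U, b stack the input/recurrent weights and biases for the input,
    forget, candidate, and output gates."""
    z = W @ x + U @ h + b                        # stacked pre-activations, shape (4*hidden,)
    n = h.shape[0]
    i = 1 / (1 + np.exp(-z[:n]))                 # input gate
    f = 1 / (1 + np.exp(-z[n:2*n]))              # forget gate
    g = np.tanh(z[2*n:3*n])                      # candidate cell state
    o = 1 / (1 + np.exp(-z[3*n:]))               # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Toy run: summarise a 20-step IMU sequence by the final hidden state,
# which a classifier head would then map to a gesture label.
rng = np.random.default_rng(0)
n_in, n_hid = 6, 8                               # illustrative sizes
W = rng.standard_normal((4 * n_hid, n_in)) * 0.1
U = rng.standard_normal((4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.standard_normal((20, n_in)):        # 20 time steps of fake IMU data
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)                                   # (8,)
```

In practice a framework such as PyTorch or Keras would provide the LSTM layer and training loop; the sketch only shows the recurrence itself.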
Review of Wearable Devices and Data Collection Considerations for Connected Health
Wearable sensor technology has gradually extended its usability into a wide range of well-known applications. Wearable sensors can typically assess and quantify the wearer's physiology and are commonly employed for human activity detection and quantified self-assessment. Wearable sensors are increasingly utilised to monitor patient health, rapidly assist with disease diagnosis, and help predict and often improve patient outcomes. Clinicians use various self-report questionnaires and well-known tests to report patient symptoms and assess their functional ability. These assessments are time-consuming and costly and depend on subjective patient recall. Moreover, measurements may not accurately demonstrate the patient's functional ability whilst at home. Wearable sensors can be used to detect and quantify specific movements in different applications. The volume of data collected by wearable sensors during long-term assessment of ambulatory movement can become immense. This paper discusses current techniques used to track and record various human body movements, as well as techniques used to measure activity and sleep from long-term data collected by wearable technology devices.
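The review above discusses measuring activity from long-term accelerometer data. A minimal, actigraphy-style sketch (a simplified measure, not any specific device's algorithm; sampling rate and epoch length are illustrative) sums the deviation of the acceleration magnitude from 1 g per epoch:

```python
import numpy as np

def activity_counts(accel, fs=50, epoch_s=60):
    """Per-epoch activity from a 3-axis accelerometer stream sampled at fs Hz.
    accel has shape (samples, 3) in units of g; each epoch's count is the sum
    of |magnitude - 1 g| over its samples, so a still sensor scores ~0."""
    svm = np.abs(np.linalg.norm(accel, axis=1) - 1.0)
    n = fs * epoch_s
    usable = len(svm) // n * n                    # drop the trailing partial epoch
    return svm[:usable].reshape(-1, n).sum(axis=1)

rng = np.random.default_rng(2)
rest = np.tile([0.0, 0.0, 1.0], (3000, 1))            # 1 min lying still
move = rest + 0.3 * rng.standard_normal((3000, 3))    # 1 min of movement noise
counts = activity_counts(np.vstack([rest, move]), fs=50, epoch_s=60)
print(counts)                                         # first epoch ~0, second much larger
```

Real devices apply band-pass filtering and proprietary count definitions on top of a scheme like this.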
A Sign Language to Text Converter Using Leap Motion
This paper presents a prototype that can convert sign language into text. A Leap Motion controller was utilised as an interface for hand motion tracking without the need to wear any external instruments. Three recognition techniques were employed to measure the performance of the prototype, namely Geometric Template Matching, Artificial Neural Networks, and Cross-Correlation. The 26 letters of the American Sign Language alphabet were chosen for training and testing the proposed prototype. The experimental results showed that Geometric Template Matching achieved the highest recognition accuracy compared to the other recognition techniques.
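Two of the techniques named above, template matching and cross-correlation, share the same core idea: compare a captured motion trace against stored per-letter templates and pick the best match. A minimal sketch under assumed inputs (the templates and traces here are synthetic 1-D curves, not real Leap Motion data):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-length 1-D traces;
    1.0 means identical shape up to offset and scale."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.mean(a * b))

def classify(trace, templates):
    """Return the label whose stored template correlates best with the trace."""
    return max(templates, key=lambda label: ncc(trace, templates[label]))

t = np.linspace(0, 2 * np.pi, 50)
templates = {"circle": np.sin(t), "line": t}     # hypothetical stored gesture traces
sample = np.sin(t) + 0.05 * np.random.default_rng(1).standard_normal(50)
print(classify(sample, templates))               # circle
```

A real system would first resample each Leap Motion trajectory to a fixed length and normalize its position and scale before comparison.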
Towards Developing an Effective Hand Gesture Recognition System for Human Computer Interaction: A Literature Survey
Gesture recognition is the mathematical analysis of the movement of body parts (hand/face) performed with the help of a computing device. It helps computers understand human body language and builds a more powerful link between humans and machines. Many research works have been developed in the field of hand gesture recognition. Each has achieved a different recognition accuracy on a different hand gesture dataset; however, most groups lack the insight needed to carry these achievements over to real-time datasets. Under such circumstances, it is essential to have a complete knowledge of hand gesture recognition methods, their strengths and weaknesses, and the development criteria as well. Many reports declare their work to be better, but a complete comparative analysis is lacking. In this paper, we provide a study of representative techniques for hand gesture recognition and their recognition methods, and also present a brief introduction to hand gesture recognition. The main objective of this work is to highlight the position of various recognition techniques, which can indirectly help in developing new techniques for solving the issues in hand gesture recognition systems. Moreover, we present a concise description of the recognition methods of hand gesture recognition systems and directions for future research.
Dynamic Hand Gesture Recognition of Arabic Sign Language using Hand Motion Trajectory Features
In this paper, we propose a system for dynamic hand gesture recognition of Arabic Sign Language. The proposed system takes the dynamic gesture video stream as input, extracts the hand area, and computes hand motion features, then uses these features to recognize the gesture. The system identifies the hand blob using the YCbCr color space to detect the skin color of the hand. The system classifies the input pattern based on a correlation-coefficient matching technique. The significance of the system is its simplicity and its ability to recognize gestures independently of the skin color and physical structure of the performers. The experimental results show a gesture recognition rate of 85.6% over 20 different signs performed by 8 different signers.
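The pipeline above starts by segmenting the hand blob via skin color in YCbCr space. A minimal sketch of that first stage, using the ITU-R BT.601 RGB-to-YCbCr conversion and a commonly cited rule-of-thumb threshold (the exact bounds vary between papers and are not taken from this one):

```python
import numpy as np

def skin_mask(rgb):
    """Flag likely-skin pixels: convert RGB to Cb/Cr (ITU-R BT.601) and keep
    pixels inside a fixed chrominance box (77 <= Cb <= 127, 133 <= Cr <= 173).
    Because the test uses only chrominance, it is largely brightness-independent."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = (200, 140, 120)    # skin-toned pixel
img[1, 1] = (0, 120, 255)      # blue pixel
print(skin_mask(img))          # True at (0,0), False elsewhere
```

The full system would then take the largest connected component of the mask as the hand blob and track its centroid across frames to build the motion trajectory features.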