
    Exploitation of multiplayer interaction and development of virtual puppetry storytelling using gesture control and stereoscopic devices

    With the rapid development of human-computer interaction technologies, the new media generation demands novel learning experiences with natural interaction and immersion. Since digital storytelling is a powerful pedagogical tool for young children, in this paper we design an immersive storytelling environment that allows multiple players to use natural hand gestures to manipulate virtual puppetry and assist narration. A set of multimodal interaction techniques is presented for a hybrid user interface that integrates existing 3D visualization and interaction devices, including head-mounted displays and a depth motion sensor. In this system, young players can intuitively use hand gestures to manipulate virtual puppets to perform a story and interact with props in a virtual stereoscopic environment. We conducted a user experiment with four young children for pedagogical evaluation, as well as a system acceptability and interactivity evaluation with postgraduate students. The results show that our framework has great potential to stimulate young children's learning abilities through collaborative tasks. In a direct comparison, the stereoscopic head-mounted display outperformed the traditional monoscopic display.

    Hand Gestures Recognition using Thermal Images

    Master's thesis in Information and Communication Technology (IKT590). Hand gesture recognition is important in a variety of applications, including medical systems and assistive technologies, human-computer interaction, human-robot interaction, industrial automation, virtual environment control, sign language translation, crisis and disaster management, entertainment and computer games, and robotics. RGB cameras are usually used for most of these applications; however, their performance is limited, especially in low-light conditions, and it is challenging to accurately classify hand gestures in the dark. In this thesis, we propose robust hand gesture recognition based on high-resolution thermal imaging. The thermal images are captured using a FLIR Lepton 3.5, a high-resolution thermal camera with a resolution of 160×120 pixels. We then feed the captured thermal images to a deep CNN model to accurately classify the hand gestures. We evaluate the proposed model against benchmark models in terms of accuracy as well as inference time when deployed on edge computing devices such as the Raspberry Pi 4 Model B and the NVIDIA Jetson AGX Xavier.
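The thermal-image pipeline described above (capture a 160×120 frame, pass it through convolution, activation, and pooling, then score gesture classes) can be sketched as a minimal forward pass. This is an illustrative NumPy sketch with random weights, not the thesis's actual CNN; the kernel sizes and the 10-class output are assumptions.

```python
import numpy as np

def conv2d(x, kernel, stride=1):
    """Naive valid 2-D convolution over a single-channel image."""
    kh, kw = kernel.shape
    oh = (x.shape[0] - kh) // stride + 1
    ow = (x.shape[1] - kw) // stride + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = x[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping 2x2 max pooling."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

def classify(frame, kernels, weights):
    """Conv -> ReLU -> pool -> flatten -> linear scores, one per gesture class."""
    maps = [max_pool(np.maximum(conv2d(frame, k), 0)) for k in kernels]
    feats = np.concatenate([m.ravel() for m in maps])
    return feats @ weights

rng = np.random.default_rng(0)
frame = rng.random((120, 160))                     # one 160x120 thermal frame
kernels = [rng.standard_normal((5, 5)) for _ in range(4)]
# each pooled feature map is 58x78, so 4 * 58 * 78 features feed the classifier
weights = rng.standard_normal((4 * 58 * 78, 10))   # 10 hypothetical gesture classes
scores = classify(frame, kernels, weights)
print(scores.shape)  # (10,)
```

In a real deployment the kernels and weights would be learned, and a framework such as PyTorch or TensorFlow Lite would be used for efficient inference on the edge devices mentioned in the abstract.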

    Hand gesture-based interactive puppetry system to assist storytelling for children

    © 2016 The Author(s). Digital techniques have been used to assist narrative and storytelling, especially in many pedagogical practices. With the rapid development of HCI techniques, young children, saturated with digital media in their daily lives, demand more interactive learning methods and meaningful immersive learning experiences. In this paper, we propose a novel hand gesture-based puppetry storytelling system that provides a more intuitive and natural human-computer interaction method for young children to develop narrative ability in a virtual story world. Depth motion sensing and hand gesture control technology are utilized to implement user-friendly interaction. Young players can intuitively use hand gestures to manipulate a virtual puppet to perform a story and interact with different items in the virtual environment to assist narration. Based on the evaluation results, this novel digital storytelling system shows positive pedagogical effects on children's narrating ability as well as on their cognitive and motor coordination competencies. The usability of the system was preliminarily examined in our test, and the results showed that young children can benefit from playing with Puppet Narrator.

    Hand Gesture Recognition System under Complex Background Using Spatio-Temporal Analysis

    The interaction between human and machine can be achieved by implementing human gesture applications, a field known as human-computer interaction (HCI). It has been under development for many years, and such applications are now easy to find in the real world, e.g. in virtual reality, games, robotics, and sign language recognition. Gesture recognition has the potential to offer a natural way of communication between humans and machines, and hand gesture recognition is the most popular approach among human gesture applications. Many hand gesture recognition methods still struggle to recognize gestures accurately in uncontrolled environments, especially against complex backgrounds. This study offers a method that handles complex backgrounds in hand gesture recognition by analyzing patterns in the spatio-temporal domain, significantly reducing the irregular noise a complex background introduces. The experiments were conducted on 180 self-recorded videos covering a total of 15 gestures. The results show that the proposed method recognizes the 15 hand gestures with an accuracy of 98.33%.
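One standard spatio-temporal trick for suppressing a complex but static background, which the abstract's approach is broadly in the spirit of, is to model each pixel by its temporal median over a window of frames: cluttered background cancels out, leaving only the moving hand. A minimal NumPy sketch (the threshold and synthetic data are assumptions, not the paper's method):

```python
import numpy as np

def motion_mask(frames, thresh=0.1):
    """Temporal-median background model: per-pixel median over time absorbs
    static clutter; pixels deviating from it are flagged as motion."""
    frames = np.asarray(frames, dtype=float)
    background = np.median(frames, axis=0)   # per-pixel temporal median
    return np.abs(frames - background) > thresh

rng = np.random.default_rng(1)
clutter = rng.random((60, 80))               # complex but static background
frames = np.stack([clutter] * 10)
frames[5, 20:30, 30:40] += 1.0               # a bright "hand" passing in frame 5
mask = motion_mask(frames)
print(mask[5, 20:30, 30:40].all(), mask[0].any())  # True False
```

Real systems typically follow such a mask with morphological cleanup and then classify the segmented hand region, e.g. with OpenCV's background-subtraction utilities.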

    Puppet Narrator: utilizing motion sensing technology in storytelling for young children

    Using avatars in storytelling to assist narration has proved beneficial in promoting creativity, collaboration, and intimacy among young children. The development of novel Human-Computer Interaction (HCI) techniques provides new possibilities to explore the training aspects of storytelling by creating new ways of interacting. In this paper, we design and develop a novel digital puppetry storytelling system for young children, Puppet Narrator, utilizing depth motion sensing technology as the HCI method. Rather than merely narrating orally, children can use hand gestures to play with a virtual puppet and manipulate it to interact with items in a virtual environment to assist narration. Under this novel pattern of interaction, children's narrative ability can be trained, and their cognitive and motor coordination competencies can be nurtured.
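The core interaction loop such systems describe, recognized hand gestures driving puppet actions in the story world, amounts to a gesture-to-action dispatch. A minimal sketch; the gesture labels and puppet states below are hypothetical, not taken from Puppet Narrator:

```python
# Hypothetical gesture labels (as a depth sensor might report them) mapped
# to puppet actions; the state names are illustrative only.
ACTIONS = {
    "open_palm": lambda puppet: puppet.update(state="waving"),
    "fist":      lambda puppet: puppet.update(state="grabbing"),
    "point":     lambda puppet: puppet.update(state="walking"),
}

def drive_puppet(gesture, puppet):
    """Dispatch a recognized hand gesture to the matching puppet action;
    unrecognized gestures leave the puppet idle."""
    ACTIONS.get(gesture, lambda p: p.update(state="idle"))(puppet)
    return puppet["state"]

puppet = {"state": "idle"}
print(drive_puppet("fist", puppet))     # grabbing
print(drive_puppet("unknown", puppet))  # idle
```

A table-driven mapping like this keeps the recognizer and the story logic decoupled, so new gestures or puppet behaviors can be added without touching either side.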

    Development Prototype Design of Virtual Assembly Application-Based Leap Motion

    Innovation in design engineering practice is vital to manufacturing in an increasingly competitive global market. Prototyping and evaluation are inseparable from the design process, yet building even a single physical prototype is expensive and time-consuming, so Virtual Reality (VR) technology is needed to let industry make decisions quickly and accurately. VR couples a human with a computer environment visually and through touch and hearing, so that the user seems to enter a virtual world; the goal is for users to interact via hand movements with what is displayed on the computer screen, or with virtual content added to the real world. VR is required for simulations involving intensive interaction, such as prototype assembly, better known as Virtual Assembly. The Virtual Assembly concept was developed as the ability to assemble a realistic representation of the physical model, i.e. 3D models in CAD software, by simulating the natural movement of the human hand. Here, the Leap Motion (accuracy of 0.01 mm) replaces the Microsoft Kinect (accuracy of 1.5 cm) and the flex-sensor motion glove (accuracy of 1°) used in several previous studies. The Leap Motion controller captures every hand movement, which is then processed and integrated with 3D models in CAD software. In the resulting virtual assembly simulation, hand gestures detected by the Leap Motion can translate, rotate, and zoom assembly parts and add assembly constraints. The system can also perform mouse functions (left-click, middle-click, right-click, and moving the cursor) within the CAD software's virtual assembly simulation.
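The part-manipulation step described above, turning tracked hand motion into translation and rotation of a CAD part, reduces to applying the palm's frame-to-frame displacement and orientation change to the part's pose. A minimal NumPy sketch; it does not use the actual Leap Motion API, and the coordinate values are hypothetical sensor readings:

```python
import numpy as np

def translate_part(part_pos, palm_prev, palm_now, scale=1.0):
    """Shift a part by the palm's displacement between two tracked frames
    (all positions assumed to share one coordinate frame)."""
    delta = (np.asarray(palm_now, float) - np.asarray(palm_prev, float)) * scale
    return np.asarray(part_pos, float) + delta

def rotate_about_z(point, angle_rad):
    """Rotate a part vertex about the z axis, e.g. by the hand's roll change."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    x, y, z = point
    return np.array([c * x - s * y, s * x + c * y, z])

pos = translate_part([0.0, 0.0, 0.0], [10, 20, 30], [12, 20, 25])
print(pos)  # [ 2.  0. -5.]
```

In the actual system, these transforms would be fed to the CAD package's API each frame, with a gesture classifier deciding whether the current hand pose means "translate", "rotate", or "add constraint".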

    American Sign Language Gesture Recognition using Motion Tracking Gloves in VR

    Gesture recognition has become a topic of great interest as it continues to advance the capabilities of human-computer interaction. Research has shown that related technologies have the potential to enable highly accessible user interfaces, allowing users with various limitations to use applications more intuitively. This thesis contributes to this research by introducing a novel approach to recognizing American Sign Language (ASL) hand gestures in virtual reality (VR) using motion tracking gloves. As a proof of concept, an application was developed that recognizes 34 ASL hand gestures performed by a user navigating a tutorial-based environment. The application was evaluated through a user study to determine the effectiveness of the approach and identify possible improvements. The hope is that this approach can be expanded into a range of applications aimed at propagating the use of ASL and improving the lives of those who use it regularly.
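Glove-based recognition of static signs is often done by matching the glove's joint-angle vector against stored per-gesture templates. A minimal nearest-neighbour sketch; the labels, five-element finger-curl vectors, and threshold-free matching are illustrative assumptions, not the thesis's method:

```python
import numpy as np

def classify_gesture(sample, templates):
    """Return the template label whose joint-angle vector is closest
    (Euclidean distance) to the glove sample."""
    best, best_dist = None, float("inf")
    for label, tmpl in templates.items():
        d = np.linalg.norm(np.asarray(sample, float) - np.asarray(tmpl, float))
        if d < best_dist:
            best, best_dist = label, d
    return best

# Hypothetical finger-curl templates (0 = extended, 1 = fully curled).
templates = {
    "A": [0.9, 0.9, 0.9, 0.9, 0.1],  # four fingers curled, thumb alongside
    "B": [0.0, 0.0, 0.0, 0.0, 0.8],  # fingers extended, thumb tucked
}
print(classify_gesture([0.85, 0.95, 0.9, 0.88, 0.15], templates))  # A
```

Dynamic signs, which involve motion rather than a held pose, would additionally need a temporal model such as dynamic time warping or a recurrent classifier over the glove's sample stream.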

    Usability Studies In Virtual And Traditional Computer Aided Design Environments For Navigation

    A usability study was used to measure user performance and user preferences for a CAVE™ immersive stereoscopic virtual environment with a wand interface, compared directly with a non-stereoscopic traditional CAD workstation interface with keyboard and mouse. In both the CAVE™ and the adaptable technology environments, CrystalEyes glasses are used to produce a stereoscopic view, and an Ascension Flock of Birds tracking system tracks the positions of the user's head and the wand pointing device in 3D space. It is argued that these immersive technologies, including the use of gestures and hand movements, make a more natural interface possible in immersive virtual environments. Such an interface allows a more rapid and efficient set of actions to recognize geometry, interact within a spatial environment, find errors, and navigate through a virtual environment, and the wand interface provides a significantly improved means of interaction. This study quantitatively measures the differences in interaction compared with traditional human-computer interfaces, providing analysis via usability study methods for the navigation task, termed Benchmark 1. During testing, testers were given some time to "play around" in the CAVE™ environment for familiarity before undertaking a specific exercise. They were then instructed on the tasks to be completed and asked to work quickly without sacrificing accuracy. The research team timed each task and recorded activity on evaluation sheets for the navigation test. After completing the navigation scenario, the subjects were given a survey document and asked to check boxes communicating their subjective opinions.