
    Towards Naturalistic Interfaces of Virtual Reality Systems

    Interaction plays a key role in achieving a realistic experience in virtual reality (VR). Its realization depends on interpreting the intent of human motions to provide inputs to VR systems. Thus, understanding human motion from a computational perspective is essential to the design of naturalistic interfaces for VR. This dissertation studied three types of human motion in the context of VR: locomotion (walking), head motion, and hand motion. For locomotion, the dissertation presented a machine learning approach for developing a mechanical repositioning technique based on a 1-D treadmill for interacting with a unique new large-scale projective display, called the Wide-Field Immersive Stereoscopic Environment (WISE). The usability of the proposed approach was assessed through a novel user study that asked participants to pursue a rolling ball at variable speeds in a virtual scene. In addition, the dissertation studied the role of stereopsis in avoiding virtual obstacles while walking by asking participants to step over obstacles and gaps under both stereoscopic and non-stereoscopic viewing conditions in VR experiments. In terms of head motion, the dissertation presented a head gesture interface for interaction in VR that recognizes real-time head gestures on head-mounted displays (HMDs) using Cascaded Hidden Markov Models. Two experiments were conducted to evaluate the proposed approach: the first assessed its offline classification performance, while the second estimated the latency of the algorithm in recognizing head gestures. The dissertation also conducted a user study that investigated the effects of visual and control latency on teleoperation of a quadcopter using head motion tracked by a head-mounted display. As part of the study, a method for objectively estimating the end-to-end latency in HMDs was presented.
    For hand motion, the dissertation presented an approach that recognizes dynamic hand gestures, implementing a hand gesture interface for VR based on the static head gesture recognition algorithm. The proposed algorithm was evaluated offline in terms of its classification performance. A user study was conducted to compare the performance and usability of the head gesture interface, the hand gesture interface, and a conventional gamepad interface for answering Yes/No questions in VR. Overall, the dissertation makes two main contributions toward improving the naturalism of interaction in VR systems. First, the interaction techniques presented in the dissertation can be integrated directly into existing VR systems, offering end users of VR technology more choices for interaction. Second, the results of the user studies of the presented VR interfaces serve as guidelines for VR researchers and engineers designing future VR systems.
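    The abstract does not detail how end-to-end latency was estimated. One common objective approach is to record the tracked input motion and the corresponding displayed output, then take the lag that maximizes their cross-correlation as the latency. A minimal sketch under that assumption (the sine-wave head-yaw signal, sampling rate, and function name are illustrative, not the dissertation's actual method):

    ```python
    import math

    def estimate_latency_ms(x, y, sample_rate_hz):
        """Estimate latency as the lag (in ms) at which the displayed
        signal y best matches a delayed copy of the input signal x."""
        n = len(x)
        mx = sum(x) / n
        my = sum(y) / n
        best_lag, best_corr = 0, float("-inf")
        for lag in range(-(n - 1), n):
            # Cross-correlation of mean-centered signals at this lag.
            c = sum((x[i] - mx) * (y[i + lag] - my)
                    for i in range(n) if 0 <= i + lag < n)
            if c > best_corr:
                best_corr, best_lag = c, lag
        return 1000.0 * best_lag / sample_rate_hz

    # Synthetic check: a 1 Hz head-yaw sine sampled at 100 Hz,
    # with the "displayed" copy delayed by 5 samples (50 ms).
    rate = 100
    x = [math.sin(2 * math.pi * t / rate) for t in range(200)]
    delay = 5
    y = [0.0] * delay + x[:-delay]          # delayed, zero-padded copy
    print(estimate_latency_ms(x, y, rate))  # → 50.0
    ```

    In practice the displayed signal would be captured externally (e.g., with a camera or photodiode) so the measurement spans the full tracking-to-photon pipeline.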

    Real-Time Head Gesture Recognition on Head-Mounted Displays using Cascaded Hidden Markov Models

    Head gestures are a natural means of face-to-face communication between people, but the recognition of head gestures in the context of virtual reality, and the use of head gestures as an interface for interacting with virtual avatars and virtual environments, have rarely been investigated. In the current study, we present an approach for real-time head gesture recognition on head-mounted displays using Cascaded Hidden Markov Models. We conducted two experiments to evaluate our proposed approach. In experiment 1, we trained the Cascaded Hidden Markov Models and assessed the offline classification performance using collected head motion data. In experiment 2, we characterized the real-time performance of the approach by estimating the latency to recognize a head gesture from recorded real-time classification data. Our results show that the proposed approach is effective in recognizing head gestures. The method can be integrated into a virtual reality system as a head gesture interface for interacting with virtual worlds.
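    The abstract does not specify the internals of the cascade. In a typical two-stage design, a first stage segments motion from stillness and a second stage scores the segment against one HMM per gesture, picking the most likely. A minimal sketch of the second stage with discrete (quantized) observations and hypothetical "nod"/"shake" models — a real system would train the model parameters (e.g., with Baum-Welch) on recorded head motion rather than hand-pick them:

    ```python
    import math

    def _logsumexp(xs):
        m = max(xs)
        return m + math.log(sum(math.exp(v - m) for v in xs))

    def forward_log_likelihood(obs, start, trans, emit):
        """Log-likelihood of a discrete observation sequence under an HMM,
        computed with the forward algorithm in log space."""
        n_states = len(start)
        alpha = [math.log(start[s]) + math.log(emit[s][obs[0]])
                 for s in range(n_states)]
        for o in obs[1:]:
            alpha = [
                _logsumexp([alpha[sp] + math.log(trans[sp][s])
                            for sp in range(n_states)])
                + math.log(emit[s][o])
                for s in range(n_states)
            ]
        return _logsumexp(alpha)

    def classify(obs, models):
        """Pick the gesture whose HMM best explains the observations."""
        return max(models, key=lambda g: forward_log_likelihood(obs, *models[g]))

    # Observation symbols: 0 = pitch-dominant motion, 1 = yaw-dominant, 2 = still.
    # Each model: (start probs, transition matrix, emission matrix).
    models = {
        "nod":   ([0.9, 0.1], [[0.8, 0.2], [0.2, 0.8]],
                  [[0.7, 0.1, 0.2], [0.2, 0.1, 0.7]]),
        "shake": ([0.9, 0.1], [[0.8, 0.2], [0.2, 0.8]],
                  [[0.1, 0.7, 0.2], [0.1, 0.2, 0.7]]),
    }
    print(classify([0, 0, 2, 0, 0], models))  # → nod
    ```

    For real-time use, the classifier would run on a sliding window of recent head-motion samples, which is where the recognition latency measured in experiment 2 arises.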

    Towards Omnidirectional Immersion for ROV Teleoperation

    The use of omnidirectional cameras underwater is enabling many new and exciting applications in multiple fields. Among them, it will allow Remotely Operated Underwater Vehicles (ROVs) to be piloted directly by means of the images captured by omnidirectional cameras through virtual reality (VR) headsets. This immersive experience will extend the pilot's spatial awareness and reduce the usual orientation problems during missions. This paper presents this concept and illustrates it with the first experiments for achieving this purpose. This research was supported by the Spanish National Projects ARCHROV (Marine ARChaeology through HROV/AUV cooperation) under the agreement DPI2014-57746-C3-3-R and OMNIUS under the agreement CTM2013-46718-R, the Generalitat de Catalunya through the ACCIO/TecnioSpring program (TECSPR14-1-0050) (to N. Gracias), and "la Secretaria d'Universitats i Recerca del Departament d'Economia i Coneixement de la Generalitat de Catalunya" (to J. Bosch). https://doi.org/10.17979/spudc.978849749808
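    The paper's rendering pipeline is not described in the abstract, but a core step in presenting omnidirectional video in a VR headset is mapping the pilot's head orientation to the corresponding region of each equirectangular frame. A minimal sketch of that lookup (function name and resolution are illustrative):

    ```python
    def equirect_pixel(yaw_deg, pitch_deg, width, height):
        """Map a viewing direction (yaw: -180..180 deg, pitch: -90..90 deg)
        to the corresponding pixel of an equirectangular panorama."""
        u = (yaw_deg + 180.0) / 360.0      # 0..1 across the image width
        v = (90.0 - pitch_deg) / 180.0     # 0 at the top (pitch +90)
        x = min(int(u * width), width - 1)
        y = min(int(v * height), height - 1)
        return x, y

    # Looking straight ahead lands at the center of a 4K panorama.
    print(equirect_pixel(0.0, 0.0, 3840, 1920))  # → (1920, 960)
    ```

    A headset renderer samples a whole viewport of such directions per eye each frame; doing this on the GPU is what keeps the immersive view responsive to head motion.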

    Oculus Rift Application for Training Drone Pilots

    The research described in this paper focuses on a virtual reality headset system that integrates the Oculus Rift VR headset with a low-cost Unmanned Aerial Vehicle (UAV) to allow for drone teleoperation and telepresence using the Robot Operating System (ROS). We developed a system that allows the pilot to fly an AR Drone through natural head movements translated into a set of flight commands. The system is designed to be easy to use for the purposes of training drone pilots: the user simply moves their head, and these movements are translated to the quadrotor, which then turns in that direction. Altitude is adjusted using a Wii Nunchuck joystick. The user views a 2D video stream from the AR Drone, which is converted into a 3D image stream and presented on the Oculus Rift headset.
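    The exact head-to-command mapping is not given in the abstract. A plausible scheme, sketched below under that assumption, turns head yaw into a yaw-rate command with a dead zone so small involuntary head motions do not move the drone (the dead-zone width and gain are hypothetical tuning parameters, and a real system would publish the result as a ROS velocity message):

    ```python
    def head_to_command(yaw_deg, pitch_deg, dead_zone_deg=10.0, gain=0.02):
        """Translate head orientation into (yaw_rate, forward_speed) commands,
        ignoring angles inside the dead zone."""
        def shaped(angle):
            if abs(angle) < dead_zone_deg:
                return 0.0
            # Offset by the dead zone so the command ramps up from zero.
            return gain * (angle - dead_zone_deg * (1 if angle > 0 else -1))

        yaw_rate = shaped(yaw_deg)    # turn toward where the pilot looks
        forward = shaped(-pitch_deg)  # tilt head down to move forward
        return yaw_rate, forward

    # Looking 30 degrees right with a 10 degree dead zone yields a
    # positive yaw-rate command; a level head commands no motion.
    print(head_to_command(30.0, 0.0))
    print(head_to_command(5.0, 0.0))
    ```

    Subtracting the dead-zone width before applying the gain avoids a command discontinuity at the dead-zone boundary, which would otherwise make the drone twitch as the pilot's head crosses it.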

    Design and evaluation of a natural interface for remote operation of underwater robots

    Nowadays, an increasing need for intervention robotic systems can be observed in all kinds of hazardous environments. In all these intervention systems, the human expert continues to play a central role from the decision-making point of view. For instance, in underwater domains, when manipulation capabilities are required, only commercially available Remotely Operated Vehicles can be used, normally built on master-slave architectures that place all the responsibility on the pilot. Thus, the role played by human-machine interfaces represents a crucial point in current intervention systems. This paper presents a User Interface Abstraction Layer and introduces a new procedure to control an underwater robot vehicle through a new intuitive and immersive interface, which shows the user only the most relevant information about the current mission. Finally, some experiments were carried out to compare a traditional setup with the new procedure, demonstrating the reliability and feasibility of our approach. This research was partly supported by Spanish Ministry of Research and Innovation DPI2011-27977-C03 (TRITON Project).
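    The abstract does not describe the structure of the User Interface Abstraction Layer. The general idea of such a layer — a single mission-relevant view of vehicle state that each concrete front-end (traditional console, immersive headset) renders in its own way — can be sketched as follows; all field and class names here are hypothetical:

    ```python
    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    @dataclass
    class MissionView:
        """Only the most mission-relevant state, independent of any front-end."""
        depth_m: float
        heading_deg: float
        distance_to_target_m: float

    class OperatorInterface(ABC):
        """Abstraction layer: every concrete interface renders the same view."""
        @abstractmethod
        def render(self, view: MissionView) -> str: ...

    class ConsoleInterface(OperatorInterface):
        def render(self, view):
            return (f"depth={view.depth_m}m hdg={view.heading_deg} "
                    f"tgt={view.distance_to_target_m}m")

    class ImmersiveInterface(OperatorInterface):
        def render(self, view):
            # In a real system this would draw an HMD overlay; a label stands in.
            return f"[HMD] target {view.distance_to_target_m} m ahead"

    view = MissionView(depth_m=15.0, heading_deg=90.0, distance_to_target_m=3.5)
    print(ConsoleInterface().render(view))
    print(ImmersiveInterface().render(view))
    ```

    Keeping the front-ends behind one interface is also what makes the paper's comparison between the traditional setup and the new immersive procedure a fair one: both operate on the same underlying mission state.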

    A natural interface for remote operation of underwater robots

    Nowadays, an increasing need for intervention robotic systems can be observed in all kinds of hazardous environments. In all these intervention systems, the human expert continues to play a central role from the decision-making point of view. For instance, in underwater domains, when manipulation capabilities are required, only commercially available Remotely Operated Vehicles can be used, normally built on master-slave architectures that place all the responsibility on the pilot. Thus, the role played by human-machine interfaces represents a crucial point in current intervention systems. This paper presents a User Interface Abstraction Layer and introduces a new procedure to control an underwater robot vehicle through a new intuitive and immersive interface, which shows the user only the most relevant information about the current mission. We conducted an experiment and found that user preference and performance were highest in the immersive condition with joystick navigation. This research was partly supported by Spanish Ministry of Research and Innovation DPI2011-27977-C03 (TRITON Project).