499 research outputs found

    Designing 3D scenarios and interaction tasks for immersive environments

    Today, immersive technologies such as virtual and mixed reality are among the most attractive research fields. Virtual Reality (VR) has huge potential in scientific and educational domains, providing users with real-time interaction and manipulation. The key concern in immersive technologies is to provide the user with a high level of immersive sensation, which is one of the main challenges in this field. Wearable technologies play a key role in enhancing the immersive sensation and the degree of embodiment in virtual and mixed reality interaction tasks. This project report presents an application study in which the user interacts with virtual objects, such as grabbing objects and opening or closing doors and drawers, while wearing a sensory cyberglove developed in our lab (Cyberglove-HT). Furthermore, it presents the development of a methodology for inertial measurement unit (IMU)-based gesture recognition. The interaction tasks and 3D immersive scenarios were designed in Unity 3D. Additionally, we developed inertial sensor-based gesture recognition by employing a Long Short-Term Memory (LSTM) network. To distinguish the effect of wearable technologies on the user experience in immersive environments, we conducted an experimental study comparing the Cyberglove-HT to standard VR controllers (HTC Vive Controller). The quantitative and subjective results indicate that we were able to enhance the immersive sensation and self-embodiment with the Cyberglove-HT. A publication resulted from this work [1], which was developed in the framework of the R&D project Human Tracking and Perception in Dynamic Immersive Rooms (HTPDI
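    The abstract above does not specify the network architecture, but the core of any LSTM-based IMU gesture recognizer is the recurrent cell that an accelerometer stream is fed through, one sample at a time. The following is a minimal, illustrative sketch of a single-feature LSTM cell step in pure Python; the weight values and the toy input sequence are hypothetical, not taken from the thesis.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM cell step over a single scalar IMU feature.

    W maps each gate name to (input weight, hidden weight, bias) for
    the input (i), forget (f), output (o) and candidate (g) gates.
    """
    i = sigmoid(W["i"][0] * x + W["i"][1] * h_prev + W["i"][2])
    f = sigmoid(W["f"][0] * x + W["f"][1] * h_prev + W["f"][2])
    o = sigmoid(W["o"][0] * x + W["o"][1] * h_prev + W["o"][2])
    g = math.tanh(W["g"][0] * x + W["g"][1] * h_prev + W["g"][2])
    c = f * c_prev + i * g          # new cell state
    h = o * math.tanh(c)            # new hidden state
    return h, c

# Run a toy accelerometer sequence through the cell with dummy weights.
weights = {k: (0.5, 0.5, 0.0) for k in ("i", "f", "o", "g")}
h, c = 0.0, 0.0
for sample in [0.1, 0.9, 0.4, -0.2]:
    h, c = lstm_step(sample, h, c, weights)
```

    In a trained recognizer, the final hidden state (or a sequence of them) would be passed to a classification layer over the gesture vocabulary; in practice one would use a deep-learning framework rather than hand-rolling the cell.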

    VR-CHEM Developing a virtual reality interface for molecular modelling

    VR-CHEM is a prototype virtual reality molecular modelling program with a modern 3D user interface. In this thesis, the author discusses the research behind the development of the prototype, provides a detailed description of the program and its features, and reports on the user tests. The research includes a review of earlier programs in the same category that have appeared in the literature; some of these are related to chemistry and molecular modelling, while others focus on 3D input techniques. The prototype contributes by exploring the design of the user interface and how it affects productivity in this category of programs. The prototype is subjected to a pilot user test to evaluate what further developments are required. Based on this, the thesis proposes that 3D interfaces, while capable of several unique tasks, are yet to overcome significant drawbacks such as limited accuracy and precision. It also suggests that virtual reality can aid spatial understanding, but that virtual hands and controllers are far inferior to real hands for even basic tasks due to a lack of tactile feedback.

    Exploring Gesture Recognition in the Virtual Reality Space

    This thesis presents two novel modifications to a gesture recognition system for virtual reality devices and applications. It evaluates users' movements in VR when presented with gestures and uses this information to develop a continuous tracking system that can detect the start and end of gestures. It also expands on previous work with gestures in games through an implementation of an adaptive database system that has been shown to improve accuracy rates. The database allows users to start using the system immediately, with no prior training, and improves accuracy rates as they spend more time in the game. Furthermore, both the explicit and continuous recognition systems are evaluated through user-based studies. The results from these studies show promise for the usability of gesture-based interaction systems for VR devices in the future. They also suggest that, for the use case of games, a continuous system could be too cumbersome for users.
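    The thesis does not publish its segmentation algorithm, but the idea of detecting the start and end of gestures in a continuous stream is commonly implemented by thresholding a motion-energy signal with hysteresis. The sketch below assumes a stream of hand-speed samples and hypothetical threshold values; it is an illustration of the technique, not the system described above.

```python
def segment_gestures(speeds, start_thresh=0.5, stop_thresh=0.2, min_len=3):
    """Return (start, end) index pairs of candidate gestures in a
    stream of hand-speed samples. Using a higher start threshold than
    stop threshold (hysteresis) prevents jitter around a single
    threshold from splitting one gesture into many."""
    segments, start = [], None
    for i, s in enumerate(speeds):
        if start is None and s >= start_thresh:
            start = i                      # gesture begins
        elif start is not None and s < stop_thresh:
            if i - start >= min_len:       # ignore tiny blips
                segments.append((start, i))
            start = None
    if start is not None and len(speeds) - start >= min_len:
        segments.append((start, len(speeds)))
    return segments

speeds = [0.1, 0.1, 0.7, 0.9, 0.8, 0.6, 0.1, 0.1, 0.6, 0.7, 0.8, 0.1]
print(segment_gestures(speeds))  # → [(2, 6), (8, 11)]
```

    Each detected segment would then be handed to the recognizer proper, which is what allows the system to run without an explicit "begin gesture" trigger.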

    World Wizards: Developing a VR World Building Application

    World Wizards is an open-source and extendable world-building environment that enables non-technical users to create 3D worlds in virtual reality; it can be used for research, education, and product development purposes. It was developed for the HTC Vive using the Unity game engine. World Wizards embraces user-generated content, allowing users to build their own environments within VR and providing utilities to aid users in creating, distributing, and importing their own custom assets.

    American Sign Language Gesture Recognition using Motion Tracking Gloves in VR

    Gesture recognition has become a topic of great interest as it continues to advance the capabilities of human-computer interaction. Research has shown that related technologies have the potential to facilitate highly accessible user interfaces, enabling users with various limitations to use different applications in a more intuitive way. This thesis presents a new contribution to this research by introducing a novel approach to performing gesture recognition on American Sign Language (ASL) hand gestures through virtual reality (VR) using motion tracking gloves. As a proof of concept, an application was developed using this approach which is capable of recognizing 34 ASL hand gestures performed by a user as they navigate a tutorial-based environment. This application was evaluated through a user study to determine the effectiveness of the approach and any possible improvements that could be made. The hope is that the approach presented in this thesis could be expanded into a number of different applications aimed at propagating the use of ASL and improving the lives of those who use it regularly.
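    The abstract does not state how the 34 gestures are matched, but a common baseline for glove-based static gesture recognition is nearest-neighbor matching on per-finger flexion values. The sketch below is a hypothetical illustration of that idea; the flexion templates for "A" and "B" are invented for the example and are not the thesis's actual data.

```python
def classify_gesture(sample, templates):
    """Return the label of the template whose finger-flexion vector is
    closest (Euclidean distance) to the sampled glove reading.

    `sample` and each template are 5-tuples of per-finger flexion in
    [0, 1] (0 = finger straight, 1 = fully curled)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(templates, key=lambda label: dist(sample, templates[label]))

# Hypothetical flexion templates for two ASL letters.
templates = {
    "A": (0.1, 1.0, 1.0, 1.0, 1.0),   # thumb out, fingers curled
    "B": (0.9, 0.0, 0.0, 0.0, 0.0),   # thumb tucked, fingers straight
}
reading = (0.2, 0.9, 0.95, 1.0, 0.9)  # a slightly noisy "A"
print(classify_gesture(reading, templates))  # → A
```

    A real system would add hand orientation and motion features (many ASL signs are dynamic) and likely a learned classifier, but the template-matching core captures why motion tracking gloves are a natural fit for this task.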

    A multiple optical tracking based approach for enhancing hand-based interaction in virtual reality simulations

    A thesis submitted in partial fulfilment of the requirements of the University of Wolverhampton for the degree of Doctor of Philosophy. Research exploring natural virtual reality interaction has seen significant success in optical tracker-based approaches, enabling users to interact freely using their hands. Optical trackers can provide users with real-time, high-fidelity virtual hand representations for natural interaction and an immersive experience. However, work in this area has identified four issues: occlusion, field of view, stability and accuracy. To overcome these four key issues, researchers have investigated approaches such as using multiple sensors. Research has shown multi-sensor-based approaches to be effective in improving recognition accuracy. However, such approaches typically use statically positioned sensors, which introduce body occlusion issues that make tracking hands challenging. Machine learning approaches have also been explored to improve gesture recognition. However, such approaches typically require a pre-set gesture vocabulary that limits user actions, with larger vocabularies hindering real-time performance. This thesis presents a multiple optical tracking (MOT) hand-based interaction system that comprises two Leap Motion sensors mounted onto a VR headset at different orientations. Novel approaches to the aggregation and validation of sensor data are presented. A machine learning sub-system is developed to validate hand data received by the sensors. Occlusion detection, stability detection, inferred hands and a hand interpolation sub-system are also developed to ensure that valid hand representations are always shown to the user. In addition, a mesh conformation sub-system ensures 3D objects are appropriately held in a user's virtual hand. The presented system addresses the four key issues of optical tracking to provide a smooth and consistent user experience.
The MOT system is evaluated against traditional interaction approaches: gloves, motion controllers and a single front-facing sensor configuration. The comparative sensor evaluation analysed the validity and availability of tracking data, along with each sensor's effect on the MOT system. The results show the MOT system provides a more stable experience than the front-facing configuration and produces significantly more valid tracking data. The results also demonstrated the effectiveness of a 45-degree sensor configuration in comparison to a front-facing one. Furthermore, the results demonstrated the effectiveness of the MOT system's solutions at handling the four key issues with optical trackers.
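    The thesis's aggregation and validation sub-systems are not reproduced in the abstract, but the essence of fusing two head-mounted sensors is to combine per-sensor hand estimates weighted by confidence, falling back when a sensor loses the hand to occlusion. The following is a minimal sketch of that idea under those assumptions; the sensor readings and confidence values are invented for illustration.

```python
def fuse_hand_position(readings):
    """Combine per-sensor hand-position estimates into one position.

    `readings` is a list of (position_xyz, confidence) pairs, one per
    sensor; a sensor that has lost the hand (e.g. to occlusion)
    reports a position of None. Valid readings are averaged weighted
    by confidence; if none are valid, None is returned so the caller
    can fall back to interpolation from previous frames."""
    valid = [(p, c) for p, c in readings if p is not None and c > 0]
    if not valid:
        return None
    total = sum(c for _, c in valid)
    return tuple(
        sum(p[axis] * c for p, c in valid) / total
        for axis in range(3)
    )

front = ((0.10, 1.20, 0.50), 0.9)   # front-facing sensor, confident
angled = ((0.14, 1.24, 0.50), 0.3)  # 45-degree sensor, partly occluded
fused = fuse_hand_position([front, angled])
```

    A full system, as the abstract describes, would additionally validate each reading with a learned model before fusion and interpolate hands across frames in which both sensors fail.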

    Establishing a Framework for the development of Multimodal Virtual Reality Interfaces with Applicability in Education and Clinical Practice

    The development of Virtual Reality (VR) and Augmented Reality (AR) content with multiple sources of both input and output has led to countless contributions in a great many fields, among them medicine and education. Nevertheless, the actual process of integrating the existing VR/AR media and subsequently setting it to purpose remains a highly scattered and esoteric undertaking. Moreover, seldom do the architectures that derive from such ventures comprise haptic feedback in their implementation, which in turn deprives users of one of the paramount aspects of human interaction, their sense of touch. Determined to circumvent these issues, the present dissertation proposes a centralized albeit modularized framework that enables the conception of multimodal VR/AR applications in a novel and straightforward manner. To accomplish this, the framework makes use of a stereoscopic VR Head Mounted Display (HMD) from Oculus Rift©, a hand tracking controller from Leap Motion©, a custom-made VR mount that allows for the assemblage of the two preceding peripherals, and a wearable device of our own design. The latter is a glove that encompasses two core modules: one that is able to convey haptic feedback to its wearer, and another that deals with the non-intrusive acquisition, processing and registering of the wearer's Electrocardiogram (ECG), Electromyogram (EMG) and Electrodermal Activity (EDA). The software elements of the aforementioned features were all interfaced through Unity3D©, a powerful game engine whose popularity in academic and scientific endeavors is ever increasing. Upon completion of our system, it was time to substantiate our initial claim with thoroughly developed experiences that would attest to its worth.
With this premise in mind, we devised a comprehensive repository of interfaces, amid which three merit special consideration: Brain Connectivity Leap (BCL), Ode to Passive Haptic Learning (PHL) and a Surgical Simulator.

    Trade-Off between Task Accuracy, Task Completion Time and Naturalness for Direct Object Manipulation in Virtual Reality

    Virtual reality devices are used for several application domains, such as medicine, entertainment, marketing and training. A handheld controller is the common interaction method for direct object manipulation in virtual reality environments. Using hands would be a straightforward way to directly manipulate objects in the virtual environment if hand-tracking technology were reliable enough. In recent comparison studies, hand-based systems compared unfavorably against handheld controllers in task completion times and accuracy. In our controlled study, we compare these two interaction techniques with a new hybrid interaction technique which combines controller tracking with hand gestures for a rigid object manipulation task. The results demonstrate that the hybrid interaction technique is the most preferred because it is intuitive, easy to use, fast and reliable, and it provides haptic feedback resembling a real-world object grab. This suggests that there is a trade-off between naturalness, task accuracy and task completion time when using these direct manipulation interaction techniques, and participants prefer interaction techniques that provide a balance between these three factors.
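    The abstract does not detail how controller tracking and hand gestures are combined, but one plausible reading of such a hybrid technique is: take the object's position from the reliably tracked controller, and trigger the grab itself from a hand gesture such as a pinch. The sketch below illustrates that split with hypothetical threshold values and a hysteresis band so small fluctuations in the gesture estimate do not drop the object; it is not the paper's actual implementation.

```python
def grab_state(controller_tracked, pinch_strength, holding,
               grab_thresh=0.7, release_thresh=0.3):
    """Decide whether the virtual hand is holding an object.

    Position comes from the controller tracker; the grab is triggered
    by a pinch gesture estimate in [0, 1]. Separate grab and release
    thresholds (hysteresis) keep the hold stable between them."""
    if not controller_tracked:
        return holding            # keep last state while tracking is lost
    if not holding and pinch_strength >= grab_thresh:
        return True               # firm pinch picks the object up
    if holding and pinch_strength <= release_thresh:
        return False              # clearly open hand releases it
    return holding                # in-between values change nothing

# A firm pinch grabs; a mid-range value keeps holding; an open hand releases.
holding = grab_state(True, 0.8, False)
holding = grab_state(True, 0.5, holding)
holding = grab_state(True, 0.2, holding)
```

    Per frame, the engine would call this with the current tracking status and gesture estimate, which is one way a technique can stay fast and accurate (controller tracking) while still feeling natural (hand gesture).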