
    Evaluating rules of interaction for object manipulation in cluttered virtual environments

    A set of rules is presented for the design of interfaces that allow virtual objects to be manipulated in 3D virtual environments (VEs). The rules differ from other interaction techniques because they focus on the problems of manipulating objects in cluttered spaces rather than open spaces. Two experiments are described that evaluated the effect of different interaction rules on participants' performance in a task known as "the piano mover's problem." The task required participants to move a virtual human through parts of a virtual building while simultaneously manipulating a large virtual object held in the virtual human's hands, resembling the simulation of manual materials handling in a VE for ergonomic design. Throughout, participants viewed the VE on a large monitor from an "over-the-shoulder" perspective. In the most cluttered VEs, the time participants took to complete the task varied by up to 76% across different combinations of rules, indicating the need for flexible forms of interaction in such environments.
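    The abstract does not reproduce the rules themselves, so the sketch below only illustrates, under assumed names and data structures, the general shape of one clutter-oriented interaction rule: a candidate move of the held object is accepted only if it stays collision-free. The AABB class, the obstacle list, and try_move are hypothetical stand-ins for whatever collision machinery a real VE engine would provide.

from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box with (x, y, z) min and max corners."""
    lo: tuple
    hi: tuple

    def translated(self, delta):
        return AABB(tuple(a + d for a, d in zip(self.lo, delta)),
                    tuple(a + d for a, d in zip(self.hi, delta)))

    def intersects(self, other):
        return all(self.lo[i] <= other.hi[i] and other.lo[i] <= self.hi[i]
                   for i in range(3))

def try_move(held, delta, obstacles):
    """Illustrative rule: block any move that would push the held object into clutter."""
    candidate = held.translated(delta)
    if any(candidate.intersects(ob) for ob in obstacles):
        return held       # move rejected, object stays where it is
    return candidate      # move accepted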

    Visuohaptic Simulation of a Borescope for Aircraft Engine Inspection

    A video borescope, consisting of a long fiber-optic probe containing a small CCD camera controlled by a hand-held articulation interface, is used for remote visual inspection of hard-to-reach components in an aircraft. The knowledge and psychomotor skills, specifically the hand-eye coordination, required for effective inspection are hard to acquire through the limited exposure to the borescope offered in aviation maintenance schools. Inexperienced aircraft maintenance technicians gain proficiency through repeated hands-on learning in the workplace, along a steep learning curve, while transitioning from the classroom to the workforce. Using an iterative process combined with focused user evaluations, this dissertation details the design, implementation and evaluation of a novel visuohaptic simulator for training novice aircraft maintenance technicians in the task of engine inspection using a borescope. First, we describe the development of the visual components of the simulator, along with the acquisition and modeling of a representative model of a PT-6 aircraft engine. Subjective assessments with both expert and novice aircraft maintenance engineers evaluated the visual realism and the control interfaces of the simulator. In addition to visual feedback, probe contact feedback is provided through a specially designed custom haptic interface that simulates tip contact forces as the virtual probe intersects the 3D model surfaces of the engine. Compared to other haptic interfaces, the custom design is unique in that it is inexpensive and uses a real borescope probe to simulate camera insertion and withdrawal. User evaluation of this simulator with probe tip feedback suggested a trend of improved performance with haptic feedback. Next, we describe the development of a physically based camera model for improved behavioral realism of the simulator. Unlike a point-based camera, the enhanced camera model simulates the interaction of the borescope probe, including multiple points of contact along the length of the probe. We present visual comparisons of a real probe's motion with the simulated probe model and develop a simple algorithm for computing the resultant contact forces. User evaluation comparing our custom haptic device with two commonly available haptic devices, the Phantom Omni and the Novint Falcon, suggests that the improved camera model, as well as probe contact feedback with the 3D engine model, plays a significant role in the overall engine inspection process. Finally, we present results from a skill transfer study comparing classroom-only instruction with both simulator and hands-on training. Students trained using the simulator and the video borescope completed engine inspection with the real video borescope significantly faster than students who received classroom-only training. The speed improvements can be attributed to reduced borescope probe maneuvering time within the engine and improved psychomotor skills due to training. Given the usual constraints of limited time and resources, simulator training may provide beneficial skills needed by novice aircraft maintenance technicians to augment classroom instruction, resulting in a faster transition into the aviation maintenance workforce.
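    The dissertation mentions a simple algorithm for computing the resultant contact forces from multiple contact points along the probe, but the abstract does not spell it out. The sketch below assumes a common penalty-based formulation (a spring force proportional to penetration depth at each contact); the stiffness value and the contact representation are illustrative assumptions, not the author's implementation.

import numpy as np

def resultant_contact_force(contacts, stiffness=200.0):
    """Sum per-contact penalty forces into one reaction force on the probe.

    contacts: iterable of (penetration_depth_m, surface_normal) pairs, where
              surface_normal is a unit 3-vector pointing out of the engine surface.
    """
    force = np.zeros(3)
    for depth, normal in contacts:
        if depth > 0.0:                       # only penetrating points push back
            force += stiffness * depth * np.asarray(normal, dtype=float)
    return force

# Example: two contact points along the probe, both pushing it back along +z.
f = resultant_contact_force([(0.002, (0.0, 0.0, 1.0)), (0.001, (0.0, 0.0, 1.0))])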

    Virtual Reality Games for Motor Rehabilitation

    This paper presents a fuzzy logic based method to track user satisfaction without the need for devices that monitor users' physiological conditions. User satisfaction is the key to any product's acceptance; computer applications and video games offer a unique opportunity to tailor the environment to each user to better suit their needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in Unreal Tournament 2004. In this paper we describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature that suggests physiological measurements are needed. We show that it is possible to use a software-only method to estimate user emotion.
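    The abstract does not detail the fuzzy model itself, so the following sketch only illustrates the general mechanism behind a fuzzy, software-only emotion estimate: crisp in-game signals are mapped to membership degrees and combined by simple rules into a score. The "damage rate" input, the rule weights, and the frustration output are hypothetical; the authors' FLAME-based model is considerably richer.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def frustration(damage_rate):
    """Estimate a frustration degree in [0, 1] from a normalised damage-rate signal."""
    low  = tri(damage_rate, -0.5, 0.0, 0.5)   # membership of "taking little damage"
    high = tri(damage_rate,  0.5, 1.0, 1.5)   # membership of "taking heavy damage"
    # Two rules: little damage -> mostly calm (0.1), heavy damage -> frustrated (0.9),
    # defuzzified as a weighted average of the rule outputs.
    num, den = 0.1 * low + 0.9 * high, low + high
    return num / den if den else 0.0

print(frustration(0.8))   # a heavily-hit player scores close to 0.9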

    Multimodality with Eye tracking and Haptics: A New Horizon for Serious Games?

    The goal of this review is to illustrate the emerging use of multimodal virtual reality that can benefit learning-based games. The review begins with an introduction to multimodal virtual reality in serious games, and we provide a brief discussion of why the cognitive processes involved in learning and training are enhanced in immersive virtual environments. We first outline studies that have used eye tracking and haptic feedback independently in serious games, and then review some innovative applications that have combined eye tracking and haptic devices to provide applicable multimodal frameworks for learning-based games. Finally, some general conclusions are identified and clarified in order to advance current understanding of multimodal serious game production and to explore possible areas for new applications.

    Novel haptic interface For viewing 3D images

    In recent years there has been an explosion of devices and systems capable of displaying stereoscopic 3D images. While these systems provide an improved experience over traditional two-dimensional displays, they often fall short on user immersion: usually they improve only depth perception, by relying on the stereopsis phenomenon. We propose a system that improves user experience and immersion by providing a position-dependent rendering of the scene and the ability to touch it. The system uses depth maps to represent the geometry of the scene. Depth maps can be obtained easily during the rendering process or derived from binocular stereo images by calculating their horizontal disparity. This geometry is then used as input to render the scene on a 3D display, to perform the haptic rendering calculations, and to produce the position-dependent view of the scene. The author presents two main contributions. First, since haptic devices have a finite workspace and limited resolution, we use what we call detail mapping algorithms. These algorithms compress the geometry information contained in a depth map, by reducing the contrast among pixels, so that it can be rendered on a limited-resolution display medium without losing any detail. Second, the unique combination of a depth camera as a motion-capture system, a 3D display, and a haptic device to enhance the user experience. While developing this system we paid special attention to the cost and availability of the hardware. We decided to use only off-the-shelf, mass-consumer hardware so that our experiments can be easily implemented and replicated. As an additional benefit, the total cost of the hardware did not exceed one thousand dollars, making it affordable for many individuals and institutions.
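    The abstract does not give the detail mapping algorithms themselves, so the sketch below only illustrates the simplest interpretation: linearly compressing a depth map's range so its relief fits a haptic device's limited workspace, together with the standard pinhole relation for recovering depth from horizontal disparity. The function names and the global linear mapping are assumptions for illustration, not the author's actual algorithms.

import numpy as np

def compress_depth(depth_map, workspace_depth):
    """Linearly rescale a depth map (metres) into [0, workspace_depth]."""
    d = np.asarray(depth_map, dtype=float)
    d_min, d_max = d.min(), d.max()
    if d_max == d_min:
        return np.zeros_like(d)
    return (d - d_min) / (d_max - d_min) * workspace_depth

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Standard stereo relation Z = f * B / d for horizontal disparity in pixels."""
    d = np.maximum(np.asarray(disparity_px, dtype=float), 1e-6)  # avoid divide-by-zero
    return focal_px * baseline_m / d

# Example: fit a synthetic scene spanning 1-3 m into a 10 cm haptic workspace.
scene = np.random.uniform(1.0, 3.0, size=(480, 640))
haptic_depth = compress_depth(scene, workspace_depth=0.10)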

    A Scoping Review on Virtual Reality-Based Industrial Training

    The fourth industrial revolution has forced most companies to evolve technologically, adopting new digital tools so that their workers can acquire the skills needed to face changing work environments. This article presents a scoping review of the literature on virtual reality-based training systems. The methodology consisted of four steps: posing research questions, document search, paper selection, and data extraction. From a total of 350 peer-reviewed articles drawn from databases such as SpringerLink, IEEE Xplore, MDPI, Scopus, and ACM, 44 were eventually chosen, most of them using virtual reality headsets and controllers from Oculus Rift and HTC VIVE. It was concluded that the advantages of using this digital tool in industry include commitment, speed, measurability, preservation of workers' physical integrity, customization, and cost reduction. Even though several research gaps were found, virtual reality is presented as a present and future alternative for the efficient training of human resources in the industrial field.
    This work was supported by Instituto Superior Tecnológico Victoria Vásconez Cuvi. The authors appreciate the opportunity to analyze topics related to this paper. The authors also acknowledge the support provided by Universidad Técnica de Ambato (UTA) and its Research and Development Department (DIDE) under project CONIN-P-256-2019, and by SENESCYT through the grants “Convocatoria Abierta 2011” and “Convocatoria Abierta 2013”.