9 research outputs found

    It's Tool Time: Exploring Tool Design Alternatives for Virtual Reality Trainings

    Virtual reality (VR) technologies have seen steadily increasing attention and use in organizations across various industries in recent years. A useful application scenario is VR training, which enables employees to immersively and interactively familiarize themselves with or practice work processes in a safe space, without the risk of physical harm or financial consequences for the organization. This research explores how tool representation alternatives in virtual reality training scenarios (VRTS) affect user experience and content transfer. In a two-stage research approach, a total of 20 participants were randomly assigned to one of two VRTS with different tool representation types and interviewed subsequently. The findings indicate that decisions regarding tool representation in VRTS should be based on tool-independent factors (e.g., the feeling of tool operation) and tool-dependent factors (e.g., tool complexity).

    Virtual reality for assembly methods prototyping: a review

    Assembly planning and evaluation is an important component of the product design process, in which details about how the parts of a new product will be put together are formalized. A well-designed assembly process should take into account factors such as optimum assembly time and sequence, tooling and fixture requirements, ergonomics, operator safety, and accessibility, among others. Existing computer-based tools to support virtual assembly either concentrate solely on representing the geometry of parts and fixtures and evaluating clearances and tolerances, or use simulated human mannequins to approximate human interaction in the assembly process. Virtual reality technology has the potential to support the integration of natural human motions into the computer-aided assembly planning environment (Ritchie et al. in Proc I MECH E Part B J Eng 213(5):461–474, 1999). This would allow evaluation of an assembler's ability to manipulate and assemble parts, and result in reduced time and cost for product design. This paper provides a review of the research in virtual assembly and categorizes the different approaches. Finally, critical requirements and directions for future research are presented.

    Combining physical constraints with geometric constraint-based modeling for virtual assembly

    The research presented in this dissertation aims to create a virtual assembly environment capable of simulating the constant and subtle interactions (hand-part, part-part) that occur during manual assembly, and of providing appropriate feedback to the user in real time. A virtual assembly system called SHARP (System for Haptic Assembly and Realistic Prototyping) is created, which utilizes simulated physical constraints for part placement during assembly.

    The first approach taken in this research utilized Voxmap Point Shell (VPS) software for implementing collision detection and physics-based modeling in SHARP. A volumetric approach, in which complex CAD models were represented by numerous small cubic voxel elements, was used to obtain fast physics update rates (500-1000 Hz). A novel dual-handed haptic interface was developed and integrated into the system, allowing the user to manipulate parts with both hands simultaneously. However, the coarse model approximations used for collision detection and physics-based modeling only allowed assembly when the minimum clearance was ~8-10% or more.

    To address the low-clearance assembly problem, the second effort focused on importing accurate parametric CAD (B-Rep) models into SHARP. These accurate B-Rep representations are used for collision detection as well as for simulating physical contacts more accurately. A new hybrid approach is presented which combines the simulated physical constraints with geometric constraints that can be defined at runtime. Different case studies are used to identify the combination of methods (collision detection, physical constraints, geometric constraints) best able to simulate intricate interactions and environment behavior during manual assembly. An innovative automatic constraint recognition algorithm is created and integrated into SHARP. The feature-based approach taken in the algorithm design facilitates faster identification of the potential geometric constraints that need to be defined. This approach results in optimized system performance while providing a more natural user experience for assembly.
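    As background for the voxel-based approach described above: a voxmap overlap test reduces collision detection to set intersection on occupied cells. The sketch below is an illustrative toy in Python, not the SHARP/VPS implementation (the function names and point-sampled parts are invented for illustration):

```python
# Toy voxel-overlap collision check (illustrative only, not SHARP/VPS).
# Each part is approximated by the set of voxel cells its sample points
# occupy; two parts collide if the sets intersect. Real voxmap/point-shell
# systems additionally compute penetration depths to derive contact forces.

def voxelize(points, voxel_size):
    """Map 3D sample points to the integer voxel cells they occupy."""
    return {tuple(int(c // voxel_size) for c in p) for p in points}

def colliding(part_a, part_b, voxel_size=1.0):
    """True if any voxel cell is occupied by both parts."""
    return not voxelize(part_a, voxel_size).isdisjoint(voxelize(part_b, voxel_size))

# A 'peg' sampled along the z-axis, and a probe point inside its volume.
peg = [(0.0, 0.0, float(z)) for z in range(10)]
probe = [(0.4, 0.0, 5.2)]
print(colliding(peg, probe))   # -> True (both occupy voxel (0, 0, 5))
```

Note the coarse-approximation limitation mentioned in the abstract: with large voxels, near-touching parts register as colliding, which is why low-clearance assembly motivated the more accurate B-Rep contact models.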

    Multimodal feedback cues on manual lifting in virtual environments

    Improper manipulation of real-world objects increases the risk of developing work-related back injuries. In an effort to reduce this risk and encourage appropriate lifting and moving methods, a Virtual Environment (VE) was employed; virtual simulations can be used for ergonomic analysis. In this work, the VEs made use of multiple feedback techniques to allow a person to estimate the forces acting on their lower back. A person's head and hand movements were tracked in real time while they manipulated an object. The NIOSH lifting equation was used to calculate the Lifting Index, and the results were conveyed in real time. Visual display feedback techniques were designed, and the effect of the cues on user performance was experimentally evaluated. The feedback cues provide the user with information about the forces acting on their lower back as they perform manual lifting tasks in VEs. Four methods were compared and contrasted: No Feedback, Text, Colour, and Combined Colour and Text. This work also investigated various auditory feedback techniques to support object manipulation in VEs. Auditory feedback has been demonstrated to convey information effectively in computer applications, but little work has been reported on the efficacy of such techniques, particularly for ergonomic design. Four methods were compared and contrasted: No Feedback, White-noise, Pitch, and Tempo. A combined Audio-Visual (AV) technique, mixing both senses, was also examined, as was the effect of Tactile Augmentation: three different real weights were used, and the results were compared with an experiment using virtual weights in order to evaluate whether the presence of a real weighted object enhanced people's sense of realism. The goals of this study were to explore feedback techniques for various senses (visual, auditory, and tactile), compare the performance characteristics of each technique, and understand their relative advantages and drawbacks.
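    For reference, the revised NIOSH lifting equation mentioned above computes a Recommended Weight Limit (RWL) from task geometry and reports the Lifting Index as LI = load / RWL. A minimal metric-unit sketch in Python follows (not the thesis code; the frequency and coupling multipliers are assumed to be 1 here):

```python
def niosh_lifting_index(load_kg, h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """Lifting Index LI = load / RWL, with RWL = LC*HM*VM*DM*AM*FM*CM.

    Metric formulation of the revised NIOSH lifting equation; fm (frequency)
    and cm (coupling) multipliers default to 1 for simplicity.
    """
    LC = 23.0                              # load constant, kg
    HM = 25.0 / h_cm                       # horizontal multiplier (h >= 25 cm)
    VM = 1.0 - 0.003 * abs(v_cm - 75.0)    # vertical multiplier
    DM = 0.82 + 4.5 / d_cm                 # distance multiplier (d >= 25 cm)
    AM = 1.0 - 0.0032 * a_deg              # asymmetry multiplier
    rwl = LC * HM * VM * DM * AM * fm * cm
    return load_kg / rwl

# A 10 kg lift in the ideal posture: LI well below 1 (low risk).
print(round(niosh_lifting_index(load_kg=10, h_cm=25, v_cm=75, d_cm=25, a_deg=0), 2))  # -> 0.43
```

An LI above 1 indicates increasing risk of back injury, which is the threshold the real-time feedback cues described above convey to the user.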

    The use of visual and auditory feedback for assembly task performance in a virtual environment

    This paper presents the creation and evaluation of a multi-modal interface for a virtual assembly environment (VAE). It involves implementing an assembly simulation environment with multisensory (visual and auditory) feedback, and evaluating the effects of multi-modal feedback on assembly task performance. This virtual environment experimental platform brought together complex technologies such as constraint-based assembly simulation, optical motion tracking, and real-time 3D sound generation around a virtual reality (VR) workbench and a common software platform. Peg-in-a-hole and Sener electronic box assembly tasks were used as the task cases for human factors experiments with sixteen subjects. Both objective performance data (task completion times and human performance error rates) and subjective opinions (questionnaires) were gathered from these experiments. The results showed that the addition of 3D auditory or visual feedback improved virtual assembly task performance, and that the integrated feedback (visual plus auditory) offered better task performance than either feedback used in isolation. Most users preferred the combined feedback to any individual feedback (visual or auditory) or no feedback.

    Exploring the Influence of Haptic Force Feedback on 3D Selection

    This thesis studies the effects of haptic force feedback on 3D interaction performance. To date, Human-Computer Interaction (HCI) in three dimensions is not well understood. Within platforms such as Immersive Virtual Environments (IVEs), implementing 'good' methods of interaction is difficult. As reflected by the lack of 3D IVE applications in common use, typical performance constraints include inaccurate tracking and a lack of additional sensory inputs, in addition to general design issues related to the implemented interaction technique and the connected input devices. In total, this represents a broad set of multi-disciplinary challenges. By implementing techniques that address these problems, we intend to use IVE platforms to study human 3D interaction and the effects of different types of feedback. A promising area of work is the development of haptic force feedback devices. Also called haptic interfaces, these devices can exert a desired force onto the user, simulating a physical interaction. Described as a sensory cue, this information is thought to be important for the selection and manipulation of 3D objects. Many studies have investigated how best to integrate haptic devices within IVEs. While there are still fundamental integration and device-level problems to solve, previous work demonstrates that haptic force feedback can improve 3D interaction performance. Investigating this claim further, this thesis explores the role of haptic force feedback in 3D interaction performance in more detail. In particular, we found additional complexities whereby different types of haptic force feedback conditions can either help or hinder user performance. By discussing these new results, we begin to examine the utility of haptic force feedback.

    Focusing our user studies on 3D selection, we explored the influence of haptic force feedback on the strategies taken to target virtual objects when using either 'distal' or 'natural' interaction technique designs. We first outlined novel methods for integrating and calibrating large-scale haptic devices within a CAVE-like IVE. Secondly, we described our implementation of distal and natural selection techniques tailored to the available hardware, including the collision detection mechanisms used to render different haptic responses. Thirdly, we discussed the evaluation framework used to assess different interaction techniques and haptic force feedback responses within a common IVE setup. Finally, we provide a detailed assessment of user performance highlighting the effects of haptic force feedback on 3D selection, which is the main contribution of this work. We expect the presented findings to add to the existing literature evaluating novel 3D interaction technique designs for IVEs. We also hope that this thesis will provide a basis for developing future interaction models that include the effects of haptic force feedback.
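    For context on how such forces are typically generated: a common penalty-based haptic rendering rule applies a spring-damper restoring force once the device proxy penetrates a virtual surface. The sketch below is a generic one-dimensional illustration, not the implementation from this thesis (the stiffness and damping gains are arbitrary example values):

```python
# Generic 1-D penalty-based haptic contact force (illustrative sketch).
# F = k * penetration - b * velocity while in contact, else zero.
def contact_force(penetration_m, velocity_mps, k=800.0, b=5.0):
    """Normal force (N) for a stylus pressing into a virtual surface.

    k: surface stiffness (N/m); b: damping (N*s/m) - arbitrary example gains.
    """
    if penetration_m <= 0.0:       # proxy outside the surface: no force
        return 0.0
    return k * penetration_m - b * velocity_mps

print(round(contact_force(0.002, 0.0), 3))   # 2 mm static penetration -> 1.6
```

Gains must be tuned to the device's update rate; excessively stiff settings make the rendered surface feel unstable, which is one plausible source of the help-or-hinder effects noted above.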