AED: a novel visual representation based on AR and empathy computing in manual assembly

Abstract

Incorporating personal perception into manufacturing engineering can be very challenging, especially when engineering-based methods are used to convey the designer's ideas to artisans. Two-dimensional engineering definitions can be extremely time-consuming for individuals who lack creativity or imagination, and model-based definition cannot bridge the separation between virtual space and the real world, so interaction between people remains subject to spatial perception errors. The emergence of Augmented Reality (AR), which allows individuals to perceive the designer's intentions and strategies through visual cues attached to real objects, fills this gap. In this paper, the augmented engineering definition (AED) is proposed to enhance information exchange between people in a succinct, accurate, and acceptable visual form. Motivated by visual representation in remote collaboration, the AED design is based on a specific empathy scenario, which leads to the establishment of a mapping relationship between visual cues and augmented information. A study was conducted in which participants were paired up for parts inspection and communicated through one of four modes: 2D visualization data only, 3D projection data, 3D visualization data, or AED-based communication. The experimental results showed that participants using AED exhibited higher situational appeal and better information understanding than with the three other interaction modes. We also discuss the feasibility of using AED in a collaborative manufacturing environment and its impact on AED users.
