233 research outputs found
An Evaluation of Virtual Lenses for Object Selection in Augmented Reality
This paper reports the results of an experiment comparing three different selection techniques in a tabletop tangible augmented reality interface. Object selection is an important task in all direct manipulation interfaces because it precedes most other manipulation and navigation actions. Previous work on tangible virtual lenses for visualisation has prompted the exploration of how selection techniques can be incorporated into these tools. In this paper a selection technique based on virtual lenses is compared with the traditional approaches of virtual hand and virtual pointer methods. The Lens technique was found to be faster, to require less physical effort to use, and to be preferred by participants over the other techniques. These results can be useful in guiding the development of future augmented reality interfaces.
An Augmented Reality Human-Robot Collaboration System
This article discusses an experimental comparison of three user interface techniques for interaction with a remotely located robot. A typical interface for such a situation is to teleoperate the robot using a camera that displays the robot's view of its work environment. However, the operator often has a difficult time maintaining situation awareness due to this single egocentric view. Hence, a multimodal system was developed enabling the human operator to view the robot in its remote work environment through an augmented reality interface, the augmented reality human-robot collaboration (AR-HRC) system. The operator uses spoken dialogue, reaches into the 3D representation of the remote work environment, and discusses the robot's intended actions with it. The comparison found the AR-HRC interface to be the most effective, increasing accuracy by 30% while reducing the number of close calls in operating the robot by a factor of roughly three. It thus provides the means to maintain spatial awareness and gives users the feeling of working in a true collaborative environment.
Multi-scale gestural interaction for augmented reality
We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive; however, they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures from both arm and hand motions (macro-scale) and finger gestures (micro-scale). Micro-gestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with micro-gestures for precise interaction, beyond the capabilities of direct manipulation alone.
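To illustrate the general idea behind such multi-scale input (this is a minimal sketch of our own, not the authors' implementation), coarse arm/hand tracking and precise finger micro-gestures can both drive the same manipulation target, with the micro-scale input heavily damped. The sensor deltas and gain values below are hypothetical placeholders.

# Minimal sketch: combining macro-scale (arm/hand) and micro-scale (finger)
# input into one manipulation target. Gains and inputs are illustrative only.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float

class MultiScaleController:
    """Combine coarse arm/hand motion with damped finger micro-gesture offsets."""

    def __init__(self, macro_gain: float = 1.0, micro_gain: float = 0.05):
        self.macro_gain = macro_gain   # 1:1 mapping for coarse placement
        self.micro_gain = micro_gain   # scaled-down mapping for precise nudges

    def update(self, target: Pose, macro_delta: Pose, micro_delta: Pose) -> Pose:
        # Arm/hand motion moves the object directly; finger micro-gestures apply
        # a heavily damped offset so small inputs give precise adjustments.
        return Pose(
            target.x + self.macro_gain * macro_delta.x + self.micro_gain * micro_delta.x,
            target.y + self.macro_gain * macro_delta.y + self.micro_gain * micro_delta.y,
            target.z + self.macro_gain * macro_delta.z + self.micro_gain * micro_delta.z,
        )

# Example: a 0.2 m arm movement plus a small finger nudge along the same axis.
ctrl = MultiScaleController()
print(ctrl.update(Pose(0, 0, 0), Pose(0.2, 0, 0), Pose(1.0, 0, 0)))  # Pose(x=0.25, ...)

The micro_gain factor is the main design choice: scaling down finger input trades speed for precision, which is what lets relaxed-posture micro-gestures refine a placement made by a larger arm movement.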
CollabAR - Investigating the Mediating Role of Mobile AR Interfaces on Co-Located Group Collaboration
Mobile Augmented Reality (AR) technology is enabling new applications in domains including architecture, education, and medical work. Because AR interfaces project digital data, information, and models into the real world, they allow for new forms of collaborative work. However, despite the wide availability of AR applications, very little is known about how AR interfaces mediate and shape collaborative practices. This paper presents a study which examines how a mobile AR (M-AR) interface for inspecting and discovering AR models of varying complexity impacts co-located group practices. We contribute new insights into how current mobile AR interfaces impact co-located collaboration. Our results show that M-AR interfaces induce high mental load and frustration, cause a high number of context switches between devices and group discussion, and overall lead to a reduction in group interaction. We present design recommendations for future work focusing on collaborative AR interfaces.
Factors influencing the acceptance of augmented reality in education: a review of the literature
This paper presents a review of user expectations towards Augmented Reality (AR) and the acceptance of AR for technology-enhanced teaching and learning. Augmented Reality is a technology that superimposes a computer-generated image over a user’s view of the real world, thus providing a composite view. This technology has been used in many fields such as marketing, the military, and entertainment, among other sectors. Studies have found that AR technology can enhance teaching and learning; however, more research still needs to be conducted on the acceptance of AR as a learning tool and on what users in education expect from the technology. An understanding of user expectations is one of the key foundations for establishing better-designed AR systems and applications that will result in greater acceptance of this technology. To this end, this paper reviews previous research on user expectations of AR in education and its acceptance.
Assessing the Suitability and Effectiveness of Mixed Reality Interfaces for Accurate Robot Teleoperation
In this work, a Mixed Reality (MR) system is evaluated to assess whether it can be used efficiently in teleoperation tasks that require accurate control of the robot end-effector. The robot and its local environment are captured using multiple RGB-D cameras, and a remote user controls the robot arm motion through Virtual Reality (VR) controllers. The captured data is streamed over the network and reconstructed in 3D, allowing the remote user to monitor the state of execution in real time through a VR headset. We compared our method with two other interfaces: i) teleoperation in pure VR, with the robot model rendered with the real joint states, and ii) teleoperation in MR, with the rendered model of the robot superimposed on the actual point cloud data. Preliminary results indicate that the virtual robot visualization is better than the pure point cloud for accurate teleoperation of a robot arm.
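As a rough illustration of the capture-render-command pipeline this abstract describes, the following self-contained sketch mimics one teleoperation cycle; every class here (FakeCameraRig, FakeHeadset, FakeController, FakeRobot) and the update rate are placeholders invented for illustration, not the paper's system or any specific library API.

# Illustrative teleoperation loop: capture the remote scene, render it to the
# operator's headset, and map the VR controller pose to an end-effector target.
import time

class FakeCameraRig:
    def capture_point_cloud(self):
        # Stand-in for fusing multiple RGB-D views into one point cloud.
        return [(0.0, 0.0, 0.5)]

class FakeHeadset:
    def __init__(self, frames: int = 3):
        self.frames = frames
    def is_active(self) -> bool:
        self.frames -= 1
        return self.frames >= 0
    def render(self, cloud, joint_states):
        print(f"render {len(cloud)} points, joints={joint_states}")

class FakeController:
    def pose(self):
        # Stand-in for the tracked VR controller position (x, y, z).
        return (0.3, 0.1, 0.4)

class FakeRobot:
    def joint_states(self):
        return [0.0, 0.5, 1.0]
    def move_end_effector(self, pose):
        print(f"move end-effector toward {pose}")

def teleoperation_loop(robot, cameras, headset, controller, rate_hz: float = 30.0):
    period = 1.0 / rate_hz
    while headset.is_active():
        cloud = cameras.capture_point_cloud()        # capture the remote scene
        headset.render(cloud, robot.joint_states())  # show reconstruction (and robot state)
        robot.move_end_effector(controller.pose())   # map controller pose to the arm
        time.sleep(period)

if __name__ == "__main__":
    teleoperation_loop(FakeRobot(), FakeCameraRig(), FakeHeadset(), FakeController())

The point of the sketch is the ordering of the loop: the operator always acts on a freshly reconstructed view, which is what lets the compared conditions (pure VR robot model vs. model overlaid on the point cloud) differ only in what is rendered, not in how the robot is commanded.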