9,136 research outputs found
Designing ray-pointing using real hand and touch-based in handheld augmented reality for object selection
Augmented Reality (AR) has been widely explored worldwide for its potential as a technology that enhances information representation. As technology progresses, smartphones (handheld devices) now have sophisticated processors and cameras for capturing still photographs and video, as well as a variety of sensors for tracking the user's position, orientation, and motion. This paper therefore discusses a real-time finger-ray pointing technique for interaction in handheld AR and compares it with the conventional handheld technique, touch-screen interaction. The aim of this paper is to explore ray-pointing interaction in handheld AR for 3D object selection. Previous work in handheld AR, also covering Mixed Reality (MR), is recapped
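As a minimal sketch of the selection step a finger-ray technique implies, the snippet below casts a ray from the tracked fingertip and selects the nearest object whose bounding sphere the ray intersects. The function names, the sphere-based scene representation and the example data are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def pick_object(ray_origin, ray_dir, objects):
    """Select the nearest object whose bounding sphere the ray hits.

    objects: list of (name, center, radius) tuples. All names here are
    illustrative; the paper's actual scene representation is not specified.
    """
    ray_origin = np.asarray(ray_origin, float)
    ray_dir = np.asarray(ray_dir, float)
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    best, best_t = None, float("inf")
    for name, center, radius in objects:
        oc = np.asarray(center, float) - ray_origin
        t_closest = oc.dot(ray_dir)          # projection of center onto the ray
        d2 = oc.dot(oc) - t_closest ** 2     # squared ray-to-center distance
        # keep only objects in front of the origin and within their radius
        if d2 <= radius ** 2 and 0 <= t_closest < best_t:
            best, best_t = name, t_closest
    return best

# Example: ray from the tracked fingertip along +z hits the nearer cube
hit = pick_object((0, 0, 0), (0, 0, 1),
                  [("cube", (0, 0, 2), 0.3), ("sphere", (0.5, 0, 4), 0.2)])
# hit == "cube"
```

Testing against bounding spheres rather than exact meshes is a common simplification for touch- and ray-based selection, since it keeps per-frame picking cheap on mobile hardware.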
Adaptive User Perspective Rendering for Handheld Augmented Reality
Handheld Augmented Reality commonly implements some variant of magic lens
rendering, which turns only a fraction of the user's real environment into AR
while the rest of the environment remains unaffected. Since handheld AR devices
are commonly equipped with video see-through capabilities, AR magic lens
applications often suffer from spatial distortions, because the AR environment
is presented from the perspective of the camera of the mobile device. Recent
approaches counteract this distortion based on estimations of the user's head
position, rendering the scene from the user's perspective. To this end,
approaches usually apply face-tracking algorithms on the front camera of the
mobile device. However, this demands high computational resources and therefore
commonly affects the performance of the application beyond the already high
computational load of AR applications. In this paper, we present a method to
reduce the computational demands for user perspective rendering by applying
lightweight optical flow tracking and an estimation of the user's motion before
head tracking is started. We demonstrate the suitability of our approach for
computationally limited mobile devices and we compare it to device-perspective
rendering, head-tracked user-perspective rendering, and fixed point-of-view
user-perspective rendering
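The fallback idea described above can be sketched as a per-frame viewpoint decision: use the head-tracked pose when available, otherwise approximate the user's viewpoint shift from a lightweight optical-flow motion estimate, and fall back to plain device-perspective rendering when neither applies. This is a simplified illustration under assumed names and thresholds, not the paper's algorithm.

```python
import numpy as np

def choose_view_pose(device_pose, head_pose, flow_vectors, flow_thresh=2.0):
    """Pick the rendering viewpoint for a magic-lens AR view.

    A decision sketch only: thresholds, the 0.01 pixel-to-meter scale and
    the pose representation (3D position) are assumptions for illustration.
    """
    device_pose = np.asarray(device_pose, float)
    if head_pose is not None:
        return np.asarray(head_pose, float)   # head-tracked user perspective
    if flow_vectors is not None and len(flow_vectors):
        mean_flow = np.mean(np.asarray(flow_vectors, float), axis=0)
        if np.linalg.norm(mean_flow) > flow_thresh:
            # approximate the viewpoint shift opposite the observed image flow
            offset = np.array([-mean_flow[0], -mean_flow[1], 0.0]) * 0.01
            return device_pose + offset
    return device_pose                        # device-perspective fallback

# Usage: head tracking not yet running, strong rightward flow observed
pose = choose_view_pose(np.zeros(3), None, [[8.0, 0.0]])
# pose shifted left of the device pose: [-0.08, 0.0, 0.0]
```

The point of such a cascade is that the optical-flow branch is far cheaper than running face tracking every frame, which matches the paper's motivation of reducing computational load on mobile devices.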
An Inertial Device-based User Interaction with Occlusion-free Object Handling in a Handheld Augmented Reality
Augmented Reality (AR) is a technology that merges virtual objects with real environments in real time. In AR, the interaction between the end-user and the AR system has always been a frequently discussed topic. Handheld AR is a newer approach that delivers enriched 3D virtual objects when a user looks through the device’s video camera. One of the most widely adopted handheld devices today is the smartphone, equipped with powerful processors, cameras for capturing still images and video, and a range of sensors capable of tracking the location, orientation and motion of the user. These modern smartphones offer a sophisticated platform for implementing handheld AR applications. However, handheld displays often inherit interaction metaphors developed for head-mounted displays, which may depend on hardware that is inappropriate for handheld use. Therefore, this paper discusses a proposed real-time inertial device-based interaction technique for 3D object manipulation. It also explains the methods used for selection, holding, translation and rotation. The aim is to overcome the limitations of 3D object manipulation by letting a user hold the device with both hands, without needing to stretch out one hand to manipulate the 3D object. The paper also recaps previous work in the field of AR and handheld AR. Finally, it provides experimental results that offer new metaphors for manipulating 3D objects using handheld devices
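One way an inertial device-based manipulation like the one described could work is to integrate the device's gyroscope readings into an orientation update that is applied to the object only while it is being held. The snippet below is a sketch under that assumption; the class names and the holding protocol are illustrative, not the paper's published method.

```python
import numpy as np

def rotation_from_gyro(angular_velocity, dt):
    """Integrate one gyroscope sample (rad/s) into a 3x3 rotation increment.

    Uses the Rodrigues formula on the rotation vector w*dt. An illustrative
    sketch of inertial input; sensor handling details are assumptions.
    """
    w = np.asarray(angular_velocity, float) * dt
    theta = np.linalg.norm(w)
    if theta < 1e-9:
        return np.eye(3)                       # negligible motion this frame
    wx, wy, wz = w / theta                     # unit rotation axis
    K = np.array([[0, -wz, wy],
                  [wz, 0, -wx],
                  [-wy, wx, 0]])               # skew-symmetric axis matrix
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

class HeldObject:
    """A 3D object whose orientation follows device rotation while held."""
    def __init__(self):
        self.orientation = np.eye(3)
        self.held = False

    def update(self, angular_velocity, dt):
        if self.held:                          # only rotate while "holding"
            self.orientation = rotation_from_gyro(angular_velocity, dt) @ self.orientation
```

Because the device itself is the input, both hands stay on it, which is the ergonomic point the abstract makes about not stretching out one hand toward the scene.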
Pre-define rotation amplitudes object rotation in handheld augmented reality
Interaction is an important topic to discuss since it covers the interface through which the end-user communicates with the augmented reality (AR) system. In a handheld AR interface, traditional interaction techniques are not suitable for some AR applications due to the distinct attributes of handheld devices, which here means smartphones and tablets. Current interaction techniques in handheld AR are known as the touch-based technique, the mid-air gesture-based technique and the device-based technique, and they have led to wide discussion in related research areas. This paper focuses on the device-based interaction technique because previous studies have shown it to be more suitable and robust in several aspects. A novel device-based 3D object rotation technique is proposed to address the current problem of performing 3-DOF rotation of a 3D object. The goal is to produce precise and faster 3D object rotation. Therefore, the rotation amplitudes per second must be determined before full implementation. This paper discusses the implementation in depth and provides a guideline for those who work on device-based interaction
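A pre-defined rotation amplitude per second translates into a fixed angular step per frame, with a snap when the remaining angle is smaller than one step so the target is reached exactly. The sketch below illustrates that idea; the 90 deg/s amplitude is an assumed placeholder, since the paper states the actual amplitudes are to be determined empirically.

```python
def rotate_toward(current_deg, target_deg, amplitude_deg_per_s, dt):
    """Step an angle toward a target at a pre-defined amplitude per second.

    Illustrative only: the amplitude value is an assumption, not the
    paper's empirically determined setting.
    """
    step = amplitude_deg_per_s * dt
    delta = target_deg - current_deg
    if abs(delta) <= step:
        return target_deg              # snap within one step, for precision
    return current_deg + step if delta > 0 else current_deg - step

# Example: an assumed 90 deg/s amplitude at a 0.1 s frame time (9 deg/frame)
angle = 0.0
for _ in range(5):
    angle = rotate_toward(angle, 45.0, 90.0, 0.1)
# angle == 45.0 after 5 frames
```

Fixing the amplitude makes the rotation speed predictable regardless of frame rate, which supports the stated goal of precise yet fast 3D object rotation.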