Analyzing the Impact of Spatio-Temporal Sensor Resolution on Player Experience in Augmented Reality Games
Along with automating everyday tasks of human life, smartphones have become one of the most popular devices for playing video games due to their interactivity. Smartphones are embedded with various sensors, such as motion sensors or location sensors, which enable the device to adopt new interaction techniques that enhance usability. However, despite their mobility and embedded sensor capacity, smartphones are limited in processing power and display area compared to desktop computers and consoles. When it comes to evaluating Player Experience (PX), players might not have as compelling an experience because the rich graphics environments that a desktop computer can provide are absent on a smartphone. A plausible alternative in this regard is substituting the virtual game world with a real-world game board, perceived through the device camera by rendering digital artifacts over the camera view. This technology is widely known as Augmented Reality (AR).
Smartphone sensors (e.g., GPS, accelerometer, gyroscope, compass) have enhanced the capability for deploying Augmented Reality technology. AR has been applied to a large number of smartphone games, including shooters, casual games, and puzzles. Because AR play environments are viewed through the camera, rendering the digital artifacts consistently and accurately is crucial: if the digital characters need to move with respect to sensed orientation, then the accelerometer and gyroscope need to provide sufficiently accurate and precise readings to make the game playable. In particular, determining the pose of the camera in space is vital, as the appropriate angle from which to view the rendered digital characters is determined by the pose of the camera. This defines how well the players will be able to interact with the digital game characters. Depending on the Quality of Service (QoS) of these sensors, the Player Experience (PX) may vary, as the rendering of digital characters is affected by noisy sensors causing a loss of registration. Confronting such a problem while developing AR games is difficult in general, as it requires creating a wide variety of game types, narratives, and input modalities, as well as conducting user testing. Moreover, current AR game developers do not have any specific guidelines for developing AR games, and concrete guidelines outlining the tradeoffs between QoS and PX for different genres and interaction techniques are required.
My dissertation provides a complete view (a taxonomy) of the spatio-temporal sensor resolution dependency of existing AR games. Four user experiments have been conducted and one experiment is proposed to validate the taxonomy and demonstrate the differential impact of sensor noise on the gameplay of different genres of AR games across different aspects of PX. This analysis is performed in the context of a novel instrumentation technology, which allows the controlled manipulation of QoS on position and orientation sensors. The experimental outcomes demonstrated how the QoS of input sensor noise impacts the PX differently while playing AR games of different genres, and that the key elements creating this differential impact are the input modality, narrative, and game mechanics. Finally, concrete guidelines for regulating sensor QoS are derived as a complete set of instructions for developing different genres of AR games.
Ball-AR: Fostering Playful Co-Located Interaction Through Environment-centric Physical Activity with AR
We present Ball-AR, an augmented reality (AR) game where two players in the same physical space attempt to hit each other with virtual dodgeballs overlaid on the physical world. Researchers have studied AR's potential for fostering co-located interaction and physical activity; however, they have not investigated the impacts of physical activity and physical environment on user experiences and interaction. We created an AR dodgeball game centered around encouraging physical activity and harnessing the physical environment. We then evaluated the game with five dyads to analyze the impacts of these design choices on the quality of gameplay and interaction between players. We found that physical activity and the shared physical space created memorable experiences and interactions among participants, although participants desired a more augmented and immersive experience.
Campus Mysteries: Serious Walking Around
The Campus Mysteries project developed an augmented reality game platform called fAR-Play and a learning game called Campus Mysteries with the platform. This paper reports on the development of the platform, the development of the game, and an assessment of the playability of the game. We conclude that augmented reality games are a viable model for learning and that the process of development is itself a site of learning.
Mobile Augmented Reality and Language-Related Episodes
Applications of locative media (e.g., place-based mobile augmented reality [AR]) are used in various educational content areas and have been shown to provide learners with valuable opportunities for investigation-based learning, location-situated social and collaborative interaction, and embodied experience of place (Squire, 2009; Thorne & Hellermann, 2017; Zheng et al., 2018). Mobile locative media applications' value for language learning, however, remains underinvestigated. To address this lacuna, this study employed the widely used construct of language-related episodes (LREs; Swain & Lapkin, 1998) as a unit of analysis to investigate language learning through participation in a mobile AR game. Analysis of video-recorded interactions of four mixed-proficiency groups of game players (two English language learners [ELLs] and one expert speaker of English [ESE] per group) indicates that LREs in this environment were focused on lexical items relevant to the AR tasks and physical locations. Informed by sociocultural theory and conversation analysis, the microgenesis of learners' understanding and subsequent use of certain lexical items is indicated in the findings. This understanding of new lexical items was frequently facilitated by ESEs' assistance and the surrounding physical environment. A strong goal orientation by both ESEs and ELLs was visible, providing implications for task-based language teaching approaches.
An Inertial Device-based User Interaction with Occlusion-free Object Handling in a Handheld Augmented Reality
Augmented Reality (AR) is a technology used to merge virtual objects with real environments in real time. In AR, the interaction that occurs between the end user and the AR system has always been a frequently discussed topic. In addition, handheld AR is a new approach that delivers enriched 3D virtual objects when a user looks through the device's video camera. One of the most widely adopted handheld devices nowadays is the smartphone, which is equipped with a powerful processor, cameras for capturing still images and video, and a range of sensors capable of tracking the location, orientation, and motion of the user. These modern smartphones offer a sophisticated platform for implementing handheld AR applications. However, handheld displays often inherit interaction metaphors developed for head-mounted displays, which may be restricted by hardware inappropriate for handheld use. Therefore, this paper discusses a proposed real-time inertial device-based interaction technique for 3D object manipulation. It also explains the methods used for selection, holding, translation, and rotation. It aims to address a limitation of 3D object manipulation by allowing a user to hold the device with both hands, without needing to stretch out one hand to manipulate the 3D object. This paper also recaps previous work in the field of AR and handheld AR. Finally, the paper provides experimental results offering new metaphors for manipulating 3D objects using handheld devices.
- …