GeoBoids: A Mobile AR Application for Exergaming
© 2012 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

We have designed a mobile Augmented Reality (AR) game that incorporates video see-through and spatialized audio AR techniques and encourages player movement in the real world. In the game, called GeoBoids, the player is surrounded by flocks of virtual creatures that are visible and audible through a mobile AR application. The goal is for the player to run to the location of a GeoBoid swarm in the real world, capture all the creatures there, then run to the next swarm and repeat before time runs out, encouraging the player to exercise during game play. The most novel elements of the game are its use of audio input and output for interacting with the creatures. The interface design includes AR visualization, spatialized audio, touch gestures, and whistle interaction. Feedback from a preliminary user study was mostly positive on overall game play and UI design, while the results also showed that the whistle interaction and the visual design of the GeoBoids needed improvement.
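The abstract does not give the game's audio pipeline, but spatialized audio of the kind it describes is commonly built from two cues: stereo panning from the bearing to the sound source and gain from its distance. A minimal sketch, with all names and the equal-power panning law chosen here as illustrative assumptions rather than taken from the paper:

```python
import math

def spatialize(player_pos, player_heading, source_pos, ref_dist=1.0):
    """Toy stereo spatialization for a virtual creature.

    Pan is derived from the bearing between the player's heading and the
    source; gain falls off with inverse distance. This is a generic
    sketch, not the GeoBoids implementation.
    """
    dx = source_pos[0] - player_pos[0]
    dy = source_pos[1] - player_pos[1]
    dist = max(math.hypot(dx, dy), ref_dist)          # clamp to avoid blow-up
    bearing = math.atan2(dy, dx) - player_heading     # 0 = straight ahead
    pan = math.sin(bearing)                           # -1 .. +1 across the stereo field
    gain = ref_dist / dist                            # inverse-distance attenuation
    left = gain * math.sqrt((1 - pan) / 2)            # equal-power panning
    right = gain * math.sqrt((1 + pan) / 2)
    return left, right
```

A source straight ahead at the reference distance yields equal left and right gains, so the creature is heard centered; as the player turns, the pan shifts, which is what lets players locate a swarm by ear.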
Proceedings - IEEE Virtual Reality: Message from the general chairs
DOI: 10.1109/VR.2011.5759421, Proceedings - IEEE Virtual Reality
Comparing aimed movements in the real world and in virtual reality
The study of aimed movements has a long history, starting at least as far back as 1899, when Woodworth proposed a two-component model in which aimed movements are broken into an initial ballistic phase and a subsequent control phase. In this paper, we use Woodworth's model to experimentally compare aimed movements in the real world with those in a virtual environment. Trajectories of real-world movements were collected and compared to trajectories of movements in a virtual environment. We show that significant temporal differences arise in both the ballistic and control phases, but the difference is much larger in the control phase; users' improvement is relatively greater in the virtual world than in the real world. They progress more in the ballistic phase in the real world, but more in the correction phase in the virtual world. These results allow us to better understand pointing tasks in virtual environments.
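Analyses built on Woodworth's two-component model need to split each recorded trajectory into its ballistic and control phases. A common heuristic, sketched below, is to cut the velocity profile at the peak-velocity sample; the paper's exact segmentation criterion may differ, so treat this as an illustrative assumption:

```python
def split_phases(velocities):
    """Split a 1-D velocity profile into (ballistic, control) phases.

    Uses the peak-velocity sample as the boundary: everything up to and
    including the peak is ballistic, the remainder is the control phase.
    This is a common heuristic, not necessarily the paper's criterion.
    """
    peak = max(range(len(velocities)), key=velocities.__getitem__)
    return velocities[:peak + 1], velocities[peak + 1:]
```

Once each trajectory is segmented this way, per-phase durations can be compared across the real and virtual conditions, which is the kind of comparison the abstract reports.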
An image-warping architecture for VR : low latency versus image quality
Designing low end-to-end-latency system architectures for virtual reality is still an open and challenging problem. We describe the design, implementation, and evaluation of a client-server depth-image warping architecture that updates and displays the scene graph at the refresh rate of the display. Our approach works for scenes consisting of dynamic and interactive objects. It minimizes end-to-end latency while generating smooth object motion, but this comes at the expense of the image quality loss inherent to warping techniques. We evaluate the architecture and its design trade-offs by comparing latency and image quality to those of a conventional rendering system. Our experience with the system confirms that the approach facilitates common interaction tasks such as navigation and object manipulation.
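The core operation in depth-image warping is per-pixel reprojection: a pixel with known depth is back-projected to a 3-D point in the reference camera's frame, moved into the latest head pose, and projected back to the screen. A minimal single-pixel sketch, assuming a standard pinhole model; the matrices and function name are illustrative, not taken from the paper's system:

```python
import numpy as np

def warp_pixel(u, v, depth, K, T_ref_to_new):
    """Reproject one pixel from a reference depth image into a new view.

    u, v          : pixel coordinates in the reference image
    depth         : depth of that pixel in the reference camera frame
    K             : 3x3 pinhole intrinsics matrix
    T_ref_to_new  : 4x4 rigid transform from reference to new camera frame
    """
    # Back-project to a 3-D point in the reference camera frame.
    p_cam = depth * np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Move into the new camera frame (latest tracked pose).
    p_new = (T_ref_to_new @ np.append(p_cam, 1.0))[:3]
    # Project back to pixel coordinates.
    uvw = K @ p_new
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

Applying this per pixel at display refresh rate is what lets the client present an updated view without waiting for the server to render a full new frame; the holes and disocclusions it leaves are the image-quality cost the abstract refers to.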