148 research outputs found
Increasing the motion of users in photo-realistic virtual environments by utilising auditory rendering of the environment and ego-motion
A recurring problem with image-based rendering technology for virtual environments has been that subjects generally show very little movement of the head and body. Our hypothesis is that the movement rate can be enhanced by introducing the auditory modality. In the study described in this paper, 126 subjects participated in a between-subjects experiment involving six experimental conditions, including both uni- and bi-modal stimuli (auditory and visual). The aim of the study was to investigate the influence of auditory rendering in stimulating and enhancing subjects' motion in virtual reality. The auditory stimuli consisted of several combinations of auditory feedback, including static sound sources as well as self-induced sounds. Results show that motion in virtual reality is significantly enhanced when moving sound sources and the sound of ego-motion are rendered in the environment.
Self-induced Footstep Sounds in Virtual Reality: Latency, Recognition, Quality and Presence
In this paper we describe the results of experiments whose goal is to investigate the effect of enhancing a virtual reality experience with the sound of synthetic footsteps. Results show that the sense of presence is enhanced when the sound of one's own motion is added. Furthermore, the experiments show that the threshold for detecting latency between motion and sound is raised when visual stimuli are introduced.
Using problem-based learning to support transdisciplinarity in an HCI education
In this paper we advocate the development of transdisciplinary educational programmes with a strong focus on HCI, which use problem-based learning (PBL) as a teaching methodology. We describe a novel educational programme called Medialogy, developed at Aalborg University Copenhagen in Denmark, outlining through different case studies how PBL supports transdisciplinarity.
- …