Exploiting audio-visual cross-modal interaction to reduce computational requirements in interactive environments

By V. Hulusic, K. Debattista, V. Aggarwal and A. Chalmers

Abstract

The quality of real-time computer graphics has progressed enormously in the last decade due to the rapid development in graphics hardware and its utilisation of new algorithms and techniques. The computer games industry, with its substantial software and hardware requirements, has been at the forefront in pushing these developments. Despite all the advances, there is still a demand for even more computational resources. For example, sound effects are an integral part of most computer games. This paper presents a method for reducing the amount of effort required to compute the computer graphics aspects of a game by exploiting movement-related sound effects. We conducted a detailed psychophysical experiment investigating how camera movement speed and sound affect the perceived smoothness of an animation. The results show that walking (slow) animations were perceived as smoother than running (fast) animations. We also found that adding sound effects, such as footsteps, to a walking/running animation affects the perception of animation smoothness. This entails that, under certain conditions, the number of frames that need to be rendered each second can be reduced, saving valuable computation time. Our approach will enable the computed frame rate to be decreased, and thus the computational requirements to be lowered, without any perceivable visual loss of quality.

Year: 2010
OAI identifier: oai:eprints.bournemouth.ac.uk:30377
