
Maintaining frame rate perception in interactive environments by exploiting audio-visual cross-modal interaction

By V. Hulusic, K. Debattista, V. Aggarwal and A. Chalmers


The entertainment industry, primarily the video games industry, continues to dictate the development and performance requirements of graphics hardware and computer graphics algorithms. However, despite the enormous progress in the last few years, it is still not possible to meet some of the industry's demands, in particular high-fidelity rendering of complex scenes in real time on a single desktop machine. The realisation that sound/music and other senses are important to entertainment led to an investigation of alternative methods, such as cross-modal interaction, in order to try to achieve the goal of "realism in real-time". In this paper we investigate the cross-modal interaction between vision and audition for reducing the amount of computation required to compute visuals, by introducing movement-related sound effects. Additionally, we look at the effect of camera movement speed on temporal visual perception. Our results indicate that slow animations are perceived as smoother than fast animations. Furthermore, introducing the sound effect of footsteps to walking animations further increased the perceived smoothness of the animation. As a consequence, under certain conditions the number of frames that need to be rendered each second can be reduced, saving valuable computation time, without the viewer being aware of the reduction. The results presented are another step towards a full understanding of auditory-visual cross-modal interaction and its importance in helping to achieve "realism in real-time".
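The practical implication of the abstract, reducing the target frame rate when camera speed and movement-related audio mask the drop, can be sketched as a simple policy. This is a hypothetical illustration only: the function name, the speed threshold, and the specific frame-rate values below are assumptions for demonstration, not figures reported in the paper.

```python
# Hypothetical sketch of a perceptually driven frame-rate policy.
# All thresholds and rates are illustrative assumptions, not values
# measured in the paper.

def target_frame_rate(camera_speed: float, footstep_audio: bool,
                      base_fps: int = 60) -> int:
    """Return a reduced rendering frame rate when perceptual
    conditions mask the drop (slow camera motion, movement-related
    sound effects such as footsteps)."""
    fps = base_fps
    if camera_speed < 1.0:   # slow animations are perceived as smoother
        fps -= 20
    if footstep_audio:       # footstep sounds further mask the reduction
        fps -= 10
    return max(fps, 24)     # keep a minimum floor for interactivity

# Fast camera, no masking audio: render at the full rate.
print(target_frame_rate(2.5, False))  # 60
# Slow walking camera with footstep audio: render fewer frames.
print(target_frame_rate(0.5, True))   # 30
```

The saved frames per second translate directly into per-frame computation time that can be spent elsewhere, which is the trade-off the paper's experiments are designed to quantify.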

Year: 2011
