Spatial sound in auditory vision substitution systems

Abstract

Current auditory vision sensory substitution (AVSS) systems might be improved by directly mapping an image onto a matrix of concurrently active sound sources in a virtual acoustic space. Such a mapping would be analogous to existing techniques for tactile substitution of vision, where point arrays are used successfully. This paper gives an overview of the auditory displays currently used to sonify 2D visual information and discusses the feasibility of new, perceptually motivated AVSS methods that encompass spatial sound.
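To make the proposed mapping concrete, the sketch below illustrates one way an image could be turned into a matrix of virtual sound sources: the image is block-averaged onto a coarse grid, each cell becomes a source at a fixed azimuth/elevation, and pixel brightness sets its amplitude. This is only an illustrative reading of the abstract, not the authors' implementation; the grid size, angular ranges, and amplitude mapping are assumptions.

```python
# Minimal sketch, assuming an 8x8 source grid spanning +/-45 deg azimuth and
# +/-30 deg elevation, with source amplitude taken from mean pixel brightness.
import numpy as np

def image_to_source_matrix(image, grid=(8, 8),
                           az_range=(-45.0, 45.0), el_range=(-30.0, 30.0)):
    """Map a 2D grayscale image (values in [0, 1]) to a list of virtual
    sound sources described by (azimuth_deg, elevation_deg, amplitude)."""
    rows, cols = grid
    h, w = image.shape
    # Downsample by block averaging so each grid cell yields one source.
    cell = image[:h // rows * rows, :w // cols * cols]
    cell = cell.reshape(rows, h // rows, cols, w // cols).mean(axis=(1, 3))

    azimuths = np.linspace(az_range[0], az_range[1], cols)          # left to right
    elevations = np.linspace(el_range[0], el_range[1], rows)[::-1]  # top to bottom

    sources = []
    for r in range(rows):
        for c in range(cols):
            sources.append((azimuths[c], elevations[r], float(cell[r, c])))
    return sources

# Example: a bright diagonal line in an otherwise dark image.
img = np.zeros((64, 64))
np.fill_diagonal(img, 1.0)
for az, el, amp in image_to_source_matrix(img):
    if amp > 0.05:
        print(f"azimuth {az:+6.1f} deg, elevation {el:+6.1f} deg, amplitude {amp:.2f}")
```

In a complete system each (azimuth, elevation, amplitude) triple would drive a spatialized audio source (for example via HRTF rendering), so that all cells sound concurrently rather than being scanned sequentially as in column-scanning sonification schemes.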
