Multisensory Perceptual Discrimination in Evolved Networks and Agents

Abstract

The fact that humans and animals have several sensory modalities and use them together to make sense of the world imbues their behaviour with immense richness and robustness. In this study, recurrent neural networks and minimal agents with active vision are evolved for unimodal and bimodal perceptual discrimination tasks. The purpose of the study is mainly exploratory: to test which characteristics of human perceptual discrimination evolve easily (with a focus on statistically optimal integration), how they are realised, and what role active perception plays in the process. Whilst some of the evolved systems performed perceptual discrimination well, they did not conform to the predictions of statistical optimality. Analyses of the systems point towards a number of relevant issues, notably the lack of a good account of ‘unimodality’ in existing models of multisensory perception.
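
As a point of reference, the statistical-optimality predictions mentioned above correspond to the standard maximum-likelihood model of cue integration; the following is a sketch of that benchmark, not a result of this study. Under this model, the optimal bimodal estimate is a reliability-weighted average of the unimodal estimates,

\[
\hat{s}_{12} = w_1 \hat{s}_1 + w_2 \hat{s}_2, \qquad w_i = \frac{1/\sigma_i^2}{1/\sigma_1^2 + 1/\sigma_2^2},
\]

with predicted bimodal variance

\[
\sigma_{12}^2 = \frac{\sigma_1^2\,\sigma_2^2}{\sigma_1^2 + \sigma_2^2} \le \min(\sigma_1^2,\, \sigma_2^2),
\]

so a statistically optimal system should discriminate at least as reliably in the bimodal condition as with the better of the two modalities alone.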
