A basic principle of multisensory research posits that our ability to make adaptive responses arises from the combined action of our senses in producing a coherent multimodal representation of the external world by exploiting redundancies. This beneficial multisensory phenomenon, called the “redundancy effect”, manifests behaviourally as faster responses when two redundant stimuli (from different senses) appear together, as compared to when a single stimulus appears. Despite extensive research has repeatedly shown this effect, it is not clear whether it is modulated by the individual’s level of alertness, and if so (as hypothesised), to what extent it affects early sensory integration or later perceptual processing. In this experiment, we will combine computational modelling and EEG to determine whether audio-tactile interactions occur via similar spatiotemporal neural mechanisms under reduced alertness