
    Altered Auditory and Multisensory Temporal Processing in Autism Spectrum Disorders

    Autism spectrum disorders (ASD) are characterized by deficits in social reciprocity and communication, as well as by repetitive behaviors and restricted interests. Unusual responses to sensory input and disruptions in the processing of both unisensory and multisensory stimuli have also been reported frequently. However, the specific aspects of sensory processing that are disrupted in ASD have yet to be fully elucidated. Recently published work has shown that children with ASD can integrate low-level audiovisual stimuli, but do so over an extended range of time when compared with typically developing (TD) children. However, the possible contributions of altered unisensory temporal processes to these demonstrated changes in multisensory function remain unknown. In the current study, unisensory temporal acuity was measured by determining individual thresholds on visual and auditory temporal order judgment (TOJ) tasks, and multisensory temporal function was assessed through a cross-modal version of the TOJ task. Whereas no differences in thresholds on the visual TOJ task were seen between children with ASD and TD children, thresholds were higher in ASD on the auditory TOJ task, providing preliminary evidence for impairment in auditory temporal processing. On the multisensory TOJ task, children with ASD showed performance improvements over a wider range of temporal intervals than TD children, reinforcing prior work showing an extended temporal window of multisensory integration in ASD. These findings contribute to a better understanding of basic sensory processing differences, which may be critical for understanding more complex social and cognitive deficits in ASD, and ultimately may contribute to more effective diagnostic and interventional strategies.
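The individual TOJ thresholds described above are conventionally obtained by fitting a psychometric function to the proportion of "auditory first" (or "visual first") responses across stimulus onset asynchronies. The sketch below illustrates the general idea with a cumulative-Gaussian fit by grid search; the data, grid ranges, and the use of the fitted slope (sigma) as the acuity index are illustrative assumptions, not the study's actual analysis.

```python
import math

def cumulative_gaussian(x, mu, sigma):
    """Psychometric function: P("auditory first") at SOA x (ms)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def fit_toj(soas, p_responses):
    """Grid-search least-squares fit of (mu, sigma).
    A smaller sigma means a steeper curve, i.e. finer temporal acuity."""
    best = None
    for mu in range(-50, 51, 2):            # point of subjective simultaneity (ms)
        for sigma in range(5, 201, 5):      # slope parameter (ms)
            err = sum((cumulative_gaussian(s, mu, sigma) - p) ** 2
                      for s, p in zip(soas, p_responses))
            if best is None or err < best[0]:
                best = (err, mu, sigma)
    return best[1], best[2]

# Hypothetical group data: proportion "auditory first" at each SOA (ms).
soas = [-200, -100, -50, 0, 50, 100, 200]
p_td = [0.02, 0.10, 0.25, 0.50, 0.75, 0.90, 0.98]
mu, sigma = fit_toj(soas, p_td)
```

A higher threshold on the auditory TOJ task, as reported for the ASD group, would correspond to a larger fitted sigma under this kind of model.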

    An extended multisensory temporal binding window in autism spectrum disorders

    Autism spectrum disorders (ASD) form a continuum of neurodevelopmental disorders, characterized by deficits in communication and reciprocal social interaction, as well as by repetitive behaviors and restricted interests. Sensory disturbances are also frequently reported in clinical and autobiographical accounts. However, surprisingly few empirical studies have characterized the fundamental features of sensory and multisensory processing in ASD. The current study is structured to test for potential differences in multisensory temporal function in ASD by making use of a temporally dependent, low-level multisensory illusion. In this illusion, the presentation of a single flash of light accompanied by multiple sounds often results in the illusory perception of multiple flashes. By systematically varying the temporal structure of the audiovisual stimuli, a “temporal window” within which these stimuli are likely to be bound into a single perceptual entity can be defined. The results of this study revealed that children with ASD report the flash-beep illusion over an extended range of stimulus onset asynchronies relative to children with typical development, suggesting that children with ASD have altered multisensory temporal function. These findings provide valuable new insights into our understanding of sensory processing in ASD and may hold promise for the development of more sensitive diagnostic measures and improved remediation strategies.
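One simple way to quantify the "temporal window" described above is to take the width of the SOA range over which illusion reports exceed some criterion rate. The sketch below does this by linear interpolation; the illusion rates, SOA grid, and 50% criterion are hypothetical values chosen for illustration, not data from the study.

```python
def binding_window_width(soas, p_illusion, criterion=0.5):
    """Width (ms) of the SOA range where illusion reports exceed the
    criterion, found by linearly interpolating the crossing points."""
    crossings = []
    for (s0, p0), (s1, p1) in zip(zip(soas, p_illusion),
                                  zip(soas[1:], p_illusion[1:])):
        if (p0 - criterion) * (p1 - criterion) < 0:   # sign change = crossing
            crossings.append(s0 + (criterion - p0) * (s1 - s0) / (p1 - p0))
    return crossings[-1] - crossings[0] if len(crossings) >= 2 else float("nan")

# Hypothetical illusion rates per SOA (ms); negative = beep leads flash.
soas  = [-400, -300, -200, -100, 0, 100, 200, 300, 400]
p_td  = [0.10, 0.20, 0.45, 0.80, 0.90, 0.80, 0.45, 0.20, 0.10]
p_asd = [0.20, 0.45, 0.70, 0.85, 0.90, 0.85, 0.70, 0.45, 0.20]
w_td  = binding_window_width(soas, p_td)
w_asd = binding_window_width(soas, p_asd)
```

Under this measure, the extended range of illusion-producing asynchronies reported for the ASD group shows up directly as a wider window.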

    Bayesian causal inference modeling of attentional effects on the temporal binding window of multisensory integration

    In order to understand the world around us, we combine information across the different senses. This multisensory integration is highly dependent on the temporal relationship between unisensory events and our brain’s ability to discern small timing differences between stimuli (crossmodal temporal acuity). Our previous research found that increasing both visual and auditory perceptual load led to sharp declines in participants’ crossmodal temporal acuity. Previous research in other labs has demonstrated that the brain integrates multisensory information in a Bayes-optimal way and that the integration of temporally disparate audiovisual stimuli can be modeled using Bayesian causal inference modeling. The present study investigates the influence of visual and auditory perceptual load on the integration of simple stimuli using Bayesian modeling. Participants completed a simultaneity judgment (SJ) task during which they determined whether temporally offset flash-beep stimuli occurred (a)synchronously. In addition, participants completed the SJ task alone (distractor free; DF), in the presence of task-irrelevant visual or auditory distractors (no load; NL), and while completing a concurrent visual or auditory distractor task (high load; HL). Data were modeled using the causal inference model derived in Magnotti et al. (2013), which is based on Bayesian statistics. Our preliminary data show an increase in the temporal binding window for both visual and auditory NL and HL conditions as compared to the DF condition, indicating that the presence of extraneous stimuli enlarges the temporal binding window. Sensory noise increased in the visual and auditory HL conditions as compared to the DF and NL conditions. Similarly, the prior likelihood to assume synchronicity (the prior) decreased only when participants attended to the distractors (HL). These preliminary findings indicate that attention alters both low-level (sensory noise) and high-level (priors) processing of simple multisensory stimuli and that our previously observed effects of attention on multisensory temporal processing are generalizable.
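The causal-inference logic behind this kind of SJ model can be sketched as follows: the observer measures a noisy audiovisual asynchrony and judges "synchronous" when the posterior probability of a common cause exceeds 0.5. This is a generic illustration of the framework, not the Magnotti et al. (2013) implementation; the Gaussian/uniform likelihoods, the prior of 0.8, and the 1000 ms asynchrony range are all assumptions for demonstration.

```python
import math

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def p_common(m, sigma, prior, soa_range=1000.0):
    """Posterior probability of a common cause given measured asynchrony m.
    Common cause: m ~ N(0, sigma); separate causes: m ~ uniform over the range."""
    like_c = gauss(m, 0.0, sigma)       # likelihood under a common cause
    like_s = 1.0 / soa_range            # likelihood under separate causes
    return like_c * prior / (like_c * prior + like_s * (1.0 - prior))

def synchrony_window(sigma, prior, soa_range=1000.0):
    """Full width (ms) of the asynchronies judged synchronous (posterior > 0.5)."""
    m = 0.0
    while m < soa_range and p_common(m, sigma, prior, soa_range) > 0.5:
        m += 1.0
    return 2.0 * m

# Raising sensory noise (sigma) widens the synchrony window, mirroring the
# reported enlargement of the temporal binding window under load.
w_low_noise = synchrony_window(50.0, 0.8)
w_high_noise = synchrony_window(150.0, 0.8)
```

In this toy model, increasing sigma (the low-level parameter) or the prior (the high-level parameter) each independently changes the window width, which is why the two parameters can be dissociated in the fits.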

    An Electroencephalography Investigation of the Differential Effects of Visual versus Auditory Attention on Crossmodal Temporal Acuity

    Our perception of the world hinges on our ability to accurately combine the many stimuli in our environment. This multisensory integration is highly dependent on the temporal relationship between unisensory events and our brain's ability to discern small timing differences between stimuli (crossmodal temporal acuity). Our previous research investigated whether attention alters crossmodal temporal acuity using a crossmodal temporal order judgment (CTOJ) task in which participants were asked to report whether a flash or beep occurring at different time intervals appeared first while concurrently completing either a visual distractor or auditory distractor task. We found that increasing the perceptual load of both distractor tasks led to sharp declines in participants' crossmodal temporal acuity. The current study uses electroencephalography (EEG) to understand the neural mechanisms that lead to decreased crossmodal temporal acuity. Participants completed a CTOJ task in association with a visual distractor task, as described above, while EEG activity was recorded from 64 scalp electrodes. EEG activity was averaged based on the onset of the flash, producing an event-related potential (ERP) waveform for each perceptual load level and stimulus onset asynchrony (SOA) combination. Preliminary data analysis suggests that increasing perceptual load most strongly influences the amplitude of the N1/P2 complex in response to the flash across parietal electrodes. This suggests that decreases in crossmodal temporal acuity with increasing visual load may be mediated by alterations in visual processing. Ongoing data collection investigates whether increasing auditory load will lead to alterations in auditory processing, thus suggesting a modality-specific mechanism for disruptions in crossmodal temporal acuity. This line of research serves to illuminate the neural networks that underlie the interaction between attention and multisensory integration.
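The stimulus-locked averaging step described above (epoching on flash onset, baseline correction, averaging within condition) can be sketched for a single channel as below. The window lengths and the synthetic signal are illustrative assumptions; real pipelines also filter, re-reference, and reject artifacts before averaging.

```python
def erp_average(signal, onsets, pre=10, post=40):
    """Average flash-locked epochs from one channel of continuous EEG.
    Each epoch is baseline-corrected by its mean over the pre-stimulus samples."""
    epochs = []
    for t in onsets:
        if t - pre < 0 or t + post > len(signal):
            continue                          # skip epochs that run off the recording
        ep = signal[t - pre:t + post]
        baseline = sum(ep[:pre]) / pre
        epochs.append([v - baseline for v in ep])
    return [sum(col) / len(epochs) for col in zip(*epochs)]

# Synthetic single-channel recording with a deflection after each flash onset.
signal = [0.0] * 200
for onset in (50, 120):                       # simulated flash onsets (sample indices)
    signal[onset + 15] += 1.0                 # evoked deflection 15 samples post-flash
erp = erp_average(signal, [50, 120])
```

Averaging time-locked epochs cancels activity that is not phase-locked to the flash, which is what makes load-related amplitude changes in components like N1/P2 measurable.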

    Audiovisual Integration Varies With Target and Environment Richness in Immersive Virtual Reality

    We are continually bombarded by information arriving to each of our senses; however, the brain seems to effortlessly integrate this separate information into a unified percept. Although multisensory integration has been researched extensively using simple computer tasks and stimuli, much less is known about how multisensory integration functions in real-world contexts. Additionally, several recent studies have demonstrated that multisensory integration varies tremendously across naturalistic stimuli. Virtual reality can be used to study multisensory integration in realistic settings because it combines realism with precise control over the environment and stimulus presentation. In the current study, we investigated whether multisensory integration as measured by the redundant signals effect (RSE) is observable in naturalistic environments using virtual reality and whether it differs as a function of target and/or environment cue-richness. Participants detected auditory, visual, and audiovisual targets that varied in cue-richness within three distinct virtual worlds that also varied in cue-richness. We demonstrated integrative effects in each environment-by-target pairing and further showed a modest effect on multisensory integration as a function of target cue-richness, but only in the cue-rich environment. Our study is the first to definitively show that minimal and more naturalistic tasks elicit comparable redundant signals effects. Our results also suggest that multisensory integration may function differently depending on the features of the environment. The results of this study have important implications in the design of virtual multisensory environments that are currently being used for training, educational, and entertainment purposes.
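The redundant signals effect is standardly tested against Miller's race-model inequality: if audiovisual reaction times are faster than the bound min(1, F_A(t) + F_V(t)) on the unisensory RT distributions, the speedup cannot be explained by statistical facilitation alone. The sketch below implements that bound with empirical CDFs; the abstract does not specify the study's exact analysis, and the RT values here are made up for illustration.

```python
def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times at time t."""
    return sum(1 for r in rts if r <= t) / len(rts)

def race_model_violation(rt_av, rt_a, rt_v, times):
    """Maximum positive difference between the audiovisual CDF and Miller's
    race-model bound, min(1, F_A(t) + F_V(t)). A positive value indicates
    integration beyond what parallel unisensory processing predicts."""
    return max(ecdf(rt_av, t) - min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
               for t in times)

# Hypothetical RTs (ms): audiovisual responses faster than the race bound allows.
rt_a  = [310, 330, 350, 370]
rt_v  = [320, 340, 360, 380]
rt_av = [260, 280, 300, 320]
viol = race_model_violation(rt_av, rt_a, rt_v, range(250, 400, 10))
```

Finding such violations in every environment-by-target pairing is what licenses the claim of genuine integrative effects rather than mere redundancy gains.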

    Temporal Multisensory Processing and its Relationship to Autistic Functioning

    Autism spectrum disorders (ASD) form a continuum of neurodevelopmental disorders characterized by deficits in communication and reciprocal social interaction, repetitive behaviors, and restricted interests. Sensory disturbances are also frequently reported in clinical and autobiographical accounts. However, few empirical studies have characterized the fundamental features of sensory and multisensory processing in ASD. Recently published studies have shown that children with ASD are able to integrate low-level multisensory stimuli, but do so over an enlarged temporal window when compared with typically developing (TD) children. The current study sought to expand upon these previous findings by examining differences in the temporal processing of low-level multisensory stimuli in high-functioning (HFA) and low-functioning (LFA) children with ASD in the context of a simple reaction time task. Contrary to these previous findings, children with both HFA and LFA showed smaller gains in performance under multisensory (i.e., combined visual-auditory) conditions when compared with their TD peers. Additionally, the pattern of performance gains as a function of SOA was similar across groups, suggesting similarities in the temporal processing of these cues that run counter to previous studies that have shown an enlarged “temporal window.” These findings add complexity to our understanding of the multisensory processing of low-level stimuli in ASD and may hold promise for the development of more sensitive diagnostic measures and improved remediation strategies in autism.
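The "performance gain" in a simple reaction time task like this is often expressed as the percent speeding of mean audiovisual RT relative to the faster unisensory condition, computed separately at each SOA. The sketch below shows that computation; the function name and the RT values are hypothetical, for illustration only.

```python
from statistics import mean

def percent_gain(rt_av_by_soa, rt_a, rt_v):
    """Multisensory response enhancement at each SOA: percent speeding of the
    mean audiovisual RT relative to the faster unisensory mean RT."""
    best_uni = min(mean(rt_a), mean(rt_v))
    return {soa: 100.0 * (best_uni - mean(rts)) / best_uni
            for soa, rts in rt_av_by_soa.items()}

# Hypothetical RTs (ms); SOA 0 = simultaneous audiovisual onset.
gains = percent_gain({-100: [395, 405], 0: [350, 360], 100: [380, 390]},
                     rt_a=[400, 420], rt_v=[380, 400])
```

Plotting these gains against SOA gives the "pattern of performance gains as a function of SOA" that the study compared across the HFA, LFA, and TD groups.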