174 research outputs found

    Neural Correlates of Multisensory Enhancement in Audiovisual Narrative Speech Perception: An fMRI Investigation

    This fMRI study investigated the effect of seeing a speaker's articulatory movements while listening to a naturalistic narrative stimulus, with the goal of identifying regions of the language network showing multisensory enhancement under synchronous audiovisual conditions. We expected this enhancement to emerge in regions known to underlie the integration of auditory and visual information, such as the posterior superior temporal gyrus, as well as in parts of the broader language network, including the semantic system. To this end, we presented 53 participants with a continuous narration of a story in auditory-alone, visual-alone, and synchronous and asynchronous audiovisual speech conditions while recording brain activity using BOLD fMRI. We found multisensory enhancement in an extensive network of regions underlying multisensory integration and parts of the semantic network, as well as in extralinguistic regions not usually associated with multisensory integration, namely the primary visual cortex and the bilateral amygdala. The analysis also revealed involvement of thalamic regions along the visual and auditory pathways more commonly associated with early sensory processing. We conclude that under natural listening conditions, multisensory enhancement not only involves sites of multisensory integration but extends to many regions of the wider semantic network, including regions associated with extralinguistic sensory, perceptual, and cognitive processing.

    Grabbing your ear: rapid auditory-somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment.

    Multisensory interactions are observed in species ranging from single-cell organisms to humans. Important early work was carried out primarily in the cat superior colliculus, where a set of critical parameters for their occurrence was defined. Primary among these were temporal synchrony and spatial alignment of the bisensory inputs. Here, we assessed whether spatial alignment is also a critical parameter for the temporally earliest multisensory interactions observed in low-level sensory cortices of the human. While multisensory interactions in humans have been demonstrated behaviorally for spatially disparate stimuli (e.g. the ventriloquist effect), it is not clear whether such effects are due to early sensory-level integration or later perceptual-level processing. In the present study, we used psychophysical and electrophysiological indices to show that auditory-somatosensory interactions in humans occur via the same early sensory mechanism both when stimuli are in and out of spatial register. Subjects detected multisensory events more rapidly than unisensory events. At just 50 ms post-stimulus, neural responses to the multisensory 'whole' were greater than the summed responses from the constituent unisensory 'parts'. For all spatial configurations, this effect followed from a modulation of the strength of brain responses, rather than from the activation of regions specifically responsive to multisensory pairs. Using local auto-regressive average source estimation, we localized the initial auditory-somatosensory interactions to auditory association areas contralateral to the side of somatosensory stimulation. Thus, multisensory interactions can occur across wide peripersonal spatial separations remarkably early in sensory processing, and in cortical regions traditionally considered unisensory.
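    The "whole versus sum of parts" comparison described in this abstract can be sketched as a simple additive-model test. The snippet below is a minimal illustration under assumed data (hypothetical trial-averaged ERP amplitudes per condition, not the authors' analysis pipeline): it computes the difference wave AV − (A + S) and flags samples where the response to the multisensory pair exceeds the sum of the unisensory responses.

    ```python
    import numpy as np

    def superadditivity(erp_av, erp_a, erp_s):
        """Difference wave AV - (A + S); positive values indicate
        responses to the multisensory pair exceeding the sum of the
        unisensory responses (a nonlinear interaction)."""
        return erp_av - (erp_a + erp_s)

    # Assumed trial-averaged amplitudes (microvolts) at a few post-stimulus samples
    erp_a  = np.array([0.1, 0.4, 0.9, 1.2])  # auditory alone
    erp_s  = np.array([0.0, 0.3, 0.7, 0.9])  # somatosensory alone
    erp_av = np.array([0.1, 0.9, 2.1, 2.4])  # simultaneous auditory-somatosensory pair

    diff = superadditivity(erp_av, erp_a, erp_s)
    print(np.where(diff > 0)[0])  # indices of samples showing a supra-additive interaction
    ```

    In practice such a contrast is evaluated per electrode and time sample with appropriate statistics; the additive model is meaningful only when common activity (e.g. anticipatory baseline) is controlled for.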

    Atypical multisensory integration in Niemann-Pick type C disease – towards potential biomarkers

    Background: Niemann-Pick type C (NPC) is an autosomal recessive disease in which cholesterol and glycosphingolipids accumulate in lysosomes due to aberrant cell-transport mechanisms. It is characterized by progressive and ultimately terminal neurological disease, but both pre-clinical studies and direct human trials are underway to test the safety and efficacy of cholesterol clearing compounds, with good success already observed in animal models. Key to assessing the effectiveness of interventions in patients, however, is the development of objective neurobiological outcome measures. Multisensory integration mechanisms present as an excellent candidate since they necessarily rely on the fidelity of long-range neural connections between the respective sensory cortices (e.g. the auditory and visual systems). Methods: A simple way to test integrity of the multisensory system is to ask whether individuals respond faster to the occurrence of a bisensory event than they do to the occurrence of either of the unisensory constituents alone. Here, we presented simple auditory, visual, and audio-visual stimuli in random sequence. Participants responded as fast as possible with a button push. One 11-year-old and two 14-year-old boys with NPC participated in the experiment and their results were compared to those of 35 age-matched neurotypical boys. Results: Reaction times (RTs) to the stimuli when presented simultaneously were significantly faster than when they were presented alone in the neurotypical children, a facilitation that could not be accounted for by probability summation, as evidenced by violation of the so-called ‘race’ model. In stark contrast, the NPC boys showed no such speeding, despite the fact that their unisensory RTs fell within the distribution of RTs observed in the neurotypicals. 
Conclusions: These results uncover a previously undescribed deficit in multisensory integrative abilities in NPC, with implications for the ongoing treatment of the clinical symptoms of these children. They also suggest that multisensory processes may represent a good candidate biomarker against which to test the efficacy of therapeutic interventions.
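    The race-model test used in this study can be sketched numerically. The snippet below is a minimal illustration with simulated reaction times (not the authors' analysis code): it checks Miller's race-model inequality, P(RT_AV ≤ t) ≤ P(RT_A ≤ t) + P(RT_V ≤ t), at a grid of time points; any violation indicates facilitation beyond what probability summation between independent unisensory channels can produce.

    ```python
    import numpy as np

    def race_model_violation(rt_av, rt_a, rt_v, quantiles=np.linspace(0.05, 0.95, 19)):
        """Return the time points at which the bisensory RT distribution
        violates Miller's race-model bound:
            P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t).
        """
        ts = np.quantile(rt_av, quantiles)          # evaluate at bisensory RT quantiles
        cdf = lambda rts, t: np.mean(rts <= t)      # empirical cumulative distribution
        violations = []
        for t in ts:
            bound = min(1.0, cdf(rt_a, t) + cdf(rt_v, t))
            if cdf(rt_av, t) > bound:
                violations.append(t)
        return violations

    # Simulated data: bisensory RTs faster than either unisensory condition
    rng = np.random.default_rng(0)
    rt_a = rng.normal(350, 40, 1000)   # auditory-alone RTs (ms)
    rt_v = rng.normal(380, 40, 1000)   # visual-alone RTs (ms)
    rt_av = rng.normal(290, 30, 1000)  # audiovisual RTs (ms), strongly facilitated
    print(race_model_violation(rt_av, rt_a, rt_v))
    ```

    With these simulated parameters the bound is violated across much of the distribution, mirroring the facilitation seen in the neurotypical children; an NPC-like pattern (no speeding) would yield an empty list.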

    Notes From an Epicenter: Navigating Behavioral Clinical Trials on Autism Spectrum Disorder Amid the COVID-19 Pandemic in the Bronx

    Background: The COVID-19 pandemic impacted nearly all facets of our daily lives, and clinical research was no exception. Here, we discuss the impact of the pandemic on our ongoing, three-arm randomized controlled trial (RCT) Sensory Integration Therapy (SIT) in Autism: Mechanisms and Effectiveness (NCT02536365), which investigates the immediate and sustained utility of SIT to strengthen functional daily-living skills and minimize the presence of maladaptive sensory behaviors in autistic children. Main text: In this text, we detail how we navigated the unique challenges that the pandemic brought forth between the years 2020 and 2021, including the need to rapidly adjust our study protocol, recruitment strategy, and in-person assessment battery to allow for virtual recruitment and data collection. We further detail how we triaged participants and allocated limited resources to best preserve our primary outcome measures while prioritizing the safety of our participants and study team. We specifically note the importance of open and consistent communication with all participating families throughout the pandemic in ensuring all our protocol adjustments were successfully implemented. Conclusions: Though the COVID-19 pandemic resulted in an unprecedented interruption to in-person clinical research, clinical trials have always been and will continue to be at risk for unforeseen interruptions, whether from world events or participants' personal circumstances. By presenting our steps to preserving this RCT throughout the pandemic, we offer suggestions for successfully managing unexpected interruptions to research. Ideally, by taking these into account, future RCTs may be increasingly prepared to minimize the impact of such potential interruptions to research.

    A transition from unimodal to multimodal activations in four sensory modalities in humans: an electrophysiological study

    Background: To investigate the long-latency activities common to all sensory modalities, electroencephalographic responses to auditory (1000 Hz pure tone), tactile (electrical stimulation of the index finger), visual (simple figure of a star), and noxious (intra-epidermal electrical stimulation of the dorsum of the hand) stimuli were recorded from 27 scalp electrodes in 14 healthy volunteers. Results: Source modeling showed multimodal activations in the anterior part of the cingulate cortex (ACC) and the hippocampal region (Hip). The activity in the ACC was biphasic. In all sensory modalities, the first component of ACC activity peaked 30-56 ms later than the peak of the major modality-specific activity, the second component of ACC activity peaked 117-145 ms later than the peak of the first component, and the activity in Hip peaked 43-77 ms later than the second component of ACC activity. Conclusion: The temporal sequence of activations through modality-specific and multimodal pathways was similar among all sensory modalities.

    Fast Visuomotor Processing of Redundant Targets: The Role of the Right Temporo-Parietal Junction

    Parallel processing of multiple sensory stimuli is critical for efficient, successful interaction with the environment. An experimental approach to studying parallel processing in sensorimotor integration is to examine reaction times to multiple copies of the same stimulus. Reaction times to bilateral copies of light flashes are faster than to single, unilateral light flashes. These faster responses may be due to 'statistical facilitation' between the independent processing streams engaged by the two copies of the light flash. On some trials, however, reaction times are faster than statistical facilitation predicts, indicating that a neural 'coactivation' of the two processing streams must have occurred. Here we use fMRI to investigate the neural locus of this coactivation. Subjects responded manually to the detection of unilateral light flashes presented to the left or right visual hemifield, and to the detection of bilateral light flashes. We compared the bilateral trials on which subjects' reaction times exceeded the limit predicted by statistical facilitation to bilateral trials that did not exceed the limit. Activity in the right temporo-parietal junction was higher in those bilateral trials that showed coactivation than in those that did not. These results suggest that the neural coactivation observed in visuomotor integration occurs at a cognitive, rather than a sensory or motor, stage of processing.

    Top-down and bottom-up modulation in processing bimodal face/voice stimuli

    Background: Processing of multimodal information is a critical capacity of the human brain, with classic studies showing that bimodal stimulation can either facilitate or interfere with perceptual processing. Comparing activity to congruent and incongruent bimodal stimuli can reveal sensory dominance in particular cognitive tasks. Results: We investigated audiovisual interactions driven by stimulus properties (bottom-up influences) or by task (top-down influences) on congruent and incongruent simultaneously presented faces and voices while ERPs were recorded. Subjects performed gender categorisation, directing attention either to faces or to voices, and also judged whether the face/voice stimuli were congruent in terms of gender. Behaviourally, the unattended modality affected processing in the attended modality: the disruption was greater for attended voices. ERPs revealed top-down modulations of early brain processing (30-100 ms) over unisensory cortices. No effects were found on the N170 or VPP, but from 180-230 ms larger right frontal activity was seen for incongruent than congruent stimuli. Conclusions: Our data demonstrate that in a gender categorisation task the processing of faces dominates over the processing of voices. Brain activity showed different modulation by top-down and bottom-up information: top-down influences modulated early brain activity, whereas bottom-up interactions occurred relatively late.

    Neural correlates of audiovisual motion capture

    Visual motion can affect the perceived direction of auditory motion (i.e., audiovisual motion capture). It is debated, though, whether this effect occurs at perceptual or decisional stages. Here, we examined the neural consequences of audiovisual motion capture using the mismatch negativity (MMN), an event-related brain potential reflecting pre-attentive auditory deviance detection. In an auditory-only condition, occasional changes in the direction of a moving sound (deviants) elicited an MMN starting around 150 ms. In an audiovisual condition, auditory standards and deviants were synchronized with a visual stimulus that moved in the same direction as the auditory standards. These audiovisual deviants did not evoke an MMN, indicating that visual motion reduced the perceptual difference between the sound motion of standards and deviants. The inhibition of the MMN by visual motion provides evidence that auditory and visual motion signals are integrated at early sensory processing stages.

    Neural responses in parietal and occipital areas in response to visual events are modulated by prior multisensory stimuli

    The effect of multimodal versus unimodal prior stimuli on the subsequent processing of a simple flash stimulus was studied in the context of the audiovisual 'flash-beep' illusion, in which the number of flashes a person sees is influenced by accompanying beep stimuli. EEG recordings were made while combinations of simple visual and audiovisual stimuli were presented. The experiments found that the electric field strength related to a flash stimulus was stronger when it was preceded by a multimodal flash/beep stimulus than when it was preceded by another unimodal flash stimulus. This difference was significant in two distinct timeframes: an early timeframe, from 130-160 ms, and a late timeframe, from 300-320 ms. Source localisation analysis found that the increased activity in the early interval was localised to an area centred on the inferior and superior parietal lobes, whereas the later increase was associated with stronger activity in an area centred on primary and secondary visual cortex, in the occipital lobe. These results suggest that processing of a visual stimulus can be affected by the presence of an immediately prior multisensory event: relatively long-lasting interactions generated by the initial auditory and visual stimuli altered the processing of the subsequent visual stimulus.