Altered white matter structure in auditory tracts following early monocular enucleation
Purpose: Similar to early blindness, monocular enucleation (the removal of one eye) early in life results in crossmodal behavioral and morphological adaptations. It has previously been shown that partial visual deprivation from early monocular enucleation results in structural white matter changes throughout the visual system (Wong et al., 2018). The current study investigated structural white matter of the auditory system in adults who had undergone early monocular enucleation compared with binocular control participants. Methods: We reconstructed four auditory and audiovisual tracts of interest using probabilistic tractography and compared the microstructural properties of these tracts with those of binocularly intact controls using standard diffusion indices. Results: Although both groups demonstrated asymmetries in these indices in intrahemispheric tracts, monocular enucleation participants showed asymmetries opposite to those of control participants in the auditory and A1-V1 tracts. Monocular enucleation participants also demonstrated significantly lower fractional anisotropy in the audiovisual projections contralateral to the enucleated eye relative to control participants. Conclusions: Partial vision loss from early monocular enucleation results in altered structural white matter in the auditory system.
Young children do not integrate visual and haptic information
Several studies have shown that adults integrate visual and haptic information (and information from other modalities) in a statistically optimal fashion, weighting each sense according to its reliability. To date, no studies have investigated when this capacity for cross-modal integration develops. Here we show that prior to eight years of age, integration of visual and haptic spatial information is far from optimal, with either vision or touch dominating totally, even in conditions where the dominant sense is far less precise than the other (as assessed by discrimination thresholds). For size discrimination, haptic information dominates in determining both perceived size and discrimination thresholds, while for orientation discrimination vision dominates. By eight to ten years of age, integration becomes statistically optimal, as in adults. We suggest that during development, perceptual systems require constant recalibration, for which cross-sensory comparison is important. Using one sense to calibrate the other precludes useful combination of the two sources.
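The statistically optimal weighting described in this abstract is the standard maximum-likelihood cue-combination rule: each estimate is weighted by its reliability (inverse variance), and the fused estimate is more precise than either cue alone. A minimal sketch, using illustrative numbers rather than data from the study:

```python
def mle_combine(est_v, sigma_v, est_h, sigma_h):
    """Reliability-weighted (maximum-likelihood) combination of a visual
    and a haptic estimate. Reliability is inverse variance; the fused
    standard deviation is never larger than the smaller single-cue one."""
    r_v = 1.0 / sigma_v ** 2          # visual reliability
    r_h = 1.0 / sigma_h ** 2          # haptic reliability
    w_v = r_v / (r_v + r_h)           # weight given to vision
    fused = w_v * est_v + (1.0 - w_v) * est_h
    fused_sigma = (1.0 / (r_v + r_h)) ** 0.5
    return fused, fused_sigma

# Vision twice as precise as touch -> the estimate is pulled toward vision.
size, sigma = mle_combine(est_v=10.0, sigma_v=1.0, est_h=12.0, sigma_h=2.0)
```

Total dominance of one sense, as reported here for children under eight, corresponds to setting one weight to 1 regardless of the cues' actual variances.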
Reducing bias in auditory duration reproduction by integrating the reproduced signal
Duration estimation is known to be far from veridical and to differ between sensory estimation and motor reproduction. To investigate how these differential estimates are integrated when estimating or reproducing a duration, and to examine sensorimotor biases in duration comparison and reproduction tasks, we compared estimation biases and variances across three duration estimation tasks: perceptual comparison, motor reproduction, and auditory reproduction (i.e., a combined perceptual-motor task). We found consistent overestimation in both the motor and the perceptual-motor auditory reproduction tasks, and the least overestimation in the comparison task. More interestingly, compared to pure motor reproduction, the overestimation bias was reduced in the auditory reproduction task, owing to the additional reproduced auditory signal. We further manipulated the signal-to-noise ratio (SNR) of the feedback/comparison tones to examine the changes in estimation biases and variances. Treating perceptual and motor biases as two independent components, we applied a reliability-based model, which successfully predicted the biases in auditory reproduction. Our findings thus provide behavioral evidence of how the brain combines motor and perceptual information to reduce duration estimation biases and improve estimation reliability.
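A reliability-based account of this kind can be illustrated with a small numerical example. This is a hedged simplification, not the paper's fitted model: it assumes the predicted reproduction bias is simply the inverse-variance-weighted average of independent perceptual and motor bias components, with all names and values illustrative:

```python
def predicted_bias(bias_p, sigma_p, bias_m, sigma_m):
    """Inverse-variance-weighted average of a perceptual and a motor
    bias component; the more reliable (lower-variance) component pulls
    the predicted reproduction bias toward itself."""
    w_p = (1.0 / sigma_p ** 2) / (1.0 / sigma_p ** 2 + 1.0 / sigma_m ** 2)
    return w_p * bias_p + (1.0 - w_p) * bias_m

# Illustrative values: small perceptual bias, larger motor overestimation.
# The prediction lands between the two, closer to the reliable component.
b = predicted_bias(bias_p=0.02, sigma_p=0.1, bias_m=0.15, sigma_m=0.2)
```

Under this reading, adding a reproduced auditory signal lowers the effective variance of the perceptual component, shifting weight away from the motor component and shrinking the overall overestimation.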
The role of the right temporoparietal junction in perceptual conflict: detection or resolution?
The right temporoparietal junction (rTPJ) is a polysensory cortical area that plays a key role in perception and awareness. Neuroimaging evidence shows activation of the rTPJ in intersensory and sensorimotor conflict situations, but it remains unclear whether this activity reflects detection or resolution of such conflicts. To address this question, we manipulated the relationship between touch and vision using the so-called mirror-box illusion. Participants' hands lay on either side of a mirror, which occluded their left hand and reflected their right hand, creating the illusion that they were looking directly at their left hand. The experimenter simultaneously touched either the middle (D3) or the ring finger (D4) of each hand. Participants judged which finger had been touched on their occluded left hand. The visual stimulus corresponding to the touch on the right hand was therefore either congruent (same finger as the touch) or incongruent (different finger from the touch) with the task-relevant touch on the left hand. Single-pulse transcranial magnetic stimulation (TMS) was delivered to the rTPJ immediately after the touch. Accuracy in localizing the left touch was worse for D4 than for D3, particularly when visual stimulation was incongruent. However, following TMS, accuracy improved selectively for D4 in incongruent trials, suggesting that the effects of the conflicting visual information were reduced. These findings suggest a role for the rTPJ in detecting, rather than resolving, intersensory conflict.
Aging in Multisensory Integration
Multisensory integration is the simultaneous processing of multiple sensory inputs into a single percept. The current study aims to further the understanding of multisensory integration across development and the individual contributions of visual and auditory information. Integration was measured using the Sound-Induced Flash Illusion task. In the first experiment, young children, young adults, and older adults completed a variant of the Sound-Induced Flash Illusion, and we found that auditory input had a stronger effect on visual processing than vice versa, with this effect increasing with age. Experiment 2 used a similar version of the Sound-Induced Flash Illusion task with young adults, but half of the stimuli were lowered to just above threshold to test whether weakened auditory and visual stimuli could account for the increased multisensory integration seen in older adults. Lowering stimulus intensity to just above threshold resulted in decreased integration effects. The findings of the current study support the auditory dominance literature and the modality appropriateness hypothesis and have implications for many tasks that require the processing of multisensory information.
Audio-visual detection benefits in the rat
Human psychophysical studies have described multisensory perceptual benefits, such as enhanced detection rates and faster reaction times, in great detail. However, the neural circuits and mechanisms underlying multisensory integration remain difficult to study in the primate brain. While rodents offer the advantage of a range of experimental methodologies for studying the neural basis of multisensory processing, rodent studies are still limited by the small number of available multisensory protocols. Here we demonstrate the feasibility of an audio-visual stimulus detection task for rats, in which the animals detect lateralized uni- and multisensory stimuli in a two-response forced-choice paradigm. We show that animals reliably learn and perform this task. Reaction times were significantly faster and behavioral performance levels higher in multisensory compared with unisensory conditions. This benefit was strongest for dim visual targets, in agreement with classical patterns of multisensory integration, and was specific to task-informative sounds, while uninformative sounds speeded reaction times with little cost to detection performance. Importantly, multisensory benefits for stimulus detection and reaction times appeared at different levels of task proficiency and training experience, suggesting distinct mechanisms underlying these two multisensory benefits. Our results demonstrate behavioral multisensory enhancement in rats, in analogy to behavioral patterns known from other species such as humans. In addition, our paradigm enriches the set of behavioral tasks on which future studies can rely, for example to combine behavioral measurements with imaging or pharmacological studies in the behaving animal, or to study changes in integration properties in disease models.
Sensory dominance and multisensory integration as screening tools in aging
Multisensory information typically confers neural and behavioural advantages over unisensory information. We used a simple audio-visual detection task to compare healthy young (HY), healthy older (HO) and mild cognitive impairment (MCI) individuals. Neuropsychological tests assessed individuals' learning and memory impairments. First, we provide much-needed clarification regarding the presence of enhanced multisensory benefits in both healthily and abnormally aging individuals. The pattern of sensory dominance shifted with healthy and abnormal aging towards a propensity for auditory-dominant behaviour (i.e., detecting sounds faster than flashes). Notably, multisensory benefits were larger only in those healthy older individuals who, unlike the younger group, were visually dominant. Second, we demonstrate that the multisensory detection task offers benefits as a time- and resource-economic MCI screening tool. Receiver operating characteristic (ROC) analysis demonstrated that MCI diagnosis could be reliably achieved based on the combination of indices of multisensory integration together with indices of sensory dominance. Our findings showcase the importance of sensory profiles in determining multisensory benefits in healthy and abnormal aging. Crucially, they open an exciting possibility for multisensory detection tasks to be used as a cost-effective screening tool. These findings clarify relationships between multisensory and memory functions in aging, while offering new avenues for improved dementia diagnostics.
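The ROC analysis described here can be sketched with the rank-based (Mann-Whitney) estimator of the area under the curve, which measures how well a screening index separates two groups. The index values below are hypothetical stand-ins, not the study's data:

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive (e.g., MCI) score
    exceeds a randomly chosen negative (control) score, ties counted
    as half."""
    pos = np.asarray(scores_pos, dtype=float)
    neg = np.asarray(scores_neg, dtype=float)
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (pos.size * neg.size)

# Hypothetical composite screening index combining multisensory-benefit
# and sensory-dominance measures; higher values indicate MCI-like profiles.
mci      = [0.8, 0.9, 0.7, 0.85]
controls = [0.3, 0.4, 0.5, 0.35]
auc = roc_auc(mci, controls)
```

An AUC of 0.5 means the index carries no diagnostic information; values approaching 1.0 indicate the reliable group separation the abstract reports.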
Incidental learning in a multisensory environment across childhood
Multisensory information has been shown to modulate attention in infants and facilitate learning in adults, by enhancing the amodal properties of a stimulus. However, it remains unclear whether this translates to learning in a multisensory environment across middle childhood, and particularly in the case of incidental learning. One hundred and eighty-one children aged between 6 and 10 years participated in this study using a novel Multisensory Attention Learning Task (MALT). Participants were asked to respond to the presence of a target stimulus whilst ignoring distractors. Correct target selection resulted in the movement of the target exemplar to either the upper left or right screen quadrant, according to category membership. Category membership was defined either by visual-only, auditory-only or multisensory information. As early as 6 years of age, children demonstrated greater performance on the incidental categorization task following exposure to multisensory audiovisual cues compared to unisensory information. These findings provide important insight into the use of multisensory information in learning, and particularly on incidental category learning. Implications for the deployment of multisensory learning tasks within education across development will be discussed.
Incidental category learning and cognitive load in a multisensory environment across childhood
Broadbent, H.J., Osborne, T., Rea, M., Peng, A., Mareschal, D., and Kirkham, N.Z.
Multisensory information has been shown to facilitate learning (Bahrick & Lickliter, 2000; Broadbent, White, Mareschal, & Kirkham, 2017; Jordan & Baker, 2011; Shams & Seitz, 2008). However, although research has examined the modulating effect of unisensory and multisensory distractors on multisensory processing, the extent to which a concurrent unisensory or multisensory cognitive load task would interfere with or support multisensory learning remains unclear. This study examined the role of concurrent task modality on incidental category learning in 6- to 10-year-olds. Participants were engaged in a multisensory learning task whilst also performing either a unisensory (visual or auditory only) or multisensory (audiovisual) concurrent task (CT). We found that engaging in an auditory CT led to poorer performance on incidental category learning compared with an audiovisual or visual CT, across groups. In 6-year-olds, category test performance was at chance in the auditory-only CT condition, suggesting auditory concurrent tasks may interfere with learning in younger children, but the addition of visual information may serve to focus attention. These findings provide novel insight into the use of multisensory concurrent information on incidental learning. Implications for the deployment of multisensory learning tasks within education across development, and developmental changes in modality dominance and the ability to switch flexibly across modalities, are discussed.
Keywords: Multisensory Integration; Cognitive Development; Incidental Learning; Cognitive Load