1,027 research outputs found

    Characterizing response behavior in multisensory perception with conflicting cues

    We explore a recently proposed mixture model approach to understanding interactions between conflicting sensory cues. Alternative model formulations, differing in their sensory noise models and inference methods, are compared based on their fit to experimental data. Heavy-tailed sensory likelihoods yield a better description of the subjects' response behavior than standard Gaussian noise models. We study the underlying cause of this result, and then present several testable predictions of these models.
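    The mixture-model comparison above lends itself to a small simulation. The sketch below (our illustration, not the authors' code; all parameter values are assumptions) fuses two conflicting cues under a flat prior, once with Gaussian likelihoods and once with heavy-tailed Student-t likelihoods, and shows why the heavy-tailed model predicts qualitatively different responses at large cue conflicts.

```python
# Hedged sketch: Gaussian vs. heavy-tailed (Student-t) sensory likelihoods
# when fusing two conflicting cues. Parameter values are illustrative.
import numpy as np
from scipy import stats

x = np.linspace(-20, 20, 4001)   # candidate source locations (deg)
cue_a, cue_b = -6.0, 6.0         # conflicting cue measurements (large conflict)
sigma = 2.0                      # sensory noise scale
nu = 3.0                         # Student-t degrees of freedom (heavy tails)

def fused_posterior(likelihood):
    """Posterior over source location: flat prior, independent cues."""
    post = likelihood(cue_a) * likelihood(cue_b)
    return post / post.sum()

gauss = fused_posterior(lambda c: stats.norm.pdf(c, loc=x, scale=sigma))
heavy = fused_posterior(lambda c: stats.t.pdf(c, df=nu, loc=x, scale=sigma))

# Gaussian noise -> unimodal posterior midway between the cues (averaging).
# Heavy tails -> bimodal posterior near the individual cues, so large
# conflicts are partially discounted rather than averaged.
print("Gaussian MAP estimate:", x[np.argmax(gauss)])
peaks = x[1:-1][(heavy[1:-1] > heavy[:-2]) & (heavy[1:-1] > heavy[2:])]
print("heavy-tailed posterior modes:", peaks)
```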

    The effect of simultaneously presented words and auditory tones on visuomotor performance

    The experiment reported here used a variation of the spatial cueing task to examine the effects of unimodal and bimodal attention-orienting primes on target identification latencies and eye gaze movements. The primes were a nonspatial auditory tone and words known to drive attention consistent with the dominant writing and reading direction, as well as to introduce a semantic, temporal bias (past–future) on the horizontal dimension. As expected, past-related (visual) word primes gave rise to shorter response latencies in the left hemifield and future-related words in the right. This congruency effect was differentiated by asymmetric performance in the right hemifield following future words, driven by the left-to-right trajectory of scanning habits that facilitated search times and eye gaze movements to lateralized targets. The auditory tone prime alone acted as an alarm signal, boosting visual search and reducing response latencies. Bimodal priming, i.e., temporal visual words paired with the auditory tone, impaired performance by delaying visual attention and response times relative to the unimodal visual word condition. We conclude that bimodal primes were no more effective in capturing participants' spatial attention than the unimodal auditory and visual primes. Their contribution to the literature on multisensory integration is discussed.

    Multisensory Integration in Self Motion Perception

    Self-motion perception involves the integration of visual, vestibular, somatosensory, and motor signals. This article reviews findings from single-unit electrophysiology, functional and structural magnetic resonance imaging, and psychophysics to present an update on how the human and non-human primate brain integrates multisensory information to estimate one's position and motion in space. The results indicate that there is a network of regions in the non-human primate and human brain that processes self-motion cues from the different sense modalities.

    Multisensory Bayesian inference depends on synapse maturation during training: Theoretical analysis and neural modeling implementation

    Recent theoretical and experimental studies suggest that in multisensory conditions, the brain performs a near-optimal Bayesian estimate of external events, giving more weight to the more reliable stimuli. However, the neural mechanisms responsible for this behavior, and its progressive maturation in a multisensory environment, are still insufficiently understood. The aim of this letter is to analyze this problem with a neural network model of audiovisual integration, based on probabilistic population coding: the idea that a population of neurons can encode probability functions to perform Bayesian inference. The model consists of two topologically organized chains of unisensory neurons (auditory and visual). They receive the corresponding input through a plastic receptive field and reciprocally exchange plastic cross-modal synapses, which encode the spatial co-occurrence of visual-auditory inputs. A third chain of multisensory neurons performs a simple sum of auditory and visual excitations. The work includes a theoretical part and a computer simulation study. We show how a simple rule for synapse learning (consisting of Hebbian reinforcement and a decay term) can be used during training to shrink the receptive fields and encode the unisensory likelihood functions. Hence, after training, each unisensory area realizes a maximum likelihood estimate of stimulus position (auditory or visual). In cross-modal conditions, the same learning rule can encode information on prior probability into the cross-modal synapses. Computer simulations confirm the theoretical results and show that the proposed network can realize a maximum likelihood estimate of auditory (or visual) positions in unimodal conditions and a Bayesian estimate, with moderate deviations from optimality, in cross-modal conditions. Furthermore, the model explains the ventriloquism illusion and, looking at the activity in the multimodal neurons, explains the automatic reweighting of auditory and visual inputs on a trial-by-trial basis, according to the reliability of the individual cues.
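    As a concrete illustration of the learning rule described above, the following sketch (an illustrative reimplementation under assumed parameter values, not the authors' published model) trains a single unisensory unit with Hebbian reinforcement plus a decay term and shows the receptive field shrinking around the co-occurring inputs; the clipping step is a crude stabilizer standing in for the saturation mechanisms of full models.

```python
# Hedged sketch: Hebbian reinforcement + decay shrinking a receptive field.
# All parameter values are assumptions made for illustration.
import numpy as np

rng = np.random.default_rng(0)
pos = np.arange(61)                    # input channels (spatial positions)
center = 30                            # position where stimuli cluster
w = np.full(pos.size, 0.4)             # broad initial receptive field

eta, decay = 0.05, 0.02                # Hebbian gain and decay term

def population_input(stim, width=4.0):
    """Population-coded input: Gaussian bump of activity at the stimulus."""
    return np.exp(-0.5 * ((pos - stim) / width) ** 2)

print("initial half-width:", int(np.sum(w > w.max() / 2)))
for _ in range(2000):
    pre = population_input(center + rng.normal(0.0, 3.0))
    post = w @ pre                         # the unit's activation
    w += eta * post * pre - decay * w      # Hebbian reinforcement + decay
    w = np.clip(w, 0.0, 1.0)               # crude saturation keeps weights bounded

print("trained peak position:", int(np.argmax(w)))
print("trained half-width:", int(np.sum(w > w.max() / 2)))
```

    After training, the weights far from the co-occurring inputs have decayed toward zero while those near the cluster remain strong, so the receptive field's half-width is much narrower than the initial flat profile, which is the mechanism the letter uses to encode unisensory likelihood functions.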

    Behavioral, perceptual, and neural alterations in sensory and multisensory function in autism spectrum disorder.

    Although sensory processing challenges have been noted since the first clinical descriptions of autism, it took until the release of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) in 2013 for sensory problems to be included as part of the core symptoms of autism spectrum disorder (ASD) in the diagnostic profile. Because sensory information forms the building blocks for higher-order social and cognitive functions, we argue that sensory processing is not only an additional piece of the puzzle, but rather a critical cornerstone for characterizing and understanding ASD. In this review we discuss what is currently known about sensory processing in ASD, how sensory function fits within contemporary models of ASD, and what is understood about the differences in the underlying neural processing of sensory and social communication observed between individuals with and without ASD. In addition to highlighting the sensory features associated with ASD, we also emphasize the importance of multisensory processing in building perceptual and cognitive representations, and how deficits in multisensory integration may also be a core characteristic of ASD.

    Shaping the auditory peripersonal space with motor planning in immersive virtual reality

    Immersive audio technologies require personalized binaural synthesis through headphones to provide perceptually plausible virtual and augmented reality (VR/AR) simulations. We introduce, and apply for the first time in VR contexts, a quantitative measure called premotor reaction time (pmRT) for characterizing sonic interactions between humans and the technology through motor planning. In the proposed basic virtual acoustic scenario, listeners are asked to react to a virtual sound approaching from different directions and stopping at different distances within their peripersonal space (PPS). The PPS is highly sensitive to embodied and environmentally situated interactions, anticipating motor system activation in prompt preparation for action. Since immersive VR applications benefit from spatial interactions, modeling the PPS around the listener is crucial for revealing individual behaviors and performance. Our methodology, centered on the pmRT, provides a compact description and approximation of the spatiotemporal PPS processing and boundaries around the head, replicating several well-known neurophysiological phenomena related to PPS, such as auditory asymmetry, front/back calibration and confusion, and ellipsoidal action fields.
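    One common way to quantify a PPS boundary from reaction-time data (a generic approach from the PPS literature, not necessarily the authors' exact pipeline) is to fit a sigmoid to reaction time as a function of the sound's stopping distance and read the boundary off the inflection point. The sketch below does this on synthetic pmRT data; every numeric value is an assumption for illustration.

```python
# Hedged sketch: estimate a PPS boundary as the inflection point of a
# sigmoid fitted to pmRT vs. sound stopping distance. Synthetic data only.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, rt_near, rt_far, slope, boundary):
    """pmRT rises from rt_near to rt_far as the sound stops farther away;
    'boundary' (the inflection point) is the PPS boundary estimate."""
    return rt_near + (rt_far - rt_near) / (1.0 + np.exp(-slope * (d - boundary)))

rng = np.random.default_rng(1)
dist = np.repeat([0.2, 0.5, 1.0, 1.5, 2.0, 3.0], 20)        # metres
pmrt = sigmoid(dist, 0.35, 0.50, 4.0, 1.2) \
       + rng.normal(0.0, 0.02, dist.size)                   # synthetic pmRT (s)

p0 = [pmrt.min(), pmrt.max(), 1.0, float(np.median(dist))]  # initial guess
params, _ = curve_fit(sigmoid, dist, pmrt, p0=p0)
print(f"estimated PPS boundary: {params[3]:.2f} m")
```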

    Space and time in the human brain


    Multimodal Sensory Integration for Perception and Action in High Functioning Children with Autism Spectrum Disorder

    Movement disorders are among the earliest observed features of autism spectrum disorder (ASD), present in infancy. Yet we do not understand the neural basis for impaired goal-directed movements in this population. To reach for an object, it is necessary to perceive the state of the arm and the object using multiple sensory modalities (e.g., vision, proprioception), to integrate those sensations into a motor plan, to execute the plan, and to update the plan based on the sensory consequences of action. In this dissertation, I present three studies in which I recorded the hand paths of children with ASD and typically developing (TD) controls as they grasped the handle of a robotic device to control a cursor displayed on a video screen. First, participants performed discrete and continuous movements to capture targets. Cursor feedback was perturbed from the hand's actual position to introduce a visuo-spatial conflict between visual and proprioceptive feedback. Relative to controls, children with ASD made greater errors, consistent with deficits of sensorimotor adaptive and strategic compensations. Second, participants performed a two-interval forced-choice discrimination task in which they perceived two movements of the visual cursor and/or the robot handle and then indicated which of the two movements was more curved. Children with ASD were impaired in their ability to discriminate movement kinematics when provided visual and proprioceptive information simultaneously, suggesting deficits of visuo-proprioceptive integration. Finally, participants made goal-directed reaching movements against a load while undergoing simultaneous functional magnetic resonance imaging (fMRI). The load remained constant (predictable) within an initial block of trials and then varied randomly within four additional blocks. Children with ASD exhibited greater movement variability compared to controls under both constant and randomly varying loads. fMRI analysis identified marked differences in the extent and intensity of the neural activities supporting goal-directed reaching in children with ASD compared to TD children in both environmental conditions. Taken together, the three studies revealed deficits of multimodal sensory integration in children with ASD during the perception and execution of goal-directed movements, and showed that ASD-related motor performance deficits have a telltale neural signature, as revealed by functional MR imaging.
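    The visuo-proprioceptive discrimination result in the second study is typically evaluated against the standard maximum likelihood integration benchmark: if the two senses are fused optimally, the bimodal threshold should beat the best unimodal one. A minimal worked example with illustrative numbers (not data from the dissertation):

```python
# Hedged sketch: the textbook maximum likelihood integration prediction
# for visuo-proprioceptive fusion. Threshold values are illustrative.
import math

sigma_v, sigma_p = 2.0, 3.0                     # unimodal thresholds (a.u.)
w_v = sigma_p**2 / (sigma_v**2 + sigma_p**2)    # optimal weight on vision
sigma_vp = math.sqrt(sigma_v**2 * sigma_p**2 / (sigma_v**2 + sigma_p**2))

print(f"optimal visual weight: {w_v:.2f}")                          # 0.69
print(f"predicted bimodal threshold: {sigma_vp:.2f}")               # 1.66
print(f"best unimodal threshold:     {min(sigma_v, sigma_p):.2f}")  # 2.00
```

    Failure of bimodal performance to beat the best unimodal threshold, as reported here for children with ASD, is the usual behavioral signature of impaired integration.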