    Neural Substrates of Reliability-Weighted Visual-Tactile Multisensory Integration

    As sensory systems deteriorate with aging or disease, the brain must relearn the appropriate weight to assign each modality during multisensory integration. Using blood-oxygen-level-dependent functional magnetic resonance imaging in human subjects, we tested a model of the neural mechanisms of sensory weighting, termed “weighted connections.” This model holds that the connection weights between early and late areas vary with the reliability of each modality, independent of the level of activity in early sensory cortex. When subjects detected viewed and felt touches to the hand, a network of brain areas was active, including visual areas in lateral occipital cortex, somatosensory areas in inferior parietal lobe, and multisensory areas in the intraparietal sulcus (IPS). In agreement with the weighted-connection model, the connection weight between somatosensory cortex and IPS, measured with structural equation modeling, increased for somatosensory-reliable stimuli, and the connection weight between visual cortex and IPS increased for visual-reliable stimuli. This double dissociation of connection strengths mirrored the pattern of behavioral responses during incongruent multisensory stimulation, suggesting that weighted connections may be a neural mechanism for behavioral reliability weighting.
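
    Behaviorally, reliability weighting is usually formalized as maximum-likelihood cue combination, in which each modality's weight is its inverse variance normalized by the total. The sketch below is a minimal illustration of that standard rule under Gaussian noise assumptions; the function name and example values are illustrative, not taken from the study.

```python
import numpy as np

def integrate_cues(mu_v, sigma_v, mu_t, sigma_t):
    """Reliability-weighted (inverse-variance) fusion of a visual and a
    tactile estimate: the standard maximum-likelihood cue-combination model.
    The less noisy modality receives the larger weight, and the fused
    estimate is more precise than either cue alone."""
    r_v, r_t = 1.0 / sigma_v**2, 1.0 / sigma_t**2  # reliabilities
    w_v = r_v / (r_v + r_t)                        # visual weight
    mu = w_v * mu_v + (1.0 - w_v) * mu_t           # fused estimate
    sigma = np.sqrt(1.0 / (r_v + r_t))             # fused uncertainty
    return mu, sigma

# A reliable visual cue (sigma = 1) dominates a noisy tactile cue (sigma = 3):
mu, sigma = integrate_cues(mu_v=0.0, sigma_v=1.0, mu_t=2.0, sigma_t=3.0)
print(mu, sigma)  # fused estimate 0.2, fused sigma ~0.95 (below either cue)
```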

    Backwards is the way forward: feedback in the cortical hierarchy predicts the expected future

    Clark offers a powerful description of the brain as a prediction machine, one that makes progress on two distinct levels. First, on an abstract conceptual level, it provides a unifying framework for perception, action, and cognition (including subdivisions such as attention, expectation, and imagination). Second, hierarchical prediction offers progress on a concrete descriptive level for testing and constraining the conceptual elements and mechanisms of predictive coding models (estimation of predictions, prediction errors, and internal models).
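
    The mechanics the commentary alludes to can be made concrete in a few lines: feedback carries a prediction, the feedforward pathway carries only the residual prediction error, and that error updates the internal model. The single-unit simplification below is an illustrative sketch, not a model from the target article.

```python
def predictive_coding_step(sensory_input, prediction, learning_rate=0.1):
    """One update of a minimal predictive-coding unit: compare the top-down
    prediction against the bottom-up input, and let only the residual error
    drive the change in the internal model."""
    prediction_error = sensory_input - prediction
    new_prediction = prediction + learning_rate * prediction_error
    return new_prediction, prediction_error

# With a stable input, the expected future is learned and the error vanishes:
pred = 0.0
for _ in range(100):
    pred, err = predictive_coding_step(sensory_input=1.0, prediction=pred)
print(round(pred, 3), round(err, 5))  # prediction ~1.0, error ~0
```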

    The role of the right temporoparietal junction in perceptual conflict: detection or resolution?

    The right temporoparietal junction (rTPJ) is a polysensory cortical area that plays a key role in perception and awareness. Neuroimaging evidence shows activation of rTPJ in intersensory and sensorimotor conflict situations, but it remains unclear whether this activity reflects the detection or the resolution of such conflicts. To address this question, we manipulated the relationship between touch and vision using the so-called mirror-box illusion. Participants' hands lay on either side of a mirror, which occluded their left hand and reflected their right hand, creating the illusion that they were looking directly at their left hand. The experimenter simultaneously touched either the middle finger (D3) or the ring finger (D4) of each hand. Participants judged which finger had been touched on their occluded left hand. The visual stimulus corresponding to the touch on the right hand was therefore either congruent (same finger as the touch) or incongruent (different finger from the touch) with the task-relevant touch on the left hand. Single-pulse transcranial magnetic stimulation (TMS) was delivered to the rTPJ immediately after the touch. Accuracy in localizing the left touch was worse for D4 than for D3, particularly when visual stimulation was incongruent. However, following TMS, accuracy improved selectively for D4 on incongruent trials, suggesting that the effect of the conflicting visual information was reduced. These findings suggest a role for rTPJ in detecting, rather than resolving, intersensory conflict.

    Sensory and cognitive factors in multi-digit touch, and its integration with vision

    Every tactile sensation – an itch, a kiss, a hug, a pen gripped between fingers, a soft fabric brushing against the skin – is experienced in relation to the body. Tactile sensations normally occur somewhere on the body’s surface – they have spatiality. This spatiality is what allows us to perceive a partner’s caress in terms of its changing location on the skin, its movement direction, speed, and extent. How this spatiality arises and how it is experienced is a thriving research topic, driven by growing interest in the nature of tactile experience in areas from product design to brain-machine interfaces. The present thesis adds to this flourishing area of research by examining the unified spatial quality of touch: how does distinct spatial information converge from separate areas of the body surface to give rise to our normally unified experience of touch? After explaining the importance of this question in Chapter 1, a novel paradigm for tackling this problem will be presented, whereby participants are asked to estimate the average direction of two stimuli that are simultaneously moved across two different fingerpads. This paradigm is a laboratory analogue of the more ecological task of representing the overall movement of an object held between multiple fingers. An EEG study in Chapter 2 will reveal a brain mechanism that could facilitate such aggregated perception. Next, by characterising participants’ performance not just in terms of error rates, but in terms of perceptual sensitivity, bias, precision, and signal weighting, a series of psychophysical experiments will show that this aggregation ability differs for within- and between-hand perception (Chapter 3), is independent of somatotopically defined circuitry (Chapter 4), and arises after proprioceptive input about hand posture has been accounted for (Chapter 5). Finally, inspired by the demand for integrated tactile and visual experience in virtual reality and the potential of tactile interfaces to aid navigation, Chapter 6 will examine the contribution of tactile spatiality to visual spatial experience. Ultimately, the present thesis will reveal sensory factors that limit precise representation of concurrently occurring dynamic tactile events, and will point to cognitive strategies the brain may employ to overcome those limitations and tactually perceive coherent objects. As such, this thesis advances somatosensory research beyond merely examining selectivity to, and discrimination between, experienced tactile inputs, to considering the unified experience of touch despite distinct stimulus elements. The findings also have practical implications for the design of functional tactile interfaces.
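
    The averaging task described above has a natural ideal-observer form: a weighted circular mean of the two motion directions, with the signal-weighting analyses asking how far each finger's weight departs from equality. The sketch below illustrates that computation only; names and values are not taken from the thesis.

```python
import numpy as np

def average_direction(theta1, theta2, w1=0.5):
    """Weighted circular mean of two motion directions (radians). Directions
    are combined as unit vectors so that, e.g., 350 and 10 degrees average
    to 0 degrees rather than 180."""
    x = w1 * np.cos(theta1) + (1.0 - w1) * np.cos(theta2)
    y = w1 * np.sin(theta1) + (1.0 - w1) * np.sin(theta2)
    return np.arctan2(y, x)

# Two fingerpad stimuli moving at 30 and 90 degrees, weighted equally:
print(np.degrees(average_direction(np.radians(30), np.radians(90))))  # 60.0
# Overweighting the first finger biases the perceived average toward it:
print(np.degrees(average_direction(np.radians(30), np.radians(90), w1=0.8)))
```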

    Mechanisms of interpersonal sway synchrony and stability

    Here we explain the neural and mechanical mechanisms responsible for synchronizing sway and improving postural control during physical contact with another standing person. Postural control processes were modelled using an inverted pendulum under continuous feedback control. Interpersonal interactions were simulated either by coupling the sensory feedback loops or by physically coupling the pendulums with a damped spring. These simulations precisely recreated the timing and magnitude of the sway interactions observed empirically. The effects of firmly grasping another person's shoulder were explained entirely by the mechanical linkage. This contrasted with light touch and/or visual contact, which were explained by a sensory weighting phenomenon: each person's estimate of upright was based on a weighted combination of veridical sensory feedback and a small contribution from their partner. Under these circumstances, the model predicted reductions in sway even without the need to distinguish between self and partner motion. Our findings explain the seemingly paradoxical observation that touching a swaying person can improve postural control. This work was supported by two BBSRC grants (BB/100579X/1 and an Industry Interchange Award).
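
    A minimal version of the sensory-coupling simulation can be sketched as two linearised inverted pendulums under proportional-derivative feedback, where light touch lets a small fraction of the partner's motion leak into each person's estimate of their own sway. All parameters and the exact coupling form below are illustrative assumptions, not the fitted values or code from the paper.

```python
import numpy as np

G_OVER_L = 9.81      # gravity over pendulum length (1 m)
KP, KD = 15.0, 5.0   # proportional and derivative feedback gains (illustrative)
W_PARTNER = 0.1      # small contribution of partner motion to the sway estimate
NOISE_SD = 0.5       # intensity of motor/sensory noise
DT, T = 0.001, 60.0  # Euler-Maruyama step and simulated duration (s)

rng = np.random.default_rng(0)
theta = np.zeros(2)  # sway angles of the two "people" (rad)
omega = np.zeros(2)  # angular velocities (rad/s)
trace = np.empty((int(T / DT), 2))

for i in range(trace.shape[0]):
    # Light touch senses motion relative to the partner, so each person's
    # sway estimate mixes veridical self-motion with the partner's motion.
    est_theta = theta - W_PARTNER * theta[::-1]
    est_omega = omega - W_PARTNER * omega[::-1]
    torque = -KP * est_theta - KD * est_omega  # PD postural feedback
    alpha = G_OVER_L * theta + torque          # linearised pendulum dynamics
    omega += alpha * DT + NOISE_SD * np.sqrt(DT) * rng.normal(size=2)
    theta += omega * DT
    trace[i] = theta

# The coupling damps relative sway more strongly than common sway, so the
# two angle traces become positively correlated (sway synchrony).
print(np.corrcoef(trace[:, 0], trace[:, 1])[0, 1])
```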

    Bodily awareness and novel multisensory features

    According to the decomposition thesis, perceptual experiences resolve without remainder into their different modality-specific components. Contrary to this view, I argue that certain cases of multisensory integration give rise to experiences representing features of a novel type. Through the coordinated use of bodily awareness—understood here as encompassing both proprioception and kinaesthesis—and the exteroceptive sensory modalities, one becomes perceptually responsive to spatial features whose instances couldn’t be represented by any of the contributing modalities functioning in isolation. I develop an argument for this conclusion focusing on two cases: 3D shape perception in haptic touch and experiencing an object’s egocentric location in crossmodally accessible, environmental space.

    Causal inference in multisensory perception and the brain

    To build coherent and veridical multisensory representations of the environment, human observers take into account the causal structure of multisensory signals: if they infer a common source of the signals, observers integrate them weighted by their reliability; otherwise, they segregate the signals. In general, observers infer a common source when the signals correspond structurally and spatiotemporally. Across six projects, this PhD thesis investigated the causal inference model using audiovisual spatial signals presented to human observers in a ventriloquist paradigm. A first psychophysical study showed that sensory reliability determines causal inference via two mechanisms: sensory reliability modulates how observers infer the causal structure from spatial signal disparity, and it determines the weighting of the audiovisual signals when observers integrate them under the assumption of a common source. Using multivariate decoding of fMRI signals, three projects revealed that the auditory and visual cortical hierarchies jointly implement causal inference, with specific regions of the hierarchies representing the constituent spatial estimates of the causal inference model. In line with this model, anterior regions of the intraparietal sulcus (IPS) represented audiovisual signals dependent on visual reliability, task-relevance, and the spatial disparity of the signals. However, even for small signal discrepancies suggesting a common source, reliability weighting in IPS was suboptimal compared with a maximum likelihood estimation model. By manipulating visual reliability over time, the fifth project demonstrated that human observers learn sensory reliability from current and past signals in order to weight audiovisual signals, consistent with a Bayesian learner. Finally, the sixth project showed that when visual flashes were rendered unaware by continuous flash suppression, the visual bias of the perceived auditory location was strongly reduced but still significant; the reduced ventriloquist effect was presumably mediated by the drop in visual reliability accompanying perceptual unawareness. In conclusion, this thesis suggests that human observers integrate multisensory signals according to their causal structure and temporal regularity: they integrate the signals when a common source is likely, weighting them in proportion to the reliability learnt from the signals’ history. Crucially, specific regions of the cortical hierarchies jointly implement these multisensory processes.
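
    The model in question follows the formulation of Körding et al. (2007): compute the posterior probability that the auditory and visual signals share a single source, then average the fused and segregated location estimates under that posterior. The sketch below is a minimal illustrative implementation of that published model (Gaussian spatial prior, model averaging), not the thesis's analysis code.

```python
import numpy as np

def causal_inference_estimate(x_a, x_v, sigma_a, sigma_v,
                              sigma_p=10.0, p_common=0.5):
    """Bayesian causal inference for the ventriloquist paradigm.
    Returns the model-averaged auditory location estimate."""
    va, vv, vp = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood of the signal pair under one common source (C = 1)
    # versus two independent sources (C = 2), with a zero-mean spatial prior.
    var1 = va * vv + va * vp + vv * vp
    like_c1 = np.exp(-0.5 * ((x_a - x_v)**2 * vp + x_a**2 * vv
                             + x_v**2 * va) / var1) / (2 * np.pi * np.sqrt(var1))
    like_c2 = np.exp(-0.5 * (x_a**2 / (va + vp) + x_v**2 / (vv + vp))) \
        / (2 * np.pi * np.sqrt((va + vp) * (vv + vp)))
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Auditory estimate under each causal structure (reliability-weighted).
    s_fused = (x_a / va + x_v / vv) / (1 / va + 1 / vv + 1 / vp)
    s_segregated = (x_a / va) / (1 / va + 1 / vp)
    return post_c1 * s_fused + (1 - post_c1) * s_segregated

# Small audiovisual disparity -> strong ventriloquist bias toward vision;
# large disparity -> the signals are inferred to be segregated.
print(causal_inference_estimate(x_a=5.0, x_v=3.0, sigma_a=4.0, sigma_v=1.0))
print(causal_inference_estimate(x_a=5.0, x_v=-20.0, sigma_a=4.0, sigma_v=1.0))
```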

    Embodied Precision: Intranasal Oxytocin Modulates Multisensory Integration

    Multisensory integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size-weight illusion (SWI), allow us to investigate how the brain resolves conflicting multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory body ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting multisensory input. Recent hypotheses propose that the precision of sensorimotor representations is determined by modulators of synaptic gain, such as dopamine, acetylcholine, and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied multisensory integration. The present double-blind, placebo-controlled, crossover study (N = 41 healthy volunteers) investigated the effect of intranasal oxytocin (IN-OT) on multisensory integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, but only when tactile stimulation was synchronous. Furthermore, IN-OT increased an embodied version of the SWI (quantified as estimation error during a weight estimation task). These findings suggest that oxytocin may modulate visuotactile multisensory integration by increasing the precision of top-down signals relative to bottom-up sensory input.
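
    The precision account invoked here can be made concrete with a two-cue example: perceived hand position is a precision-weighted average of visual and proprioceptive estimates, so attenuating somatosensory precision pulls the percept toward the seen (rubber) hand. The numbers below are purely illustrative.

```python
def perceived_hand_position(x_vis, x_prop, pi_vis, pi_prop):
    """Precision-weighted fusion of visual and proprioceptive hand-position
    cues; precision (pi, inverse variance) plays the role of synaptic gain."""
    return (pi_vis * x_vis + pi_prop * x_prop) / (pi_vis + pi_prop)

# Rubber hand seen at 0 cm, real hand felt at 20 cm, equal precisions:
print(perceived_hand_position(0.0, 20.0, pi_vis=1.0, pi_prop=1.0))   # 10.0 cm
# Attenuating somatosensory precision shifts the percept toward the rubber hand:
print(perceived_hand_position(0.0, 20.0, pi_vis=1.0, pi_prop=0.25))  # 4.0 cm
```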