
    Multisensory causal inference in the brain

    At any given moment, our brain processes multiple inputs from its different sensory modalities (vision, hearing, touch, etc.). In deciphering this array of sensory information, the brain has to solve two problems: (1) which of the inputs originate from the same object and should be integrated, and (2) for the sensations originating from the same object, how best to integrate them. Recent behavioural studies suggest that the human brain solves these problems using optimal probabilistic inference, known as Bayesian causal inference. However, how and where the underlying computations are carried out in the brain has remained unknown. By combining neuroimaging-based decoding techniques with computational modelling of behavioural data, a new study now sheds light on how multisensory causal inference maps onto specific brain areas. The results suggest that the complexity of neural computations increases along the visual hierarchy, and they link individual components of the causal-inference process to specific visual and parietal regions.
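
    For concreteness, the standard Bayesian causal-inference model for two Gaussian cues (Körding et al., 2007) has a closed form: the observer computes the posterior probability that both measurements share a single cause and averages the location estimates of the two causal structures accordingly. Below is a minimal Python sketch of that computation; the noise and prior parameters (and the zero prior mean) are illustrative assumptions, not values from the study described above.

        import numpy as np

        def causal_inference(x_v, x_a, sigma_v=1.0, sigma_a=2.0,
                             sigma_p=10.0, p_common=0.5):
            """Posterior probability that visual (x_v) and auditory (x_a)
            measurements share one cause, plus the model-averaged estimate
            of the visual source location. Prior over locations: N(0, sigma_p^2)."""
            var_v, var_a, var_p = sigma_v**2, sigma_a**2, sigma_p**2

            # Likelihood of the cue pair under a common cause (C = 1):
            # both measurements arise from one source drawn from the prior.
            var_c1 = var_v * var_a + var_v * var_p + var_a * var_p
            like_c1 = (np.exp(-0.5 * ((x_v - x_a)**2 * var_p
                                      + x_v**2 * var_a + x_a**2 * var_v) / var_c1)
                       / (2 * np.pi * np.sqrt(var_c1)))

            # Likelihood under independent causes (C = 2).
            like_c2 = (np.exp(-0.5 * (x_v**2 / (var_v + var_p)
                                      + x_a**2 / (var_a + var_p)))
                       / (2 * np.pi * np.sqrt((var_v + var_p) * (var_a + var_p))))

            # Posterior over causal structure (Bayes' rule).
            post_c1 = (like_c1 * p_common
                       / (like_c1 * p_common + like_c2 * (1 - p_common)))

            # Optimal location estimates under each structure ...
            s_hat_c1 = (x_v / var_v + x_a / var_a) / (1/var_v + 1/var_a + 1/var_p)
            s_hat_c2 = (x_v / var_v) / (1/var_v + 1/var_p)

            # ... combined by model averaging.
            return post_c1, post_c1 * s_hat_c1 + (1 - post_c1) * s_hat_c2

        # Example: nearby cues yield a high common-cause posterior.
        post, est = causal_inference(x_v=2.0, x_a=3.0)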

    Interaction of perceptual grouping and crossmodal temporal capture in tactile apparent-motion

    Previous studies have shown that in tasks requiring participants to report the direction of apparent motion, task-irrelevant mono-beeps can "capture" visual motion perception when the beeps occur temporally close to the visual stimuli. However, the contributions of the relative timing of multimodal events and of the event structure, which modulates uni- and/or crossmodal perceptual grouping, remain unclear. To examine this question and extend the investigation to the tactile modality, the current experiments presented tactile two-tap apparent-motion streams, with an SOA of 400 ms between successive left-/right-hand middle-finger taps, accompanied by task-irrelevant, non-spatial auditory stimuli. The streams were presented for 90 s, and participants' task was to continuously report the perceived (left- or rightward) direction of tactile motion. In Experiment 1, each tactile stimulus was paired with an auditory beep, but odd-numbered taps were paired with an asynchronous beep, at audiotactile SOAs ranging from -75 ms to 75 ms. The perceived direction of tactile motion varied systematically with audiotactile SOA, indicative of a temporal-capture effect. In Experiment 2, two audiotactile SOAs, one short (75 ms) and one long (325 ms), were compared. The long-SOA condition preserved the crossmodal event structure (so the temporal-capture dynamics should have been similar to those in Experiment 1), but both beeps now occurred temporally close to the taps on one side (the even-numbered taps). The two SOAs were found to produce opposite modulations of apparent motion, indicative of an influence of crossmodal grouping. In Experiment 3, only odd-numbered, but not even-numbered, taps were paired with auditory beeps. This abolished the temporal-capture effect; instead, a dominant percept of apparent motion from the audiotactile side to the tactile-only side was observed, independently of the SOA variation. These findings suggest that asymmetric crossmodal grouping leads to an attentional modulation of apparent motion, which inhibits crossmodal temporal-capture effects.
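
    The pairing scheme is easiest to see laid out as a timeline. The following Python sketch is a hypothetical reconstruction of the Experiment 1 stimulus schedule using only the parameters stated in the abstract; the function name and the number of taps are illustrative choices.

        def build_timeline(n_taps=8, tap_soa=400, at_soa=75):
            """Return (tap events, beep times) in milliseconds.

            Taps alternate between the left and right hands every `tap_soa` ms.
            Every tap is paired with a beep; odd-numbered taps (1st, 3rd, ...)
            get theirs shifted by `at_soa` ms, even-numbered taps get a
            synchronous beep.
            """
            taps, beeps = [], []
            for i in range(n_taps):
                t = i * tap_soa
                side = 'left' if i % 2 == 0 else 'right'
                taps.append((t, side))
                shift = at_soa if i % 2 == 0 else 0  # shift odd-numbered taps only
                beeps.append(t + shift)
            return taps, beeps

        # Negative audiotactile SOA: the beep leads the tap.
        taps, beeps = build_timeline(at_soa=-75)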

    Who is that? Brain networks and mechanisms for identifying individuals

    Social animals can identify conspecifics from many forms of sensory input. However, whether the neuronal computations that support this ability to identify individuals rely on modality-independent convergence or involve ongoing synergistic interactions along the multiple sensory streams remains controversial. Direct neuronal measurements at relevant brain sites could address such questions, but this requires better bridging of the work in humans and animal models. Here, we review recent studies in nonhuman primates on voice and face identity-sensitive pathways and evaluate their correspondence to relevant findings in humans. This synthesis provides insights into converging sensory streams in the primate anterior temporal lobe (ATL) for identity processing. Furthermore, we advance a model and suggest how alternative neuronal mechanisms could be tested.

    Cortical region interactions and the functional role of apical dendrites

    The basal and distal apical dendrites of pyramidal cells occupy distinct cortical layers and are targeted by axons originating in different cortical regions. Hence, apical and basal dendrites receive information from distinct sources. Physiological evidence suggests that this anatomically observed segregation of input sources may have functional significance. This possibility has been explored in various connectionist models that employ neurons with functionally distinct apical and basal compartments. A neuron in which separate sets of inputs can be integrated independently has the potential to operate in a variety of ways that are not possible for the conventional model of a neuron, in which all inputs are treated equally. This article thus considers how functionally distinct apical and basal dendrites can contribute to the information-processing capacities of single neurons and, in particular, how information from different cortical regions could have disparate effects on neural activity and learning.
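
    As a minimal sketch of the kind of unit such connectionist models employ: in the Python fragment below, the basal compartment supplies the driving feedforward input while the apical compartment multiplicatively amplifies the response. This "apical amplification" scheme is only one candidate operating mode discussed in this literature, and the gain constant and names are illustrative assumptions rather than any specific published model.

        import numpy as np

        def two_compartment_unit(basal_x, basal_w, apical_x, apical_w, k=2.0):
            """A unit that integrates its basal and apical inputs separately.

            The basal compartment supplies the driving (e.g. feedforward)
            input; the apical compartment supplies contextual (e.g. feedback)
            input that can amplify, but not by itself evoke, a response.
            """
            basal = float(np.dot(basal_w, basal_x))      # integrate basal inputs
            apical = float(np.dot(apical_w, apical_x))   # integrate apical inputs
            gain = 1.0 + k * max(apical, 0.0)            # apical input sets the gain
            return max(basal, 0.0) * gain                # no basal drive, no output

        # A conventional unit would instead compute np.dot(w, x) over all
        # inputs pooled together, treating both sources equally.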