
    Single-trial multisensory memories affect later auditory and visual object discrimination.

    Multisensory memory traces established via single-trial exposures can impact subsequent visual object recognition. This impact appears to depend on the meaningfulness of the initial multisensory pairing, implying that multisensory exposures establish distinct object representations that are accessible during later unisensory processing. Multisensory contexts may be particularly effective in influencing auditory discrimination, given the purportedly inferior recognition memory in this sensory modality. Whether this effect generalizes, and whether it is equivalent when memory discrimination is performed in the visual vs. the auditory modality, was the focus of this study. First, we demonstrate that visual object discrimination is affected by the context of prior multisensory encounters, replicating and extending previous findings by controlling for the probability of multisensory contexts during initial as well as repeated object presentations. Second, we provide the first evidence that single-trial multisensory memories impact subsequent auditory object discrimination. Auditory object discrimination was enhanced when initial presentations entailed semantically congruent multisensory pairs and was impaired after semantically incongruent multisensory encounters, compared to sounds that had been encountered only in a unisensory manner. Third, the impact of single-trial multisensory memories upon unisensory object discrimination was greater when the task was performed in the auditory vs. the visual modality. Fourth, there was no evidence of a correlation between the effects of past multisensory experiences on visual and auditory processing, suggestive of largely independent object-processing mechanisms between modalities. We discuss these findings in terms of the conceptual short-term memory (CSTM) model and predictive coding. Our results suggest differential recruitment and modulation of conceptual memory networks according to the sensory task at hand.

    Acceptance and adoption of biofortified crops in low- and middle-income countries

    Biofortification of staple crops through conventional plant breeding, genetic engineering, or agronomic approaches is a promising strategy for increasing dietary nutrient density to improve human health. Successful implementation depends, among other factors, on the willingness of consumers and farmers to accept the newly bred crop varieties.

    Interaction of perceptual grouping and crossmodal temporal capture in tactile apparent-motion

    Previous studies have shown that in tasks requiring participants to report the direction of apparent motion, task-irrelevant mono-beeps can "capture" visual motion perception when the beeps occur temporally close to the visual stimuli. However, the contributions of the relative timing of multimodal events and of the event structure, which modulates uni- and/or crossmodal perceptual grouping, remain unclear. To examine this question and extend the investigation to the tactile modality, the current experiments presented tactile two-tap apparent-motion streams, with an SOA of 400 ms between successive left-/right-hand middle-finger taps, accompanied by task-irrelevant, non-spatial auditory stimuli. The streams were presented for 90 s, and participants' task was to continuously report the perceived (left- or rightward) direction of tactile motion. In Experiment 1, each tactile stimulus was paired with an auditory beep, though odd-numbered taps were paired with an asynchronous beep, with audiotactile SOAs ranging from -75 ms to 75 ms. The perceived direction of tactile motion varied systematically with audiotactile SOA, indicative of a temporal-capture effect. In Experiment 2, two audiotactile SOAs, one short (75 ms) and one long (325 ms), were compared. The long-SOA condition preserved the crossmodal event structure (so the temporal-capture dynamics should have been similar to those in Experiment 1), but both beeps now occurred temporally close to the taps on one side (even-numbered taps). The two SOAs were found to produce opposite modulations of apparent motion, indicative of an influence of crossmodal grouping. In Experiment 3, only odd-numbered, but not even-numbered, taps were paired with auditory beeps. This abolished the temporal-capture effect; instead, a dominant percept of apparent motion from the audiotactile side to the tactile-only side was observed, independently of the SOA variation. These findings suggest that asymmetric crossmodal grouping leads to an attentional modulation of apparent motion, which inhibits crossmodal temporal-capture effects.

    Multisensory causal inference in the brain

    At any given moment, our brain processes multiple inputs from its different sensory modalities (vision, hearing, touch, etc.). In deciphering this array of sensory information, the brain has to solve two problems: (1) which of the inputs originate from the same object and should be integrated, and (2) for the sensations originating from the same object, how best to integrate them. Recent behavioural studies suggest that the human brain solves these problems using optimal probabilistic inference, known as Bayesian causal inference. However, how and where the underlying computations are carried out in the brain has remained unknown. By combining neuroimaging-based decoding techniques and computational modelling of behavioural data, a new study now sheds light on how multisensory causal inference maps onto specific brain areas. The results suggest that the complexity of neural computations increases along the visual hierarchy and link specific components of the causal inference process with specific visual and parietal regions.
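    The two inference problems described above, deciding whether signals share a cause and then integrating them, have a standard Gaussian formulation. The following minimal Python sketch illustrates that general Bayesian causal inference model (it is not code from the study; the cue locations, noise widths, prior width, and prior probability of a common cause are all arbitrary assumptions for illustration). It computes the posterior probability that a visual and an auditory cue originate from a common source:

```python
import math

def gauss(x, mu, var):
    # Normal density at x with mean mu and variance var.
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def posterior_common_cause(x_v, x_a, sigma_v=1.0, sigma_a=2.0,
                           sigma_p=10.0, p_common=0.5):
    """Posterior probability that visual cue x_v and auditory cue x_a
    share a single cause. All parameter values are illustrative."""
    var_v, var_a, var_p = sigma_v ** 2, sigma_a ** 2, sigma_p ** 2
    # Likelihood under a common cause: the shared source location is
    # integrated out, which has a closed form for Gaussian cue noise
    # and a zero-centred Gaussian prior over source location.
    denom = var_v * var_a + var_v * var_p + var_a * var_p
    like_c1 = math.exp(-0.5 * ((x_v - x_a) ** 2 * var_p
                               + x_v ** 2 * var_a
                               + x_a ** 2 * var_v) / denom) \
              / (2 * math.pi * math.sqrt(denom))
    # Likelihood under independent causes: each cue is explained by its
    # own source, drawn separately from the same prior.
    like_c2 = gauss(x_v, 0.0, var_v + var_p) * gauss(x_a, 0.0, var_a + var_p)
    # Bayes' rule over the two causal structures.
    return p_common * like_c1 / (p_common * like_c1 + (1 - p_common) * like_c2)
```

    When the two cues agree (x_v close to x_a), the disparity term vanishes and the common-cause posterior is high, favouring integration; widely discrepant cues push the posterior toward independent causes, favouring segregation.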

    Biofortified yellow cassava and vitamin A status of Kenyan children: a randomized controlled trial.

    BACKGROUND: Whereas conventional white cassava roots are devoid of provitamin A, biofortified yellow varieties are naturally rich in β-carotene, the primary provitamin A carotenoid. OBJECTIVE: We assessed the effect of consuming yellow cassava on serum retinol concentration in Kenyan schoolchildren with marginal vitamin A status. DESIGN: We randomly allocated 342 children aged 5-13 y to receive daily, 6 d/wk for 18.5 wk, 1) white cassava and a placebo supplement (control group), 2) provitamin A-rich cassava (mean content: 1460 μg β-carotene/d) and a placebo supplement (yellow cassava group), or 3) white cassava and a β-carotene supplement (1053 μg/d; β-carotene supplement group). The primary outcome was serum retinol concentration; prespecified secondary outcomes were hemoglobin concentration and serum concentrations of β-carotene, retinol-binding protein, and prealbumin. Groups were compared using ANCOVA, adjusting for inflammation, baseline serum concentrations of retinol and β-carotene, and the stratified design. RESULTS: The baseline prevalence of serum retinol concentration <0.7 μmol/L and of inflammation was 27% and 24%, respectively. For children in the control, yellow cassava, and β-carotene supplement groups, the mean daily intake of cassava was 378, 371, and 378 g, respectively, and the total daily supply of provitamin A and vitamin A from diet and supplements was equivalent to 22, 220, and 175 μg retinol, respectively. Both yellow cassava and β-carotene supplementation increased serum retinol concentration by 0.04 μmol/L (95% CI: 0.00, 0.07 μmol/L); correspondingly, serum β-carotene concentration increased by 524% (448%, 608%) and 166% (134%, 202%), respectively. We found no effect on hemoglobin concentration or serum concentrations of retinol-binding protein and prealbumin. CONCLUSIONS: In our study population, consumption of yellow cassava led to modest gains in serum retinol concentration and a large increase in β-carotene concentration. It can be an efficacious new approach to improving vitamin A status. This study was registered with clinicaltrials.gov as NCT01614483.

    Neural Architecture of Hunger-Dependent Multisensory Decision Making in C. elegans

    Little is known about how animals integrate multiple sensory inputs in natural environments to balance avoidance of danger with approach to things of value. Furthermore, the mechanistic link between internal physiological state and threat-reward decision making remains poorly understood. Here we confronted C. elegans worms with the decision of whether to cross a hyperosmotic barrier, presenting the threat of desiccation, to reach a source of food odor. We identified a specific interneuron that controls this decision via top-down extrasynaptic aminergic potentiation of the primary osmosensory neurons, increasing their sensitivity to the barrier. We also establish that food deprivation increases the worm's willingness to cross the dangerous barrier by suppressing this pathway. These studies reveal a potentially general neural circuit architecture for internal-state control of threat-reward decision making.

    The COGs (context, object, and goals) in multisensory processing

    Our understanding of how perception operates in real-world environments has been substantially advanced by studying both multisensory processes and “top-down” control processes influencing sensory processing via activity from higher-order brain areas, such as attention, memory, and expectations. As the two topics have traditionally been studied separately, the mechanisms orchestrating real-world multisensory processing remain unclear. Past work has revealed that the observer’s goals gate the influence of many multisensory processes on brain and behavioural responses, whereas some other multisensory processes might occur independently of these goals. Consequently, other forms of top-down control beyond goal dependence are necessary to explain the full range of multisensory effects currently reported at the brain and cognitive levels. These forms of control include sensitivity to stimulus context as well as the detection of matches (or lack thereof) between a multisensory stimulus and categorical attributes of naturalistic objects (e.g. tools, animals). In this review we discuss and integrate the existing findings that demonstrate the importance of such goal-, object-, and context-based top-down control over multisensory processing. We then put forward a few principles emerging from this literature review with respect to the mechanisms underlying multisensory processing and discuss their possible broader implications.

    Intermodal attention affects the processing of the temporal alignment of audiovisual stimuli

    The temporal asynchrony between inputs to different sensory modalities has been shown to be a critical factor influencing the interaction between such inputs. We used scalp-recorded event-related potentials (ERPs) to investigate the effects of attention on the processing of audiovisual multisensory stimuli as the temporal asynchrony between the auditory and visual inputs varied across the audiovisual integration window (i.e., up to 125 ms). Randomized streams of unisensory auditory stimuli, unisensory visual stimuli, and audiovisual stimuli (consisting of the temporally proximal presentation of the visual and auditory stimulus components) were presented centrally while participants attended to either the auditory or the visual modality to detect occasional target stimuli in that modality. ERPs elicited by each of the contributing sensory modalities were extracted by signal-processing techniques from the combined ERP waveforms elicited by the multisensory stimuli. This was done for each of five different 50-ms subranges of stimulus onset asynchrony (SOA: e.g., V precedes A by 125-75 ms, by 75-25 ms, etc.). The extracted ERPs for the visual inputs of the multisensory stimuli were compared with one another and with the ERPs to the unisensory visual control stimuli, separately for when attention was directed to the visual or to the auditory modality. The results showed that the attention effect on the right-hemisphere visual P1 was largest when the auditory and visual stimuli were temporally aligned. In contrast, the N1 attention effect was smallest at this latency, suggesting that attention may play a role in the processing of the relative temporal alignment of the constituent parts of multisensory stimuli. At longer latencies, an occipital selection negativity for the attended versus unattended visual stimuli was also observed, but this effect did not vary as a function of SOA, suggesting that by that latency a stable representation of the auditory and visual stimulus components had been established.