
    Multisensory Congruency as a Mechanism for Attentional Control over Perceptual Selection

    The neural mechanisms underlying attentional selection of competing neural signals for awareness remain an unresolved issue. We studied attentional selection using perceptually ambiguous stimuli in a novel multisensory paradigm that combined competing auditory and competing visual stimuli. We demonstrate that the ability to select, and attentively hold, one of the competing alternatives in either sensory modality is greatly enhanced when there is a matching cross-modal stimulus. Intriguingly, this multimodal enhancement of attentional selection seems to require a conscious act of attention, as passively experiencing the multisensory stimuli did not enhance control over the stimulus. We also demonstrate that congruent auditory or tactile information, and combined auditory–tactile information, aids attentional control over competing visual stimuli and vice versa. Our data suggest a functional role for recently found neurons that combine voluntarily initiated attentional functions across sensory modalities. We argue that these units provide a mechanism for structuring multisensory inputs that are then used to selectively modulate early (unimodal) cortical processing, boosting the gain of task-relevant features for willful control over perceptual awareness.

    Attentional Modulation of Binocular Rivalry

    Ever since Wheatstone initiated the scientific study of binocular rivalry, it has been debated whether the phenomenon is under attentional control. In recent years, the issue of attentional modulation of binocular rivalry has seen a revival. Here we review the classical studies as well as recent advances in the study of attentional modulation of binocular rivalry. We show that (1) voluntary control over binocular rivalry is possible, yet limited, (2) both endogenous and exogenous attention influence perceptual dominance during rivalry, (3) diverting attention from rival displays does not arrest perceptual alternations, and (4) rival targets by themselves can also attract attention. From a theoretical perspective, we suggest that attention affects binocular rivalry by modulating the effective contrast of the images in competition. This contrast-enhancing effect of top-down attention is counteracted by a response-attenuating effect of neural adaptation at early levels of visual processing, which weakens the response to the dominant image. Moreover, we conclude that although the frontal and parietal brain areas involved in binocular rivalry and visual attention overlap, an adapting reciprocal-inhibition arrangement in early visual cortex is sufficient to trigger switches in perceptual dominance independently of higher-level “selection” mechanisms. Both of these processes are reciprocal and therefore self-balancing, with the consequence that complete attentional control over binocular rivalry can never be realized.
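
    The adapting reciprocal-inhibition arrangement described above can be sketched as a minimal two-population rate model: each population inhibits the other, while a slow adaptation variable weakens whichever population is currently dominant, producing alternations without any high-level selector. This is a generic illustration; the sigmoid and all parameter values are our assumptions, not taken from the review.

```python
import numpy as np

def simulate_rivalry(steps=3000, beta=2.0, phi=1.5, tau_r=10.0, tau_a=400.0):
    """Two populations inhibit each other (strength beta) while slow
    adaptation (strength phi, time constant tau_a) undermines the winner.
    All parameter values are illustrative, not fitted to data."""
    f = lambda x: 1.0 / (1.0 + np.exp(-8.0 * x))  # sigmoidal gain function
    r = np.array([0.6, 0.4])   # firing rates, slightly asymmetric start
    a = np.zeros(2)            # adaptation states
    dominant = []
    for _ in range(steps):
        drive = 1.0 - beta * r[::-1] - phi * a  # input minus cross-inhibition and adaptation
        r += (-r + f(drive)) / tau_r            # fast rate dynamics
        a += (-a + r) / tau_a                   # slow adaptation tracks the rate
        dominant.append(int(r[0] > r[1]))
    return np.array(dominant)

dom = simulate_rivalry()
switches = int(np.count_nonzero(np.diff(dom)))  # number of perceptual alternations
```

    Because adaptation slowly erodes the dominant population's advantage, the simulation alternates between the two percepts even though the dynamics are entirely local to the competing populations.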

    Reducing bias in auditory duration reproduction by integrating the reproduced signal

    Duration estimation is known to be far from veridical and to differ for sensory estimates and motor reproduction. To investigate how these differential estimates are integrated for estimating or reproducing a duration, and to examine sensorimotor biases in duration comparison and reproduction tasks, we compared estimation biases and variances among three different duration estimation tasks: perceptual comparison, motor reproduction, and auditory reproduction (i.e., a combined perceptual–motor task). We found consistent overestimation in both the motor and the perceptual–motor auditory reproduction tasks, and the least overestimation in the comparison task. More interestingly, compared to pure motor reproduction, the overestimation bias was reduced in the auditory reproduction task, owing to the additional reproduced auditory signal. We further manipulated the signal-to-noise ratio (SNR) of the feedback/comparison tones to examine the changes in estimation biases and variances. Considering perceptual and motor biases as two independent components, we applied the reliability-based model, which successfully predicted the biases in auditory reproduction. Our findings thus provide behavioral evidence of how the brain combines motor and perceptual information to reduce duration estimation biases and improve estimation reliability.
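
    A reliability-based model of this kind typically fuses the two estimates by inverse-variance weighting, the standard maximum-likelihood cue-combination rule. The sketch below shows that rule; the numbers and variable names are ours, not the paper's.

```python
def combine_estimates(mu_motor, var_motor, mu_percept, var_percept):
    """Reliability-weighted (inverse-variance) fusion of a motor and a
    perceptual duration estimate. Illustrative sketch of the standard
    MLE cue-combination rule; values below are made up."""
    w = (1.0 / var_motor) / (1.0 / var_motor + 1.0 / var_percept)
    mu = w * mu_motor + (1.0 - w) * mu_percept          # weighted mean
    var = 1.0 / (1.0 / var_motor + 1.0 / var_percept)   # fused variance
    return mu, var

# A biased motor estimate (1.2 s, noisy) fused with a perceptual one (1.0 s, reliable):
mu, var = combine_estimates(1.2, 0.04, 1.0, 0.01)
```

    The fused estimate is pulled toward the more reliable cue, and its variance is smaller than either input's, which is how adding the reproduced auditory signal can both shrink the motor bias and improve precision.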

    Combining eye and hand in search is suboptimal

    When performing everyday tasks, we often move our eyes and hand together: we look where we are reaching in order to better guide the hand. This coordinated pattern, with the eye leading the hand, is presumably optimal behaviour. But eyes and hands can move to different locations if they are involved in different tasks. To find out whether this leads to optimal performance, we studied the combination of visual and haptic search. We asked ten participants to perform a combined visual and haptic search for a target that was present in both modalities and compared their search times to those on visual-only and haptic-only search tasks. Without distractors, search times were faster for visual search than for haptic search. With many visual distractors, search times were longer for visual than for haptic search. For the combined search, performance was poorer than the optimal strategy whereby each modality searches a different part of the display. The results are consistent with several alternative accounts, for instance with vision and touch searching independently at the same time.
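
    One of the alternative accounts mentioned, vision and touch searching independently in parallel, is naturally modelled as a race: the trial ends when the first modality finds the target. The Monte Carlo sketch below assumes exponential finishing-time distributions purely for illustration; the means and names are ours.

```python
import random

def race_mean(mean_visual, mean_haptic, n=200000, seed=1):
    """Monte Carlo mean finishing time when two modalities search in
    parallel and the faster one ends the trial. Exponential finishing
    times are an illustrative assumption, not the paper's model."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += min(rng.expovariate(1.0 / mean_visual),
                     rng.expovariate(1.0 / mean_haptic))
    return total / n

combined = race_mean(2.0, 3.0)  # for exponentials, analytically 1/(1/2 + 1/3) = 1.2
```

    A race of independent searchers always beats either modality alone, but it still falls short of the division-of-labour optimum in which each modality covers a disjoint part of the display.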

    Eye rivalry and object rivalry in the intact and split-brain

    Both the eye of origin and the images themselves have been found to rival during binocular rivalry. We presented traditional binocular rivalry stimuli (face to one eye, house to the other) and Diaz-Caneja stimuli (half of each image to each eye) centrally to both a split-brain participant and a control group. With traditional rivalry stimuli, both the split-brain participant and age-matched controls perceived coherent percepts (synchronised across the hemifields) more often than non-synchronous ones, but our split-brain participant perceived more non-synchrony than our controls. For rival stimuli in the Diaz-Caneja presentation condition, object rivalry gave way to eye rivalry, with all participants reporting more non-synchrony than coherent percepts. We have shown that splitting the stimuli across the hemifields between the eyes leads to greater eye than object rivalry, but that when traditional rival stimuli are split as the result of the severed corpus callosum, traditional rivalry persists, although to a lesser extent than in the intact brain. These results suggest that communication between the early visual areas is not essential for synchrony with traditional rivalry stimuli, and that other routes for interhemispheric interaction, such as subcortical connections, may mediate rivalry in a traditional binocular rivalry condition.

    Distortions of Subjective Time Perception Within and Across Senses

    Background: The ability to estimate the passage of time is of fundamental importance for perceptual and cognitive processes. One experience of time is the perception of duration, which is not isomorphic to physical duration and can be distorted by a number of factors. Yet, the critical features generating these perceptual shifts in subjective duration are not understood. Methodology/Findings: We used prospective duration judgments within and across sensory modalities to examine the effect of stimulus predictability and feature change on the perception of duration. First, we found robust distortions of perceived duration in auditory, visual and auditory-visual presentations despite the predictability of the feature changes in the stimuli. For example, a looming disc embedded in a series of steady discs led to time dilation, whereas a steady disc embedded in a series of looming discs led to time compression. Second, we addressed whether visual (auditory) inputs could alter the perception of duration of auditory (visual) inputs. When participants were presented with incongruent audio-visual stimuli, the perceived duration of auditory events could be shortened or lengthened by the presence of conflicting visual information; however, the perceived duration of visual events was seldom distorted by the presence of auditory information, and visual events were never perceived as shorter than their actual durations. Conclusions/Significance: These results support the existence of multisensory interactions in the perception of duration and, importantly, suggest that vision can modify auditory temporal perception in a pure timing task. Insofar as distortions in subjective duration cannot be accounted for by the unpredictability of an auditory, visual or auditory-visual event, we propose that it is the intrinsic features of the stimulus that critically affect subjective time distortions.

    Dynamics of temporally interleaved percept-choice sequences: interaction via adaptation in shared neural populations

    At the onset of visually ambiguous or conflicting stimuli, our visual system quickly ‘chooses’ one of the possible percepts. Interrupted presentation of the same stimuli has revealed that each percept-choice depends strongly on the history of previous choices and the duration of the interruptions. Recent psychophysics and modeling have discovered increasingly rich dynamical structure in such percept-choice sequences, and explained or predicted these patterns in terms of simple neural mechanisms: fast cross-inhibition and slow shunting adaptation that also causes a near-threshold facilitatory effect. However, we still lack a clear understanding of the dynamical interactions between two distinct, temporally interleaved, percept-choice sequences—a type of experiment that probes which feature-level neural network connectivity and dynamics allow the visual system to resolve the vast ambiguity of everyday vision. Here, we fill this gap. We first show that a simple column-structured neural network captures the known phenomenology, and then identify and analyze the crucial underlying mechanism via two stages of model reduction: a 6-population reduction shows how temporally well-separated sequences become coupled via adaptation in neurons that are shared between the populations driven by either of the two sequences. The essential dynamics can then be reduced further, to a set of iterated adaptation-maps. This enables detailed analysis, resulting in the prediction of phase diagrams of possible sequence-pair patterns and their response to perturbations. These predictions invite a variety of future experiments.
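
    The iterated-map idea can be illustrated with a deliberately minimal update rule: at each stimulus onset the less adapted percept wins; the winner's adaptation then charges during the presentation, and both adaptation states recover during the blank. This toy map uses our own parameter choices and omits the facilitatory term, so it captures only the adaptation-driven alternation regime, not the full phenomenology.

```python
import math

def iterate_choice_map(n_onsets=20, t_on=1.0, t_off=0.5, tau=2.0):
    """One map iteration per stimulus onset. The percept with the lower
    adaptation wins; adaptation charges toward 1 while dominant and
    decays otherwise. Parameters are illustrative, not fitted."""
    a = [0.2, 0.0]   # adaptation states of the two percepts
    choices = []
    for _ in range(n_onsets):
        c = 0 if a[0] < a[1] else 1          # less adapted percept wins
        choices.append(c)
        d_on = math.exp(-t_on / tau)
        a[c] = 1.0 + (a[c] - 1.0) * d_on     # winner adapts toward 1
        a[1 - c] *= d_on                     # suppressed percept recovers
        d_off = math.exp(-t_off / tau)
        a = [x * d_off for x in a]           # both recover during the blank
    return choices

choices = iterate_choice_map()
```

    With these settings the winner always ends each onset-blank cycle more adapted than the loser, so the map alternates on every onset; adding the near-threshold facilitation discussed in the abstract is what produces the richer repeat/alternate phase structure.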

    A search asymmetry for interocular conflict

    When two different images are presented to the two eyes, the percept will alternate between the images (a phenomenon called binocular rivalry). In the present study, we investigate the degree to which such interocular conflict is conspicuous. By using a visual search task, we show that search for interocular conflict is near-efficient (15 ms/item) and can lead to a search asymmetry, depending on the contrast in the display. We reconcile our findings with those of Wolfe and Franzel (1988), who reported inefficient search for interocular conflict (26 ms/item) and found no evidence for a search asymmetry. In addition, we provide evidence for the suggestion that differences in search for interocular conflict are contingent on the degree of abnormal fusion of the dissimilar images.
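
    The ms/item figures quoted above are search slopes: the least-squares slope of mean reaction time against display set size. A minimal computation is sketched below; the example RTs are made up to yield a 15 ms/item slope, and are not data from the study.

```python
def search_slope(set_sizes, rts_ms):
    """Least-squares slope of mean RT (ms) against set size — the
    standard ms/item search-efficiency measure."""
    n = len(set_sizes)
    mx = sum(set_sizes) / n
    my = sum(rts_ms) / n
    num = sum((x - mx) * (y - my) for x, y in zip(set_sizes, rts_ms))
    den = sum((x - mx) ** 2 for x in set_sizes)
    return num / den

# Hypothetical data: RT = 500 ms intercept + 15 ms per display item
slope = search_slope([4, 8, 12, 16], [560.0, 620.0, 680.0, 740.0])
```

    Slopes near 0 ms/item indicate parallel ("pop-out") search; slopes of 20-30 ms/item or more are conventionally read as inefficient, serial-like search, which is why the 15 vs. 26 ms/item contrast matters.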

    An analysis of the time course of attention in preview search.

    We used a probe dot procedure to examine the time course of attention in preview search (Watson and Humphreys, 1997). Participants searched for an outline red vertical bar among new red horizontal bars and old green vertical bars, superimposed on a blue background grid. Following the reaction time response for search, the participants had to decide whether a probe dot had briefly been presented. Previews appeared for 1,000 msec and were immediately followed by search displays. In Experiment 1, we demonstrated a standard preview benefit relative to a conjunction search baseline. In Experiment 2, search was combined with the probe task. Probes were more difficult to detect when they were presented 1,200 msec, relative to 800 msec, after the preview, but at both intervals detection of probes at the locations of old distractors was harder than detection at the locations of new distractors or at neutral locations. Experiment 3A demonstrated that there was no difference in the detection of probes at old, neutral, and new locations when probe detection was the primary task, and there was also no difference when all of the shapes appeared simultaneously in conjunction search (Experiment 3B). In a final experiment (Experiment 4), we demonstrated that detection at old items was facilitated (relative to neutral locations and the locations of new distractors) when the probes appeared 200 msec after previews, whereas detection at old items was worse when the probes followed 800 msec after previews. We discuss the results in terms of visual marking and attention capture processes in visual search.

    Efficient Visual Search from Synchronized Auditory Signals Requires Transient Audiovisual Events

    BACKGROUND: A prevailing view is that audiovisual integration requires temporally coincident signals. However, a recent study failed to find any evidence for audiovisual integration in visual search even when using synchronized audiovisual events. An important question is what information is critical to observe audiovisual integration. METHODOLOGY/PRINCIPAL FINDINGS: Here we demonstrate that temporal coincidence (i.e., synchrony) of auditory and visual components can trigger audiovisual interaction in cluttered displays and consequently produce very fast and efficient target identification. In visual search experiments, subjects found a modulating visual target vastly more efficiently when it was paired with a synchronous auditory signal. By manipulating the kind of temporal modulation (sine wave vs. square wave vs. difference wave; harmonic sine-wave synthesis; gradient of onset/offset ramps) we show that abrupt visual events are required for this search efficiency to occur, and that sinusoidal audiovisual modulations do not support efficient search. CONCLUSIONS/SIGNIFICANCE: Thus, audiovisual temporal alignment will only lead to benefits in visual search if the changes in the component signals are both synchronized and transient. We propose that transient signals are necessary in synchrony-driven binding to avoid spurious interactions with unrelated signals when these occur close together in time.
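
    The sine-wave vs. square-wave contrast comes down to transience: both modulations can share a fundamental frequency, but only the square wave contains abrupt changes. A simple way to quantify this is the largest frame-to-frame rate of change, sketched below with parameters of our own choosing.

```python
import math

def max_rate_of_change(signal, dt):
    """Largest absolute frame-to-frame change per second: a crude proxy
    for how 'transient' a modulation is."""
    return max(abs(b - a) / dt for a, b in zip(signal, signal[1:]))

def modulations(freq_hz=1.0, dt=0.001, dur_s=2.0):
    """Sine-wave vs. square-wave modulation at the same frequency
    (illustrative stimuli, not the study's exact waveforms)."""
    t = [i * dt for i in range(int(dur_s / dt))]
    sine = [0.5 + 0.5 * math.sin(2 * math.pi * freq_hz * x) for x in t]
    square = [1.0 if s >= 0.5 else 0.0 for s in sine]
    return sine, square

sine, square = modulations()
```

    The sine wave's steepest slope is bounded by its amplitude and frequency, whereas the square wave jumps a full step within one sample, which is the kind of abrupt, transient event the abstract argues is needed for synchrony-driven binding.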