
    Direct evidence for encoding of motion streaks in human visual cortex

    Temporal integration in the visual system causes fast-moving objects to generate static, oriented traces ('motion streaks'), which could be used to help judge direction of motion. While human psychophysics and single-unit studies in non-human primates are consistent with this hypothesis, direct neural evidence from the human cortex is still lacking. First, we provide psychophysical evidence that faster and slower motions are processed by distinct neural mechanisms: faster motion raised human perceptual thresholds for static orientations parallel to the direction of motion, whereas slower motion raised thresholds for orthogonal orientations. We then used functional magnetic resonance imaging to measure brain activity while human observers viewed either fast ('streaky') or slow random dot stimuli moving in different directions, or corresponding static-oriented stimuli. We found that local spatial patterns of brain activity in early retinotopic visual cortex reliably distinguished between static orientations. Critically, a multivariate pattern classifier trained on brain activity evoked by these static stimuli could then successfully distinguish the direction of fast ('streaky') but not slow motion. Thus, signals encoding static-oriented streak information are present in human early visual cortex when viewing fast motion. These experiments show that motion streaks are present in the human visual system for faster motion. This work was supported by the Wellcome Trust (G.R., D.S.S.), the European Union ‘Mindbridge’ project (B.B.), the Australian Federation of Graduate Women Tempe Mann Scholarship (D.A.), the University of Sydney Campbell Perry Travel Fellowship (D.A.) and the Brain Research Trust (C.K.).
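The cross-decoding logic behind this result (train a pattern classifier on responses to static orientations, then test it on responses to motion) can be sketched with simulated data. Everything below is an illustrative assumption, not the authors' fMRI pipeline: the voxel patterns are synthetic, the classifier is a simple nearest-centroid rule, and 'fast' motion is modelled as carrying the orientation template while 'slow' motion carries none.

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_trials = 50, 40

# Hypothetical voxel-response templates for two static orientations
# (stand-ins for early visual cortex activity patterns).
templates = rng.normal(size=(2, n_voxels))

def simulate(label, noise=1.0):
    """Noisy activity patterns evoked by stimulus class `label`."""
    return templates[label] + rng.normal(scale=noise, size=(n_trials, n_voxels))

# Train a nearest-centroid classifier on static-orientation patterns.
train = [simulate(k) for k in (0, 1)]
centroids = np.stack([t.mean(axis=0) for t in train])

def classify(patterns):
    d = np.linalg.norm(patterns[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# Cross-decoding test: fast motion is assumed to evoke streak-like
# (template-aligned) patterns; slow motion is assumed to evoke patterns
# with no orientation-specific signal (equidistant from both templates).
fast = simulate(0)
slow = templates.mean(axis=0) + rng.normal(size=(n_trials, n_voxels))

acc_fast = (classify(fast) == 0).mean()
acc_slow = (classify(slow) == 0).mean()
print(f"fast-motion decoding accuracy: {acc_fast:.2f}")
print(f"slow-motion decoding accuracy: {acc_slow:.2f}")
```

In this toy setup, the classifier trained only on static orientations decodes the 'fast' patterns well above chance and the 'slow' patterns near chance, mirroring the pattern of results the abstract reports.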

    The Ventriloquist Effect Results from Near-Optimal Bimodal Integration


    Reducing bias in auditory duration reproduction by integrating the reproduced signal

    Duration estimation is known to be far from veridical and to differ between sensory estimates and motor reproduction. To investigate how these differential estimates are integrated for estimating or reproducing a duration, and to examine sensorimotor biases in duration comparison and reproduction tasks, we compared estimation biases and variances among three different duration estimation tasks: perceptual comparison, motor reproduction, and auditory reproduction (i.e., a combined perceptual-motor task). We found consistent overestimation in both motor and perceptual-motor auditory reproduction tasks, and the least overestimation in the comparison task. More interestingly, compared to pure motor reproduction, the overestimation bias was reduced in the auditory reproduction task, due to the additional reproduced auditory signal. We further manipulated the signal-to-noise ratio (SNR) in the feedback/comparison tones to examine the changes in estimation biases and variances. Considering perceptual and motor biases as two independent components, we applied the reliability-based model, which successfully predicted the biases in auditory reproduction. Our findings thus provide behavioral evidence of how the brain combines motor and perceptual information to reduce duration estimation biases and improve estimation reliability.
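In its generic form, the reliability-based model referred to above is inverse-variance-weighted cue combination: each estimate is weighted by its reliability (1/variance), which pulls the combined estimate toward the less noisy cue and reduces overall variance. A minimal sketch, with made-up numbers rather than the paper's fitted parameters:

```python
import numpy as np

def integrate(estimates, variances):
    """Reliability-weighted (inverse-variance) combination of independent
    estimates. Returns the combined estimate and its variance, which is
    never larger than either input variance."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = (1.0 / variances) / np.sum(1.0 / variances)
    combined = np.sum(weights * estimates)
    combined_var = 1.0 / np.sum(1.0 / variances)
    return combined, combined_var

# Illustrative numbers only: a motor reproduction that overestimates a
# 1.0 s interval (1.3 s, high variance) and a less biased, more reliable
# perceptual estimate (1.05 s, lower variance).
motor_est, motor_var = 1.3, 0.04
percept_est, percept_var = 1.05, 0.01

combined, combined_var = integrate([motor_est, percept_est],
                                   [motor_var, percept_var])
print(f"combined estimate: {combined:.3f} s (variance {combined_var:.4f})")
```

With these numbers the perceptual cue gets weight 0.8 and the motor cue 0.2, so the combined estimate (1.10 s) shows less overestimation than motor reproduction alone, which is the qualitative pattern the abstract describes.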

    Combining eye and hand in search is suboptimal

    When performing everyday tasks, we often move our eyes and hand together: we look where we are reaching in order to better guide the hand. This coordinated pattern, with the eye leading the hand, is presumably optimal behaviour. But eyes and hands can move to different locations if they are involved in different tasks. To find out whether this leads to optimal performance, we studied the combination of visual and haptic search. We asked ten participants to perform a combined visual and haptic search for a target that was present in both modalities and compared their search times to those on visual-only and haptic-only search tasks. Without distractors, search times were faster for visual search than for haptic search. With many visual distractors, search times were longer for visual than for haptic search. For the combined search, performance was poorer than that predicted by the optimal strategy whereby each modality searches a different part of the display. The results are consistent with several alternative accounts, for instance with vision and touch searching independently at the same time.
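The 'optimal strategy' benchmark can be illustrated with a toy serial-search model in which each modality clears an exclusive share of the display, with the shares sized so both finish at the same time. The search rates and set size below are hypothetical, and the model ignores self-terminating search, distractor effects, and other realities of search behaviour:

```python
def single_searcher_time(n_items, rate):
    """Time for one modality to search all n_items alone (items / rate)."""
    return n_items / rate

def optimal_split_time(n_items, rate_a, rate_b):
    """Time to clear n_items when each searcher takes an exclusive share
    sized so both finish simultaneously: t = n / (rate_a + rate_b)."""
    return n_items / (rate_a + rate_b)

# Hypothetical rates: vision clears 8 items/s, touch 2 items/s.
n = 40
t_vision = single_searcher_time(n, 8.0)       # 5.0 s alone
t_haptic = single_searcher_time(n, 2.0)       # 20.0 s alone
t_optimal = optimal_split_time(n, 8.0, 2.0)   # 4.0 s when divided optimally
print(t_vision, t_haptic, t_optimal)
```

Under this model, the optimal division of labour (4.0 s) beats even the faster modality searching alone (5.0 s); the abstract's finding is that participants' combined search fell short of this kind of benchmark.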

    Interaction of perceptual grouping and crossmodal temporal capture in tactile apparent-motion

    Previous studies have shown that in tasks requiring participants to report the direction of apparent motion, task-irrelevant mono-beeps can "capture" visual motion perception when the beeps occur temporally close to the visual stimuli. However, the contributions of the relative timing of multimodal events and of the event structure, which modulates uni- and/or crossmodal perceptual grouping, remain unclear. To examine this question and extend the investigation to the tactile modality, the current experiments presented tactile two-tap apparent-motion streams, with an SOA of 400 ms between successive left- and right-hand middle-finger taps, accompanied by task-irrelevant, non-spatial auditory stimuli. The streams were shown for 90 seconds, and participants' task was to continuously report the perceived (left- or rightward) direction of tactile motion. In Experiment 1, each tactile stimulus was paired with an auditory beep, though odd-numbered taps were paired with an asynchronous beep, with audiotactile SOAs ranging from -75 ms to 75 ms. The perceived direction of tactile motion varied systematically with audiotactile SOA, indicative of a temporal-capture effect. In Experiment 2, two audiotactile SOAs, one short (75 ms) and one long (325 ms), were compared. The long-SOA condition preserved the crossmodal event structure (so the temporal-capture dynamics should have been similar to those in Experiment 1), but both beeps now occurred temporally close to the taps on one side (even-numbered taps). The two SOAs were found to produce opposite modulations of apparent motion, indicative of an influence of crossmodal grouping. In Experiment 3, only odd-numbered, but not even-numbered, taps were paired with auditory beeps. This abolished the temporal-capture effect; instead, a dominant percept of apparent motion from the audiotactile side to the tactile-only side was observed, independently of the SOA variation. These findings suggest that asymmetric crossmodal grouping leads to an attentional modulation of apparent motion, which inhibits crossmodal temporal-capture effects.

    Distortions of Subjective Time Perception Within and Across Senses

    Background: The ability to estimate the passage of time is of fundamental importance for perceptual and cognitive processes. One experience of time is the perception of duration, which is not isomorphic to physical duration and can be distorted by a number of factors. Yet, the critical features generating these perceptual shifts in subjective duration are not understood. Methodology/Findings: We used prospective duration judgments within and across sensory modalities to examine the effect of stimulus predictability and feature change on the perception of duration. First, we found robust distortions of perceived duration in auditory, visual and auditory-visual presentations despite the predictability of the feature changes in the stimuli. For example, a looming disc embedded in a series of steady discs led to time dilation, whereas a steady disc embedded in a series of looming discs led to time compression. Second, we addressed whether visual (auditory) inputs could alter the perceived duration of auditory (visual) inputs. When participants were presented with incongruent audio-visual stimuli, the perceived duration of auditory events could be shortened or lengthened by the presence of conflicting visual information; however, the perceived duration of visual events was seldom distorted by the presence of auditory information, and visual events were never perceived as shorter than their actual durations. Conclusions/Significance: These results support the existence of multisensory interactions in the perception of duration and, importantly, suggest that vision can modify auditory temporal perception in a pure timing task. Insofar as distortions in subjective duration cannot be accounted for by the unpredictability of an auditory, visual or auditory-visual event, we propose that it is the intrinsic features of the stimulus that critically affect subjective time distortions.

    The speed and phase of locomotion dictate saccade probability and simultaneous low-frequency power spectra

    Every day we make thousands of saccades and take thousands of steps as we explore our environment. Despite their common co-occurrence in a typical active state, we know little about the coordination between eye movements, walking behaviour and related changes in cortical activity. Technical limitations have been a major impediment, which we overcome here by leveraging the advantages of an immersive wireless virtual reality (VR) environment with three-dimensional (3D) position tracking, together with simultaneous recording of eye movements and mobile electroencephalography (EEG). Using this approach with participants engaged in unencumbered walking along a clear, level path, we find that the likelihood of eye movements at both slow and natural walking speeds entrains to the rhythm of footfall, peaking after the heel-strike of each step. Compared to previous research, this entrainment was captured in a task that did not require visually guided stepping, suggesting a persistent interaction between locomotor and visuomotor functions. Simultaneous EEG recordings reveal a concomitant modulation entrained to heel-strike, with increases and decreases in oscillatory power for a broad range of frequencies. The peak of these effects occurred in the theta and alpha range for slow and natural walking speeds, respectively. Together, our data show that the phase of the step-cycle influences other behaviours such as eye movements, and produces related modulations of simultaneous EEG following the same rhythmic pattern. These results reveal gait as an important factor to be considered when interpreting saccadic and time-frequency EEG data in active observers, and demonstrate that saccadic entrainment to gait may persist throughout everyday activities.

    Measuring perception without introspection

    Binocular rivalry, the perceptual alternation between incompatible monocular stimuli, is conventionally measured by asking the subject which percept is currently visible. This is problematic because the response is unverifiable and open to response bias, and because the method falsely assumes that the perceptual experience is binary. We overcame these limitations in a new approach that does not require subjective reporting of perceptual state. A brief test stimulus was added to one eye's inducing stimulus at random times and contrasts. The test was presented at one of two spatial locations, the subject indicated which alternative had been shown, and the correctness of the response was recorded as a function of test contrast. Given the random timing of the test stimulus, it was sometimes delivered when the tested eye was dominant and, at other times, suppressed. Accordingly, the psychometric function recorded during rivalry should be a mixture of the dominance and suppression forms of the function. This was indeed the case: the probability of a correct response during rivalry was significantly less than that obtained with a binocularly congruent stimulus. The psychometric function during rivalry was well modeled as a weighted sum of the congruence curve with an assumed suppression curve. Optimal fitting provided estimates of both suppression depth and percept predominance that corresponded closely with estimates obtained with the conventional method. We have therefore characterized rivalry without the uncertainties introduced by the subject's perceptual report. This provides a model that may be applicable to the broader field of perceptual ambiguity.
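The mixture model of the rivalry psychometric function can be sketched as a predominance-weighted sum of a dominance curve and a threshold-elevated suppression curve. The logistic form and all parameter values below are illustrative assumptions, not the paper's fitted functions:

```python
import numpy as np

def logistic(contrast, threshold, slope, chance=0.5):
    """Two-alternative forced-choice psychometric function rising from
    chance (0.5) toward 1 as test contrast increases."""
    return chance + (1 - chance) / (1 + np.exp(-slope * (contrast - threshold)))

def rivalry_curve(contrast, p_dom, thr_dom, thr_sup, slope):
    """Predominance-weighted mixture of dominance and suppression curves.
    p_dom is the fraction of time the tested eye is dominant; suppression
    is modelled as an elevated threshold (thr_sup > thr_dom), so the gap
    between the two thresholds reflects suppression depth."""
    return (p_dom * logistic(contrast, thr_dom, slope)
            + (1 - p_dom) * logistic(contrast, thr_sup, slope))

contrasts = np.linspace(0, 1, 11)
dom = logistic(contrasts, threshold=0.3, slope=12)           # congruent/dominant
riv = rivalry_curve(contrasts, p_dom=0.5,
                    thr_dom=0.3, thr_sup=0.6, slope=12)      # during rivalry
# At every contrast, rivalry accuracy sits at or below the dominance curve,
# matching the reported drop in correct responses during rivalry.
print(riv - dom)
```

In practice the free parameters (predominance and suppression-curve threshold) would be estimated by fitting this mixture to the measured proportion-correct data, which is how the abstract's suppression-depth and predominance estimates are obtained.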

    Perceptual synchrony of audiovisual streams for natural and artificial motion sequences
