
    Crossmodal duration perception involves perceptual grouping, temporal ventriloquism, and variable internal clock rates

    Here, we investigate how audiovisual context affects perceived event duration with experiments in which observers reported which of two stimuli they perceived as longer. Target events were visual and/or auditory and could be accompanied by nontargets in the other modality. Our results demonstrate that the temporal information conveyed by irrelevant sounds is automatically used when the brain estimates visual durations, but that irrelevant visual information does not affect perceived auditory duration (Experiment 1). We further show that auditory influences on subjective visual durations occur only when the temporal characteristics of the stimuli promote perceptual grouping (Experiments 1 and 2). Placed in the context of scalar expectancy theory of time perception, our third and fourth experiments imply that audiovisual context can lead both to changes in the rate of an internal clock and to temporal ventriloquism-like effects on perceived on- and offsets. Finally, intramodal grouping of auditory stimuli diminished any crossmodal effects, suggesting a strong preference for intramodal over crossmodal perceptual grouping (Experiment 5).
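The two effects distinguished in Experiments 3 and 4 can be sketched with a toy pacemaker-accumulator model in the spirit of scalar expectancy theory. This is a hypothetical illustration, not the authors' model: the function name and parameters are assumptions, chosen only to separate a clock-rate change from ventriloquism-like shifts of the perceived on- and offsets.

```python
# Toy pacemaker-accumulator sketch (hypothetical, not the authors' model).
# Perceived duration = pulses accumulated at a context-modulated clock rate
# over an interval whose perceived on- and offsets may be shifted
# (temporal ventriloquism).

def perceived_duration_ms(physical_ms, clock_rate=1.0,
                          onset_shift_ms=0.0, offset_shift_ms=0.0):
    """Perceived duration under two audiovisual-context effects:
    clock_rate scales the pacemaker (rate effect); onset/offset shifts
    model ventriloquism of the perceived start and end of the event."""
    effective_ms = (physical_ms - onset_shift_ms) + offset_shift_ms
    return clock_rate * effective_ms

# A faster clock and a ventriloquized offset both lengthen the percept:
print(perceived_duration_ms(500))                      # 500.0
print(perceived_duration_ms(500, clock_rate=1.5))      # 750.0
print(perceived_duration_ms(500, offset_shift_ms=20))  # 520.0
```

The point of the sketch is that both mechanisms lengthen the reported duration, which is why Experiments 3 and 4 were needed to tell them apart.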

    Perceived Surface Slant Is Systematically Biased in the Actively-Generated Optic Flow

    Humans make systematic errors in the 3D interpretation of the optic flow in both passive and active vision. These systematic distortions can be predicted by a biologically-inspired model which disregards self-motion information resulting from head movements (Caudek, Fantoni, & Domini, 2011). Here, we tested two predictions of this model: (1) a plane that is stationary in an earth-fixed reference frame will be perceived as changing its slant if the movement of the observer's head causes a variation of the optic flow; (2) a surface that rotates in an earth-fixed reference frame will be perceived as stationary if the surface rotation is appropriately yoked to the head movement so as to generate a variation of the surface slant but not of the optic flow. Both predictions were corroborated by two experiments in which observers judged the perceived slant of a random-dot planar surface during egomotion. We found qualitatively similar biases for monocular and binocular viewing of the simulated surfaces, although, in principle, the simultaneous presence of disparity and motion cues allows for a veridical recovery of surface slant.

    The neural correlates of consciousness and attention: Two sister processes of the brain


    Event-Predicate Detection in the Debugging of Distributed Applications

    Trends in the development of computer hardware are making the use of distributed systems increasingly attractive. The collection of event-trace data and the construction of process-time diagrams can provide a useful visualization tool. In practical situations, however, these diagrams are too large for users to find them comprehensible. The ability to detect and locate arbitrary (complex) predicates within an event trace can help to alleviate this problem. This thesis enumerates five classes of problems that a successful event-detection strategy should be able to identify: phase transitions, mutual-exclusion violations, subroutines, communication symmetry, and performance bottlenecks. Some previous efforts in this area offer an expressivity which is close to that required to meet these goals, but are hampered by an insufficient understanding of the partial order which underlies causality in a distributed-execution trace. This work defines a partial-order precedence relationship for co..
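The partial order this abstract refers to is the standard happens-before relation over a distributed-execution trace. As a point of reference, a minimal vector-clock sketch of that relation looks like the following; this is textbook Lamport/Fidge-Mattern machinery, not code from the thesis, and the event names are hypothetical.

```python
# Minimal happens-before check via vector clocks (standard technique,
# illustrating the partial order underlying causality in a trace).

def happened_before(vc_a, vc_b):
    """True iff the event stamped vc_a causally precedes the one stamped vc_b."""
    return all(a <= b for a, b in zip(vc_a, vc_b)) and vc_a != vc_b

def concurrent(vc_a, vc_b):
    """True iff neither event causally precedes the other."""
    return not happened_before(vc_a, vc_b) and not happened_before(vc_b, vc_a)

# Hypothetical three-process trace: e1 on P0; e2 on P1 after a message
# from P0; e3 on P2, causally independent of both.
e1 = (1, 0, 0)
e2 = (1, 1, 0)
e3 = (0, 0, 1)

print(happened_before(e1, e2))  # True
print(concurrent(e1, e3))       # True
```

Predicate detection is hard precisely because, as the `concurrent` case shows, a trace is only partially ordered: many interleavings are consistent with the same causal structure.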

    Audiovisual Delay as a Novel Cue to Visual Distance.

    For audiovisual sensory events, sound arrives with a delay relative to light that increases with event distance. It is unknown, however, whether humans can use these ubiquitous sound delays as an information source for distance computation. Here, we tested the hypothesis that audiovisual delays can both bias and improve human perceptual distance discrimination, such that visual stimuli paired with auditory delays are perceived as more distant and are thereby an ordinal distance cue. In two experiments, participants judged the relative distance of two repetitively displayed three-dimensional dot clusters, both presented with sounds of varying delays. In the first experiment, dot clusters presented with a sound delay were judged to be more distant than dot clusters paired with equivalent sound leads. In the second experiment, we confirmed that the presence of a sound delay was sufficient to cause stimuli to appear as more distant. Additionally, we found that ecologically congruent pairing of more distant events with a sound delay resulted in an increase in the precision of distance judgments. A control experiment determined that the sound delay duration influencing these distance judgments was not detectable, thereby eliminating decision-level influence. In sum, we present evidence that audiovisual delays can be an ordinal cue to visual distance.
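The physical basis of the cue is simple: light arrives effectively instantaneously while sound travels at roughly 343 m/s, so the audio lag behind the visual onset grows linearly with event distance. A quick illustrative calculation (not taken from the paper) shows the magnitudes involved:

```python
# Sound delay relative to light as a function of event distance.
# Illustrative physics only; the 343 m/s figure assumes dry air at ~20 C.

SPEED_OF_SOUND_M_S = 343.0

def av_delay_ms(distance_m):
    """Audio lag behind the visual onset for an event at distance_m metres."""
    return distance_m / SPEED_OF_SOUND_M_S * 1000.0

for d in (1, 10, 34.3, 100):
    print(f"{d:>6} m -> {av_delay_ms(d):6.1f} ms")
```

At 34.3 m the lag is already 100 ms, comfortably within the 0-100 ms asynchronies used in the experiments described above.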

    Journal of Vestibular Research 13 (2003) 265–271, IOS Press

    We measured how much the visual world could be moved during various head rotations and translations and still be perceived as visually stable. Using this as a monitor of how well subjects know about their own movement, we compared performance in different directions relative to gravity. For head rotations, we compared the range of visual motion judged compatible with a stable environment while rotating around an axis orthogonal to gravity (where rotation created a rotating gravity vector across the otolith macula) with judgements made when rotation was around an earth-vertical axis. For translations, we compared the corresponding range of visual motion when translation was parallel to gravity (when imposed accelerations added to or subtracted from gravity) with translations orthogonal to gravity. Ten subjects wore a head-mounted display and made active head movements at 0.5 Hz that were monitored by a low-latency mechanical tracker. Subjects adjusted the ratio between head and image motion until the display appeared perceptually stable. For neither rotation nor translation were there any differences in judgements of perceptual stability that depended on the direction of the movement with respect to the direction of gravity.

    (Jaekl et al., 2015) Audiovisual delay as a novel cue to distance: Exp. 2 data

    Each row is an individual subject. Each value is the proportion of "distance increase" responses (out of 20 trials). Each column is a different AV asynchrony (as given in xdata).

    Experiment 1.

    (A) Stimulus timeline. Right and left clusters alternated continuously, appearing for 225 ms with a 600 ms inter-stimulus interval. Sounds paired with one cluster preceded visual onset while sounds paired with the other cluster were delayed by an equal amount that ranged between 0 and 100 ms. Time one and time two are illustrated in a spatial manner in 1B. (B) Spatial arrangement of alternating dot clusters as conveyed by stereoscopic depth. At ‘Time 1,’ the right cluster was presented at a stereoscopically defined distance and paired with a sound delay. At ‘Time 2,’ after an inter-stimulus interval, the left cluster was presented at a different distance (here shown as more distant) and paired with a sound lead. Participants adjusted the relative distance of the two alternating dot clusters until they appeared to be at the same perceived distance. (C) Experimental rationale. If the presence of sound delays increases perceived visual distance, the cluster presented with a sound delay would need to be shown physically closer for the two clusters to appear equidistant. (D) Averaged median adjustments of stereo disparity. Positive biases indicate that dot clusters presented with sound delays were perceived as more distant than clusters with sound leads (**p = 0.015, *p = 0.048). Error bars are 95% confidence intervals from the bootstrap analysis as described in Materials and Methods.
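The bootstrap confidence intervals mentioned in panel D can be sketched generically: resample the adjustments with replacement and take percentiles of the resampled medians. This is a standard percentile bootstrap, not the paper's analysis code, and the data values below are made up for illustration.

```python
# Percentile-bootstrap 95% CI for a median (generic sketch; the
# adjustment values are hypothetical, not the paper's data).
import random
import statistics

def bootstrap_median_ci(data, n_boot=10000, alpha=0.05, seed=0):
    rng = random.Random(seed)
    medians = sorted(
        statistics.median(rng.choices(data, k=len(data)))
        for _ in range(n_boot)
    )
    lo = medians[int(n_boot * alpha / 2)]
    hi = medians[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Hypothetical disparity adjustments; positive = delayed cluster seen farther.
adjustments = [1.2, 0.8, 2.1, -0.3, 1.5, 0.9, 1.8, 0.4]
lo, hi = bootstrap_median_ci(adjustments)
print(f"median = {statistics.median(adjustments):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

A CI of this kind that excludes zero would correspond to the positive biases reported in panel D.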