44 research outputs found

    Optic flow information influencing heading perception during rotation

    Get PDF
    Poster Session - Perception and Action: abstract no. 22.34
    We investigated what roles global spatial frequency, surface structure, and foreground motion play in heading perception during simulated rotation from optic flow. The display …

    Disentangling the effects of object position and motion on heading judgments in the presence of a moving object

    Get PDF
    Tuesday Morning Posters - Motion Perception: Optic flow and heading: no. 53.4026
    Previous research has found that moving objects bias heading perception only when they occlude the focus of expansion (FOE) in the background optic flow, with the direction of bias depending on whether the moving object approached the observer or remained at a fixed distance from the moving observer. However, in previous studies the effect of object motion on heading perception was confounded with object position. Here, we disentangled the contributions of object motion and position to heading bias. In each 1-s trial, the display simulated forward observer motion at 1 m/s through a ...

    During self-movement humans are better at judging whether an object is moving (flow parsing) than whether they will hit it (heading)

    Get PDF
    Tuesday Morning Posters - Motion Perception: Optic flow and heading: no. 53.4029
    During locomotion, we can use information in the retinal flow field to judge whether we will pass to the left or right of an object in the scene (heading). We can also use information in retinal flow to judge whether an object is moving relative to the scene (flow parsing). Both judgements rely on the brain identifying optic flow (global patterns of retinal motion that are characteristic of self-movement). How does the precision of these two judgements compare? Differences or similarities in precision may provide some insight into the underpinning mechanisms. We designed stimuli that allowed direct comparison of the precision of the two judgements. In the heading task, we ...

    The surprising utility of target drift in natural heading judgements

    Get PDF
    Sunday Morning Posters - Perception and Action: Driving and navigating: no. 33.3026
    Gibson (1950) proposed that optic flow provides information about the direction of self-motion (heading) relative to objects in the environment. Llewellyn (1971) pointed out that the change in an object's egocentric direction, "drift", also provides information about whether an observer is passing to the left or right of the object. We compared the precision of heading judgements with flow and drift cues presented in isolation and together. With flow alone, observers were quite precise (< 1°), but observers were more precise with drift, and equally precise with drift alone and with both flow and drift. Next, we examined how precision changed with display duration (0.2–1.6 s). There was evidence of cue combination at 0.2 s, but at longer durations the precisions for the ...
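The comparison above (flow alone, drift alone, and both together) is the standard setup for testing optimal cue combination. As a minimal sketch, assuming independent Gaussian cues and inverse-variance (maximum-likelihood) weighting, the predicted precision of the combined judgement can be computed from the single-cue precisions; the numbers below are illustrative, not the study's data:

```python
import math

def combined_sd(sd_flow, sd_drift):
    """Predicted SD of an optimal inverse-variance-weighted combination
    of two independent Gaussian cues: sigma_c^2 = s1^2*s2^2/(s1^2+s2^2).
    A generic MLE cue-combination sketch, not the authors' analysis."""
    return math.sqrt((sd_flow**2 * sd_drift**2) / (sd_flow**2 + sd_drift**2))

# Hypothetical single-cue precisions (deg): flow ~1.0, drift ~0.5
print(combined_sd(1.0, 0.5))  # ≈ 0.447, better than either cue alone
```

If observers were instead "equally precise with drift alone and with both cues", as the abstract reports at longer durations, the measured combined SD would match the drift-alone SD rather than this optimal prediction.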

    Eye tracking: empirical foundations for a minimal reporting guideline

    Get PDF
    In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines of any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section "Empirically based minimal reporting guideline").

    Stereo visual cues help object motion perception during self-motion

    No full text
    This journal supplement contains the ECVP 2012 conference abstracts. Posters: 3D Perception. Open Access Journal.
    Recent studies have suggested that the visual system subtracts the optic flow pattern experienced during self-motion from the projected retinal motion of the environment to recover object motion, a phenomenon called 'flow parsing' (Warren and Rushton, 2007, Journal of Vision 7(11):2, 1-11). In this experiment, we tested how adding stereo visual cues, which support accurate depth perception of a moving object relative to the flow field, affected the flow parsing process. The displays (26° x 26°, 500 ms) simulated an observer approaching a frontal plane composed of 300 randomly placed dots. A red probe dot moved vertically over this plane, or over the image plane of the projection screen, through a midpoint at 3° or 5° eccentricity. A horizontal component (along the world X-axis), under the control of an adaptive staircase, was added to the probe dot's vertical motion to determine when the probe motion was perceived as vertical. Participants viewed the display with and without stereo visual cues. We found that with stereo visual cues, flow parsing gains were significantly higher when the probe moved over the frontal plane, but significantly lower when it moved over the screen surface. We conclude that stereo visual cues help veridical perception of object motion during self-motion.
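The subtraction at the heart of flow parsing can be sketched in a few lines. This is a toy illustration of the idea described in the abstract, not the authors' stimulus code: the retinal motion of a probe is the sum of the self-motion flow at its location and the probe's scene-relative motion, so subtracting the estimated flow component (with gain 1) recovers the scene-relative motion. The function name and numbers are illustrative.

```python
import numpy as np

def flow_at(point, t_z, depth):
    """Radial flow (deg/s) at a retinal point for forward translation t_z
    (m/s) toward a frontal plane at the given depth (m): expansion away
    from the focus of expansion, scaled by t_z / depth (small-angle toy)."""
    return np.asarray(point, dtype=float) * (t_z / depth)

probe_pos = np.array([3.0, 0.0])           # probe at 3 deg eccentricity
self_flow = flow_at(probe_pos, t_z=1.0, depth=2.0)

scene_motion = np.array([0.0, 2.0])        # probe moves vertically in the scene
retinal_motion = scene_motion + self_flow  # what actually lands on the retina

# Full flow parsing (gain = 1) subtracts the flow and recovers vertical motion
recovered = retinal_motion - 1.0 * self_flow
print(recovered)  # [0. 2.]
```

A flow parsing gain below 1 would leave a residual horizontal component in `recovered`, which is exactly what the staircase-nulled horizontal component in the experiment measures.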

    Retinal information influencing heading perception during rotation

    No full text
    Open Access Journal. Poster Session - Perception and action: Locomotion: no. 53.510
    It has been reported that humans can accurately perceive heading during simulated eye rotation using information from optic flow alone. Here we investigated what roles spatial frequency content, surface structure, foreground motion, and expansion information play in heading perception during ...
    The 11th Annual Meeting of the Vision Science Society (VSS 2011), Naples, FL, 6-11 May 2011. In Journal of Vision, 2011, v. 11, n. 11, article 90.

    Angular, speed and density tuning of flow parsing

    No full text
    Poster Presentation. Session: Motion Perception: Local motion and optic flow
    Recent studies have suggested that the visual system subtracts the optic flow experienced during self-motion from the retinal motion of the environment to recover scene-relative object motion, a phenomenon called 'flow parsing'. The psychophysical characteristics of this process, however, remain unclear. Here, by measuring the gain with which flow parsing is performed, we examined how flow parsing is affected by the angle between the object motion and the background flow at the object's location (Experiment 1), the self-motion or object motion speed (Experiments 2 and 3), and the density of the elements in the background flow (Experiment 3). In each 0.5-s trial, the display (83°H x 83°V, 60 Hz) simulated forward self-motion at 0.5–5 m/s toward a frontal plane covered with 10–5000 white random dots placed at 2 m. A red probe dot moved leftward or rightward at 1–10 deg/s on the frontal plane. A component toward the FOE was added to the probe's horizontal retinal motion under the control of an adaptive staircase to determine when the probe was perceived to move horizontally. The results show that flow parsing was strongly affected by each of the factors we varied. Specifically, flow parsing gain decreased exponentially as the object motion direction deviated from the background flow at its retinal location. Surprisingly, flow parsing gain also decreased exponentially with increasing simulated self-motion speed. Flow parsing gain increased linearly with the object motion speed and logarithmically with the density of the background flow. We conclude that while increasing the object motion speed and the number of elements in the scene helps the perception of scene-relative object motion during self-motion, performance is best at normal walking speed and when the object moves in the same direction as the background flow at its retinal location.

    Influence of optic flow on the control of heading and target egocentric direction during steering toward a goal.

    No full text
    Although previous studies have shown that people use both optic flow and target egocentric direction to walk or steer toward a goal, it remains unclear how enriching the optic flow field affects the control of heading specified by optic flow and the control of target egocentric direction during goal-oriented locomotion. In the current study, we used a control-theoretic approach to separate the control responses specific to these two cues in the visual control of steering toward a goal. The results showed that the addition of optic flow information (such as foreground motion and global flow) to the display improved the overall control precision, the amplitude, and the response delay of the control of heading. The amplitude and the response delay of the control of target egocentric direction were, however, not affected. The improvement in the control of heading with enriched optic flow displays was mirrored by an increase in the accuracy of heading perception. The findings provide direct support for the claim that people use the heading specified by optic flow as well as target egocentric direction to walk or steer toward a goal, and suggest that the visual system does not internally weigh these two cues for goal-oriented locomotion control.

    A Bayesian model for estimating observer translation and rotation from optic flow and extra-retinal input.

    No full text
    We present a Bayesian ideal observer model that estimates observer translation and rotation from optic flow and an extra-retinal eye movement signal. The model assumes a rigid environment and noise in velocity measurements, and that eye movement provides a probabilistic cue for rotation. The model can simulate human heading perception across a range of conditions, including: translation with simulated vs. actual eye rotations, environments with various depth structures, and the presence of independently moving objects.
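The generic ingredients of such a model (a likelihood from noisy flow measurements combined with a probabilistic extra-retinal signal) can be illustrated with a toy grid-based posterior over a one-dimensional heading azimuth. This is a heavily simplified sketch, not the authors' model: the extra-retinal signal is collapsed into a broad Gaussian prior, and all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
true_heading = 5.0                  # deg azimuth
flow_noise = 2.0                    # SD of each flow-based measurement (deg)

grid = np.linspace(-30, 30, 601)    # candidate headings, 0.1-deg steps
samples = true_heading + flow_noise * rng.standard_normal(20)

# Gaussian log-likelihood of each candidate heading under the noisy samples
log_like = -0.5 * ((samples[:, None] - grid[None, :]) / flow_noise) ** 2
log_like = log_like.sum(axis=0)

# Extra-retinal signal simplified into a broad Gaussian prior (SD 10 deg)
log_prior = -0.5 * (grid / 10.0) ** 2

log_post = log_like + log_prior
posterior = np.exp(log_post - log_post.max())   # subtract max for stability
posterior /= posterior.sum()

estimate = grid[np.argmax(posterior)]           # MAP heading estimate
```

With 20 measurements the likelihood dominates the broad prior, so the MAP estimate lands close to the sample mean; shrinking the prior SD (a stronger extra-retinal signal) would pull the estimate toward the prior mean, the kind of cue interaction the model above is built to capture.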