
View-based approaches to spatial representation in human vision

By Andrew Glennerster, Miles E. Hansard and Andrew W. Fitzgibbon


In an immersive virtual environment, observers fail to notice the expansion of a room around them and consequently make gross errors when comparing the sizes of objects. This result is difficult to explain if the visual system continuously generates a 3-D model of the scene based on known baseline information from interocular separation or proprioception as the observer walks. An alternative is that observers use view-based methods to guide their actions and to represent the spatial layout of the scene. In this case, they may have an expectation of the images they will receive but be insensitive to the rate at which images arrive as they walk. We describe the way in which the eye movement strategy of animals simplifies motion processing if their goal is to move towards a desired image, and discuss dorsal and ventral stream processing of moving images in that context. Although many questions about view-based approaches to scene representation remain unanswered, the solutions are likely to be highly relevant to understanding biological 3-D vision.
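The core idea of "moving towards a desired image" can be illustrated with a minimal view-based homing sketch: an agent compares its current image against a stored goal image and greedily steps in whichever direction reduces the image difference, never reconstructing any 3-D geometry. This is an illustrative toy, not the authors' model; the `view` function below is a hypothetical stand-in "camera" whose output varies smoothly with position.

```python
# Toy sketch of view-based homing (illustrative only, not the paper's model).
# The agent holds a goal *image*, not a goal *position*: at each step it tries
# small moves and keeps the one whose resulting image best matches the goal.
import numpy as np

def view(pos):
    """Hypothetical stand-in camera: a small 'image' that varies smoothly
    with the observer's (x, y) position. No real rendering is involved."""
    x, y = pos
    gx, gy = np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8))
    return x * gx + y * gy

def home(start, goal_view, steps=50, step_size=0.1):
    """Greedy descent on squared image difference; no geometry is built."""
    pos = np.array(start, dtype=float)
    moves = [np.array(m, dtype=float) for m in [(1, 0), (-1, 0), (0, 1), (0, -1)]]
    for _ in range(steps):
        err_here = np.sum((view(pos) - goal_view) ** 2)
        # Evaluate each candidate move purely by how image-like-the-goal it is.
        candidates = [pos + step_size * m for m in moves]
        errors = [np.sum((view(c) - goal_view) ** 2) for c in candidates]
        best = candidates[int(np.argmin(errors))]
        if min(errors) >= err_here:
            break  # no move improves the image match; stop
        pos = best
    return pos

goal = np.array([0.5, 0.5])
end = home(start=(2.0, 2.0), goal_view=view(goal))
```

Because the agent only ever compares images, it would be sensitive to *which* images arrive but not to the rate at which they arrive as it moves, loosely mirroring the insensitivity described in the abstract.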

Topics: 571
Publisher: Springer
Year: 2009

