3 research outputs found

    Visual Distance Estimation in Static Compared to Moving Virtual Scenes

    Visual motion is used to control the direction and speed of self-motion and the time-to-contact with an obstacle. In earlier work, we found that human subjects can discriminate between the distances of different visually simulated self-motions in a virtual scene. Distance indication in terms of an exocentric interval adjustment task, however, revealed a linear correlation between perceived and indicated distances, but with a profound distance underestimation. One possible explanation for this underestimation is the perception of visual space in virtual environments. Humans perceive visual space in natural scenes as curved, and distances are increasingly underestimated with increasing distance from the observer. Such spatial compression may also exist in our virtual environment. We therefore surveyed perceived visual space in a static virtual scene. We asked observers to compare two horizontal depth intervals, similar to experiments performed in natural space. Subjects had to indicate the size of one depth interval relative to a second interval. Our observers perceived visual space in the virtual environment as compressed, similar to the perception found in natural scenes. However, the nonlinear depth function we found cannot explain the observed distance underestimation of visually simulated self-motions in the same environment.
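The interval-comparison task described above can be sketched with a simple compressive model of perceived depth. A power law with exponent below one is a common way to model such compression; the particular coefficient and exponent here are illustrative assumptions, not values reported in the study.

```python
# Sketch: compressive mapping from physical to perceived depth.
# The power-law form d_perceived = a * d**n (n < 1) is a common model
# of depth compression; a and n below are illustrative assumptions.

def perceived_depth(d, a=1.0, n=0.8):
    """Map a physical distance d (meters) to a perceived distance."""
    return a * d ** n

def interval_ratio(near, mid, far):
    """Perceived size of the far depth interval relative to the near
    one, as in an exocentric interval-comparison task."""
    near_interval = perceived_depth(mid) - perceived_depth(near)
    far_interval = perceived_depth(far) - perceived_depth(mid)
    return far_interval / near_interval

# Two physically equal 2 m intervals: the farther one appears smaller,
# so the ratio falls below 1 under a compressive depth function.
print(interval_ratio(2.0, 4.0, 6.0))
```

An observer matching the two intervals would therefore have to make the far interval physically larger than the near one, which is the signature of compression the abstract describes.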

    Virtual Odometry From Visual Flow

    No full text
    We investigate how visual motion registered during one's own movement through a structured world can be used to gauge travel distance. Estimating absolute travel distance from the visual flow induced in the optic array of a moving observer is problematic because optic flow speeds co-vary with the dimensions of the environment and are thus subject to an environment-specific scale factor. Discrimination of the distances of two simulated self-motions of different speed and duration is reliably possible from optic flow, however, if the visual environment is the same for both motions, because the scale factors cancel in this case. 1, 2 Here, we ask whether a distance estimate obtained from optic flow can be transformed into a spatial interval in the same visual environment. Subjects viewed a simulated self-motion sequence on a large (90 by 90 deg) projection screen or in a computer-animated virtual environment (CAVE) with completely immersive, stereographic, head-yoked projection that extended 180 deg horizontally and included the floor space in front of the observer. The sequence depicted self-motion over a ground plane covered with random dots. Simulated distances ranged from 1.5 to 13 meters with variable speed and duration of the movement. After the movement stopped, the screen depicted a stationary view of the scene and two horizontal lines appeared on the ground in front of the observer. The subject had to adjust one of these lines such that the spatial interval between the lines matched the distance traveled during the movement.
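The scale-factor argument above can be made concrete with a minimal sketch. For motion over a ground plane, the flow speed of a ground point scales with translational speed divided by eye height, so integrated flow measures traveled distance divided by eye height: absolute distance is confounded with that unknown scale factor, but the factor cancels when two motions in the same scene are compared. The names and the simple flow model below are illustrative assumptions, not the authors' analysis.

```python
import numpy as np

def integrated_flow(speed_profile, dt, eye_height):
    """Integrate ground-plane flow speed over a simulated self-motion.
    speed_profile: translational speeds (m/s) per time step of length dt.
    Flow speed is modeled as v / eye_height, so the integral equals
    traveled distance divided by eye_height (a toy model)."""
    flow = np.asarray(speed_profile) / eye_height
    return float(np.sum(flow) * dt)

dt = 0.1
h = 1.6                                # scale factor, unknown to the observer
motion_a = np.full(50, 2.0)            # 2 m/s for 5 s -> 10 m traveled
motion_b = np.full(40, 1.0)            # 1 m/s for 4 s -> 4 m traveled

flow_a = integrated_flow(motion_a, dt, h)   # equals 10 / h, not 10
flow_b = integrated_flow(motion_b, dt, h)   # equals 4 / h, not 4

# The absolute estimates are off by the unknown factor 1/h, but the
# ratio is exactly 10/4: eye height cancels within one environment.
print(flow_a / flow_b)
```

This is why distance discrimination between two motions in the same scene works, while converting a single flow-based estimate into an absolute spatial interval requires resolving the scale factor, which is what the experiment probes.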

    Virtual odometry from visual flow

    No full text