17 research outputs found

    Human recovery of shape from profiles

    A single profile of a solid object contains much information about the shape of the object. Viewing the changing profiles of a moving object provides even greater information about the shape of the object. Few computational models of this process have been applied to the human ability to recover the shape and motion of solid objects from their changing profiles. We propose a theory that relates measurable quantities of changing two-dimensional (2-D) profiles to structural properties of three-dimensional (3-D) surfaces in motion. The relevance of this theory to human perception is shown by relating theoretical predictions to existing psychophysical results as well as to additional demonstrations of human recovery of shape from profiles.
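    The abstract's core observation — that a single profile constrains shape, and changing profiles constrain it further — can be illustrated with a minimal sketch. The ellipsoid example and orthographic projection below are illustrative assumptions, not the authors' model:

```python
import numpy as np

def profile_width(a, b, c, theta, n=400):
    """Width of the 2-D profile of an ellipsoid with semi-axes
    (a, b, c), viewed along z after rotating it by theta about y.
    Points are sampled on the surface and projected orthographically."""
    u = np.linspace(0.0, 2 * np.pi, n)
    v = np.linspace(0.0, np.pi, n)
    uu, vv = np.meshgrid(u, v)
    x = a * np.sin(vv) * np.cos(uu)
    z = c * np.cos(vv)
    # Orthographic projection along z: only the rotated x coordinate
    # contributes to the horizontal extent of the profile.
    x_rot = x * np.cos(theta) + z * np.sin(theta)
    return x_rot.max() - x_rot.min()

# A single view reveals only two semi-axes; rotating the object
# changes the profile and thereby exposes the third semi-axis.
w0 = profile_width(2.0, 1.0, 0.5, 0.0)         # ~2*a
w45 = profile_width(2.0, 1.0, 0.5, np.pi / 4)  # ~2*sqrt((a^2 + c^2) / 2)
```

    The analytic profile width at angle theta is 2·sqrt(a²cos²θ + c²sin²θ), so the change in width across views is a measurable 2-D quantity that carries information about a 3-D structural property (the hidden semi-axis c), in the spirit of the abstract.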

    An Analysis of the Motion Signal Distributions Emerging from Locomotion through a Natural Environment

    Some 50 years have passed since Gibson drew attention to the characteristic field of velocity vectors generated on the retina when an observer is moving through the three-dimensional world. Many theoretical, psychophysical, and physiological studies have demonstrated the use of such optic flowfields for a number of navigational tasks under laboratory conditions, but little is known about the actual flowfield structure under natural operating conditions. To study the motion information available to the visual system in the real world, we moved a panoramic imaging device outdoors on accurately defined paths and simulated a biologically inspired motion detector network to analyse the distribution of motion signals. We found that motion signals are sparsely distributed in space and that local directions can be ambiguous and noisy. Spatial or temporal integration would be required to retrieve reliable information on the local motion vectors. Nevertheless, a surprisingly simple algorithm can retrieve rather accurately the direction of heading from sparse and noisy motion signal maps without the need for such pooling. Our approach thus may help to assess the role of specific environmental and computational constraints in natural optic flow processing.
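    "Biologically inspired motion detector" in this context typically means a correlation-type elementary motion detector. The sketch below is a minimal Reichardt-style correlator assumed for illustration (one-frame delay, nearest-neighbour spacing, arbitrary grating parameters), not the specific network simulated in the paper:

```python
import numpy as np

def grating(nx, nt, k=0.5, w=0.3):
    """Drifting sine grating I(x, t) = sin(k*x - w*t); for w > 0 the
    pattern moves toward +x as t increases."""
    x = np.arange(nx)[None, :]
    t = np.arange(nt)[:, None]
    return np.sin(k * x - w * t)

def reichardt(frames):
    """Correlation-type elementary motion detector: each unit multiplies
    the delayed signal of one photoreceptor with the undelayed signal of
    its neighbour, and subtracts the mirror-symmetric product. The
    'delay' here is simply one frame."""
    cur, prev = frames[1:], frames[:-1]
    left_cur, right_cur = cur[:, :-1], cur[:, 1:]
    left_prev, right_prev = prev[:, :-1], prev[:, 1:]
    return left_prev * right_cur - right_prev * left_cur

# Mean detector output is positive for rightward motion and negative
# for leftward motion; on sparse natural input, individual local
# outputs remain noisy, which is the point made in the abstract.
rightward = reichardt(grating(60, 40, w=0.3)).mean()
leftward = reichardt(grating(60, 40, w=-0.3)).mean()
```

    Only the sign of the averaged output is reliable here; per-pixel outputs oscillate, which is consistent with the abstract's finding that local motion signals are ambiguous without spatial or temporal pooling.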

    Visual processing in Free Flight

    Egelhaaf M. Visual processing in Free Flight. In: Jaeger D, Jung R, eds. Encyclopedia of Computational Neuroscience. New York: Springer Science+Business Media; 2015: 3180.

    With their miniature brains, many insect groups are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with stationary obstacles as well as moving objects, landing on environmental structures, pursuing rapidly moving animals, or localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such tasks these insects outperform man-made autonomous flying systems, especially if computational costs and energy efficiency are taken as benchmarks. To accomplish their extraordinary performance, several insect groups have been shown to actively shape the dynamics of the image flow on their eyes ("optic flow") by the characteristic way they move when solving behavioral tasks. The neural processing of spatial information is greatly facilitated, for instance, by segregating the rotational from the translational optic flow component by way of a saccadic flight and gaze strategy. Flying insects acquire at least part of their strength as autonomous systems through active interactions with their environment, which lead to adaptive behavior in surroundings of a wide range of complexity. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually-guided flight control might be helpful in finding technical solutions.
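    Why the saccadic flight and gaze strategy helps follows from the standard spherical-camera optic flow equation: the rotational flow component is independent of distance, while the translational component scales with inverse distance, so rotation-free intersaccadic intervals yield flow that carries spatial information. The sketch below uses that generic flow model as an assumption for illustration; it is not the specific mechanism described in this encyclopedia entry:

```python
import numpy as np

def optic_flow(d, omega, trans, dist):
    """Image motion of a scene point at distance `dist` along the unit
    viewing direction `d`, for observer self-rotation `omega` and
    self-translation `trans` (spherical camera model)."""
    d = np.asarray(d, float)
    d = d / np.linalg.norm(d)
    rot = -np.cross(omega, d)              # distance-independent term
    t_perp = trans - np.dot(trans, d) * d  # translation component normal to d
    return rot - t_perp / dist

d = np.array([0.0, 1.0, 0.0])    # viewing direction: sideways
fwd = np.array([1.0, 0.0, 0.0])  # self-motion: straight ahead

# Pure translation (intersaccadic interval): nearby objects produce
# larger flow than distant ones, so the flow encodes distance.
near = optic_flow(d, np.zeros(3), fwd, dist=1.0)
far = optic_flow(d, np.zeros(3), fwd, dist=10.0)

# Pure rotation (saccade): flow is identical at every distance and
# therefore carries no spatial information.
r1 = optic_flow(d, np.array([0.0, 0.0, 1.0]), np.zeros(3), dist=1.0)
r2 = optic_flow(d, np.array([0.0, 0.0, 1.0]), np.zeros(3), dist=10.0)
```

    Confining rotation to brief saccades thus leaves the remaining flight time with purely translational, distance-dependent flow, which is the computational benefit the entry attributes to the saccadic strategy.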