This paper presents experiments with a real-time navigation system driven by two cameras pointed laterally with respect to the direction of travel (divergent stereo). The approach is based on the observation that the stereo set-up traditionally used in vision (i.e. with the optical axes pointing forward) may not be the best one for navigation, and in particular for the continuous control of a mobile agent moving in an unconstrained environment. Similarly to what has been proposed in [5, 3], our view [8, 10] is that for navigation purposes the driving information is not distance (as obtainable from a stereo set-up) but motion, and more precisely optical-flow information computed over different areas of the visual field. Following this idea, a mobile vehicle has been equipped with a pair of laterally looking cameras (much like honeybees), and a controller based on fast, real-time computation of optical flow has been implemented. The control of the mobile robot (ROBEE) is based upon the comparison …
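The honeybee analogy suggests a flow-balance (centering) strategy: the side of the visual field that exhibits larger optical flow corresponds to the nearer surface, so the vehicle should turn away from it. The abstract does not give the controller's exact form, but a minimal sketch of such a comparison, assuming dense flow fields from the two lateral cameras and a hypothetical normalised imbalance law, could look like this:

```python
import numpy as np

def mean_flow_magnitude(flow):
    """Mean magnitude of a dense optical-flow field of shape (H, W, 2)."""
    return float(np.mean(np.linalg.norm(flow, axis=-1)))

def balance_steering(flow_left, flow_right, gain=1.0):
    """Hypothetical flow-balance steering law (not the paper's controller).

    Larger flow on one side implies a closer surface on that side, so the
    command turns the vehicle away from it (sign convention: positive means
    turn left). Dividing by the total flow makes the command insensitive
    to the overall forward speed.
    """
    fl = mean_flow_magnitude(flow_left)
    fr = mean_flow_magnitude(flow_right)
    total = fl + fr
    if total == 0.0:
        return 0.0  # no flow measured: go straight
    return gain * (fr - fl) / total
```

For instance, if the right camera sees twice the flow of the left one, the command is positive (turn left, away from the nearer right-hand surface); a balanced flow field yields a zero command, keeping the vehicle centred.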


This paper was published in CiteSeerX.
