Near range path navigation using LGMD visual neural networks

Abstract

In this paper, we propose a method for near range path navigation of a mobile robot using a pair of biologically inspired visual neural networks – lobula giant movement detectors (LGMDs). In the proposed binocular-style visual system, each LGMD processes images covering a part of the wide field of view and extracts relevant visual cues as its output. The outputs of the two LGMDs are compared and translated into executable motor commands to control the wheels of the robot in real time. A stronger signal from the LGMD on one side pushes the robot away from that side step by step; therefore, the robot can navigate a visual environment naturally with the proposed vision system. Our experiments showed that this bio-inspired system worked well in different scenarios.
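The sketch below illustrates the control loop described in the abstract: split the wide field of view between two detectors, compare their responses, and steer away from the stronger side via differential wheel speeds. It is only a minimal illustration, not the paper's implementation; the simplified luminance-change stand-in for a full LGMD network and the names and parameters (lgmd_output, wheel_commands, base_speed, gain) are assumptions made for this example.

```python
import numpy as np

def lgmd_output(prev_half, curr_half):
    # Hypothetical stand-in for a full LGMD network: the mean absolute
    # luminance change over the half of the field of view it covers.
    return float(np.mean(np.abs(curr_half.astype(np.float32) -
                                prev_half.astype(np.float32))))

def wheel_commands(left_lgmd, right_lgmd, base_speed=0.3, gain=0.5):
    # A stronger response on one side pushes the robot away from that side:
    # if left_lgmd > right_lgmd, the left wheel speeds up and the right
    # wheel slows down, turning the robot to the right.
    diff = left_lgmd - right_lgmd
    left_wheel = base_speed + gain * diff
    right_wheel = base_speed - gain * diff
    return left_wheel, right_wheel

def control_step(prev_frame, curr_frame):
    # One step of the binocular-style loop: each LGMD sees one half of the
    # wide field of view, and the compared outputs become wheel speeds.
    mid = curr_frame.shape[1] // 2
    left = lgmd_output(prev_frame[:, :mid], curr_frame[:, :mid])
    right = lgmd_output(prev_frame[:, mid:], curr_frame[:, mid:])
    return wheel_commands(left, right)
```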
