Insect-inspired visual navigation for flying robots
This paper discusses the implementation of insect-inspired visual navigation strategies in flying robots, in particular focusing on the impact of changing height. We start by assessing the information available at different heights for visual homing in natural environments, comparing results from an open environment against one where trees and bushes are closer to the camera. We then test a route following algorithm using a gantry robot and show that a robot would be able to successfully navigate a route at a variety of heights using images saved at a different height
A contribution to vision-based autonomous helicopter flight in urban environments
A navigation strategy that exploits the optic flow and inertial information to continuously avoid collisions with both lateral and frontal obstacles has been used to control a simulated helicopter flying autonomously in a textured urban environment. Experimental results demonstrate that the corresponding controller generates cautious behavior, whereby the helicopter tends to stay in the middle of narrow corridors, while its forward velocity is automatically reduced when the obstacle density increases. When confronted with a frontal obstacle, the controller is also able to generate a tight U-turn that ensures the UAV’s survival. The paper provides comparisons with related work, and discusses the applicability of the approach to real platforms
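The centring and speed-regulation behaviour described above can be sketched as a simple optic-flow balancing controller. The function name, gains, and sign convention below are illustrative assumptions, not the paper's actual control law:

```python
def of_avoidance_command(flow_left, flow_right, v_max=3.0, k_yaw=0.8, k_v=0.5):
    """Illustrative optic-flow balancing controller (hypothetical gains).

    Turns away from the side with the higher translational optic flow
    (closer obstacles generate faster image motion) and slows down as
    the total flow, a proxy for obstacle density, increases.
    """
    # positive yaw_rate steers right, away from a faster-moving left side
    yaw_rate = k_yaw * (flow_left - flow_right)
    # forward speed shrinks as summed lateral flow grows
    forward_v = v_max / (1.0 + k_v * (flow_left + flow_right))
    return yaw_rate, forward_v
```

With equal flow on both sides the yaw command vanishes and the helicopter keeps to the corridor midline; an imbalance steers it away from the nearer wall while the speed term produces the cautious slow-down in clutter.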
Insect-Inspired Visual Perception for Flight Control and Collision Avoidance
Flying robots are increasingly used for tasks such as aerial mapping, fast exploration, video footage and monitoring of buildings.
Autonomous flight at low altitude in cluttered and unknown environments is an active research topic because it poses challenging perception and control problems.
Traditional methods for collision-free navigation at low altitude require heavy resources to deal with the complexity of natural environments, something that limits the autonomy and the payload of flying robots.
Flying insects, however, are able to navigate safely and efficiently using vision as the main sensory modality.
Flying insects rely on low resolution, high refresh rate, and wide-angle compound eyes to extract angular image motion and move in unstructured environments.
These strategies result in systems that are physically and computationally lighter than those often found in high-definition stereovision.
Taking inspiration from insects offers great potential for building small flying robots capable of navigating in cluttered environments using lightweight vision sensors.
In this thesis, we investigate insect perception of visual motion and insect vision-based flight control in cluttered environments.
We use the knowledge gained through the modelling of neural circuits and behavioural experiments to develop flying robots with insect-inspired control strategies for goal-oriented navigation in complex environments.
We start by exploring insect perception of visual motion.
We present a study that reconciles an apparent contradiction in the literature for insect visual control: current models developed to explain insect flight behaviour rely on the measurement of optic flow, however the most prominent neural model for visual motion extraction (the Elementary Motion Detector, or EMD) does not measure optic flow.
We propose a model for unbiased optic flow estimation that relies on comparing the output of multiple EMDs pointed in varying viewing directions.
Our model is of interest to both engineers and biologists because it is computationally more efficient than other optic flow estimation algorithms, and because it represents a biologically plausible model for optic flow extraction in insect neural systems.
We then focus on insect flight control strategies in the presence of obstacles.
By recording the trajectories of bumblebees (Bombus terrestris), and by comparing them to simulated flights, we show that bumblebees rely primarily on the frontal part of their field of view, and that they pool optic flow in two different manners for the control of flight speed and of lateral position.
For the control of lateral position, our results suggest that bumblebees selectively react to the portions of the visual field where optic flow is the highest, which correspond to the closest obstacles.
Finally, we tackle goal-oriented navigation with a novel algorithm that combines aspects of insect perception and flight control presented in this thesis -- such as the detection of the fastest-moving objects in the frontal visual field -- with other aspects of insect flight known from the literature, such as the saccadic flight pattern.
Through simulations, we demonstrate autonomous navigation in forest-like environments using only local optic flow information and assuming knowledge about the direction to the navigation goal
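The EMD discussed in this abstract is classically modelled as a Hassenstein-Reichardt correlator. A minimal sketch is shown below; the delay stage is approximated here by a crude one-sample shift rather than the low-pass filter of the full model, and the function name is illustrative:

```python
import numpy as np

def emd_response(luminance, tau=1):
    """Minimal Hassenstein-Reichardt elementary motion detector (EMD).

    luminance: 2-D array of shape (time, photoreceptor index).
    Each photoreceptor's delayed signal is correlated with its
    neighbour's current signal; subtracting the two mirror-symmetric
    arms yields a direction-selective (but speed-biased) response.
    """
    delayed = np.roll(luminance, tau, axis=0)  # crude delay stage
    resp = (delayed[:, :-1] * luminance[:, 1:]      # preferred-direction arm
            - luminance[:, :-1] * delayed[:, 1:])   # anti-preferred arm
    resp[:tau] = 0.0  # discard samples contaminated by the wrap-around
    return resp
```

A stimulus drifting in the preferred direction drives the first correlation arm harder than the second, so the summed response is positive; motion in the opposite direction flips the sign. The output scales with contrast and pattern properties as well as velocity, which is precisely why it does not measure optic flow directly, as the abstract notes.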
Near range path navigation using LGMD visual neural networks
In this paper, we propose a method for near range path navigation for a mobile robot using a pair of biologically inspired visual neural networks, the lobula giant movement detector (LGMD). In the proposed binocular-style visual system, each LGMD processes images covering part of the wide field of view and extracts relevant visual cues as its output. The outputs of the two LGMDs are compared and translated into executable motor commands to control the wheels of the robot in real time. A stronger signal from the LGMD on one side pushes the robot away from that side step by step; the robot can therefore navigate a visual environment naturally with the proposed vision system. Our experiments showed that this bio-inspired system worked well in different scenarios
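The comparison of the two LGMD outputs can be sketched as a differential-drive steering rule. The function name and gains below are hypothetical, intended only to illustrate the described behaviour:

```python
def lgmd_steering(lgmd_left, lgmd_right, base_speed=0.3, gain=0.4):
    """Map a pair of LGMD excitation levels (0..1) to differential
    wheel speeds (hypothetical gains).

    A stronger collision signal from the left LGMD speeds up the left
    wheel and slows the right one, turning the robot right, i.e. away
    from the side where looming is detected.
    """
    diff = lgmd_left - lgmd_right
    return base_speed + gain * diff, base_speed - gain * diff  # (v_left, v_right)
```

When both LGMDs are equally excited the wheel speeds match and the robot drives straight; any imbalance steers it incrementally away from the more strongly stimulated side, which is the step-by-step avoidance described in the abstract.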
Exploring the robustness of insect-inspired visual navigation for flying robots
Having previously developed and tested insect-inspired visual navigation algorithms for ground-based agents, we here investigate their robustness when applied to agents moving in three dimensions, to assess whether they are applicable to both flying insects and robots, focusing on the impact and potential utility of changes in height. We first demonstrate that a robot implementing a route navigation algorithm can successfully navigate a route through an indoor environment at a variety of heights, even using images saved at different heights. We show that, in our environments, the efficacy of route navigation increases with height, and also that information transfers better when images learnt at a greater height are used to navigate while flying lower than the other way around. This suggests that there is perhaps an adaptive value to the storing and use of views from increased height. To assess the limits of this result, we show that it is possible for a ground-based robot to recover the correct heading when using goal images stored from the perspective of a quadcopter. Through the robustness of this bio-inspired algorithm, we thus demonstrate the benefits of the ALife approach
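Heading recovery from a stored goal image, as used in this line of work, is typically done by comparing the current panoramic view against the snapshot at every candidate rotation. The sketch below uses a mean-squared pixel difference; the function name and metric are illustrative assumptions, not the authors' exact method:

```python
import numpy as np

def recover_heading(current, snapshot):
    """Rotational image difference: roll the panoramic current view one
    column at a time and return the rotation (in degrees) that minimises
    the mean squared pixel difference to the stored snapshot.
    Both inputs are 2-D arrays of shape (rows, panoramic columns)."""
    n_cols = current.shape[1]
    diffs = [np.mean((np.roll(current, s, axis=1) - snapshot) ** 2)
             for s in range(n_cols)]
    return 360.0 * int(np.argmin(diffs)) / n_cols
```

Because the comparison is panoramic, the minimum of the difference function survives moderate changes in viewpoint, which is what makes it plausible that views stored at one height remain usable at another.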
Bioinspired engineering of exploration systems for NASA and DoD
A new approach called bioinspired engineering of exploration systems (BEES) and its value for solving pressing NASA and DoD needs are described. Insects (for example honeybees and dragonflies) cope remarkably well with their world, despite possessing a brain containing less than 0.01% as many neurons as the human brain. Although most insects have immobile eyes with fixed focus optics and lack stereo vision, they use a number of ingenious, computationally simple strategies for perceiving their world in three dimensions and navigating successfully within it. We are distilling selected insect-inspired strategies to obtain novel solutions for navigation, hazard avoidance, altitude hold, stable flight, terrain following, and gentle deployment of payload. Such functionality provides potential solutions for future autonomous robotic space and planetary explorers. A BEES approach to developing lightweight low-power autonomous flight systems should be useful for flight control of such biomorphic flyers for both NASA and DoD needs. Recent biological studies of mammalian retinas confirm that representations of multiple features of the visual world are systematically parsed and processed in parallel. Features are mapped to a stack of cellular strata within the retina. Each of these representations can be efficiently modeled in semiconductor cellular nonlinear network (CNN) chips. We describe recent breakthroughs in exploring the feasibility of the unique blending of insect strategies of navigation with mammalian visual search, pattern recognition, and image understanding into hybrid biomorphic flyers for future planetary and terrestrial applications. We describe a few future mission scenarios for Mars exploration, uniquely enabled by these newly developed biomorphic flyers
An Inexpensive Flying Robot Design for Embodied Robotics Research
Flying insects are capable of a wide-range of flight and cognitive behaviors which are not currently understood. The replication of these capabilities is of interest to miniaturized robotics, because they share similar size, weight, and energy constraints. Currently, embodiment of insect behavior is primarily done on ground robots which utilize simplistic sensors and have different constraints to flying insects. This limits how much progress can be made on understanding how biological systems fundamentally work. To address this gap, we have developed an inexpensive robotic solution in the form of a quadcopter aptly named BeeBot. Our work shows that BeeBot can support the necessary payload to replicate the sensing capabilities which are vital to bees' flight navigation, including chemical sensing and a wide visual field-of-view. BeeBot is controlled wirelessly in order to process this sensor data off-board; for example, in neural networks. Our results demonstrate the suitability of the proposed approach for further study of the development of navigation algorithms and of embodiment of insect cognition
Taking Inspiration from Flying Insects to Navigate inside Buildings
These days, flying insects are seen as genuinely agile micro air vehicles fitted with smart sensors and also parsimonious in their use of brain resources. They are able to visually navigate in unpredictable and GPS-denied environments. Understanding how such tiny animals work would help engineers to figure out different issues relating to drone miniaturization and navigation inside buildings. To turn a ~1 kg drone into a robot, miniaturized conventional avionics can be employed; however, this results in a loss of flight autonomy. On the other hand, turning a drone with a mass between ~1 g (or less) and ~500 g into a robot requires an innovative approach taking inspiration from flying insects, both with regard to their flapping-wing propulsion system and their sensory system, based mainly on motion vision, in order to avoid obstacles in three dimensions or to navigate on the basis of visual cues. This chapter will provide a snapshot of the current state of the art in the field of bioinspired optic flow sensors and optic flow-based direct feedback loops applied to micro air vehicles flying inside buildings
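One of the optic flow-based direct feedback loops mentioned here can be illustrated with a ventral-flow regulator in the spirit of published optic-flow regulation schemes: holding the downward-looking flow at a setpoint makes height track forward speed. All names, gains, and values below are illustrative assumptions:

```python
def settle_height(v=2.0, of_setpoint=1.0, h0=5.0, k=1.0, dt=0.1, steps=300):
    """Terrain-following sketch with hypothetical gains.

    Ventral optic flow ~ v / h (rad/s) for forward speed v and height h.
    A proportional loop on the flow error commands vertical speed, so
    height settles where v / h equals the setpoint (here 2 m / 1 rad/s).
    """
    h = h0
    for _ in range(steps):
        flow = v / h                          # perceived ventral optic flow
        h += k * (flow - of_setpoint) * dt    # climb when flow is too high
    return h
```

Starting too high, the flow is below the setpoint and the vehicle descends until the flow error vanishes; the same loop makes the vehicle climb over rising terrain without ever measuring height explicitly, which is the appeal of such parsimonious feedback loops for small indoor flyers.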