11 research outputs found

    Angular Velocity Estimation of Image Motion Mimicking the Honeybee Tunnel Centring Behaviour

    Insects use visual information to estimate the angular velocity of retinal image motion, which underlies a variety of flight behaviours including speed regulation, tunnel centring and visual navigation. When estimating angular velocity, honeybees show a large degree of independence from the spatial structure of the visual stimulus, an ability that previous models have not reproduced. To address this issue, we propose a biologically plausible model for estimating image motion velocity, based on behavioural experiments with honeybees flying through patterned tunnels. The proposed model consists of three main parts: a texture estimation layer for spatial information extraction, a delay-and-correlate layer for temporal information extraction, and a decoding layer for angular velocity estimation. The model produces responses that are largely independent of spatial frequency in grating experiments, and it has been implemented in a virtual bee for tunnel centring simulations. The results coincide with both electrophysiological neuron spike recordings and behavioural path recordings, indicating that our proposed method provides a better explanation of the honeybee's image motion detection mechanism guiding tunnel centring behaviour.
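The delay-and-correlate stage described above follows the classic Hassenstein-Reichardt elementary motion detector. A minimal sketch of that scheme (the function name and the fixed-lag delay are illustrative, not the paper's implementation):

```python
def hr_emd(left, right, delay=1):
    """Delay-and-correlate elementary motion detector (EMD) sketch.

    `left` and `right` are luminance samples over time from two
    neighbouring photoreceptors. Each arm multiplies the delayed
    signal of one receptor with the undelayed signal of its
    neighbour; subtracting the mirror-symmetric arm yields a
    direction-selective response (positive for left-to-right motion).
    """
    response = 0.0
    for t in range(delay, len(left)):
        response += left[t - delay] * right[t] - right[t - delay] * left[t]
    return response
```

A bright feature seen first by the left receptor and then by the right one yields a positive response, while the reverse motion yields a negative one; this directional signal is what later layers can decode into an angular velocity estimate.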

    Bio-inspired Neural Networks for Angular Velocity Estimation in Visually Guided Flights

    Executing delicate flight manoeuvres using visual information is a major challenge for future robotic vision systems. As a source of inspiration, insects are adept at navigating through woods and landing on surfaces, tasks that require delicate visual perception and flight control. As revealed recently, the exquisite sensitivity of insects to image motion speed comes from a class of specific neurons called descending neurons, some of which demonstrate angular velocity selectivity as the image motion speed in the retina varies. Building a quantitative angular velocity detection model is the first step not only towards a further understanding of the biological visual system, but also towards providing robust and economical solutions for visual motion perception in artificial visual systems. This thesis explores biological image processing methods for motion speed detection in visually guided flights. The major contributions are summarised as follows. We present an angular velocity decoding model (AVDM), which estimates visual motion speed by combining both textural and temporal information from input signals. The model consists of three parts: elementary motion detection circuits, a wide-field texture estimation pathway and an angular velocity decoding layer. When first tested with moving sinusoidal gratings, the model estimates angular velocity well, with improved spatial frequency independence compared to state-of-the-art angular velocity detection models. This spatial independence is vital for accounting for the honeybee's flight behaviours. We also investigate the spatial and temporal resolutions of honeybees to obtain a biologically plausible parameter setting for explaining these behaviours. To investigate whether the model can account for observations of tunnel centring behaviour in honeybees, it has been implemented in a virtual bee simulated in the game engine Unity. The simulation results of a series of experiments show that the agent can adjust its position to fly through patterned tunnels by balancing the angular velocities estimated by both eyes under several circumstances. All tunnel simulations reproduce behaviours similar to those of real bees, indicating that our model provides a possible explanation of image velocity estimation and can be used to regulate an MAV's flight course in tunnels. Furthermore, to verify the robustness of the model, visually guided terrain following simulations have been carried out with a closed-loop control scheme that restores a preset angular velocity during flight. The simulated agent successfully flies over undulating terrain, verifying the feasibility and robustness of the AVDM in various application scenarios and showing its potential for micro aerial vehicle terrain following. In addition, we apply the AVDM to grazing landing using only visual information. An LGMD neuron is also introduced to avoid collision and to trigger the hover phase, which ensures a safe landing. Applying the honeybee's landing strategy of keeping a constant angular velocity, we design a closed-loop control scheme with an adaptive gain that controls the landing dynamics using the AVDM response as input. A series of controlled trials in the Unity platform demonstrates the effectiveness of the proposed model and control scheme for visual landing under various conditions. The proposed model could be implemented on real small robots in the near future to investigate its robustness in real landing scenarios.
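The tunnel centring strategy described above rests on a simple geometric fact: at forward speed v, a wall at lateral distance d produces an image angular velocity of roughly v / d, so balancing the estimates from the two eyes drives the agent toward the midline. A toy closed-loop sketch of this balance rule (the dynamics, gain and names are illustrative assumptions, not the thesis's Unity simulation):

```python
def simulate_tunnel_centring(y0, tunnel_width=2.0, speed=1.0,
                             gain=0.3, steps=50):
    """Steer a point agent toward the tunnel midline by balancing
    left/right angular velocity estimates.

    y is the lateral offset from the midline (positive toward the
    left wall). The closer wall produces faster image motion, so each
    step moves the agent away from the side with the larger estimate.
    """
    y = y0
    for _ in range(steps):
        omega_left = speed / (tunnel_width / 2 - y)   # left-eye estimate
        omega_right = speed / (tunnel_width / 2 + y)  # right-eye estimate
        y -= gain * (omega_left - omega_right)        # steer toward balance
    return y
```

Starting half-way toward either wall, the offset decays toward zero; on a real MAV the same balance error would feed a lateral velocity command rather than a direct position update.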

    Insect-Inspired Visual Perception for Flight Control and Collision Avoidance

    Flying robots are increasingly used for tasks such as aerial mapping, fast exploration, video footage and monitoring of buildings. Autonomous flight at low altitude in cluttered and unknown environments is an active research topic because it poses challenging perception and control problems. Traditional methods for collision-free navigation at low altitude require heavy resources to deal with the complexity of natural environments, which limits the autonomy and the payload of flying robots. Flying insects, however, are able to navigate safely and efficiently using vision as the main sensory modality. Flying insects rely on low-resolution, high-refresh-rate, wide-angle compound eyes to extract angular image motion and move in unstructured environments. These strategies result in systems that are physically and computationally lighter than those often found in high-definition stereovision. Taking inspiration from insects offers great potential for building small flying robots capable of navigating cluttered environments using lightweight vision sensors. In this thesis, we investigate insect perception of visual motion and insect-vision-based flight control in cluttered environments. We use the knowledge gained through the modelling of neural circuits and behavioural experiments to develop flying robots with insect-inspired control strategies for goal-oriented navigation in complex environments. We start by exploring insect perception of visual motion. We present a study that reconciles an apparent contradiction in the literature on insect visual control: current models developed to explain insect flight behaviour rely on the measurement of optic flow; however, the most prominent neural model for visual motion extraction (the Elementary Motion Detector, or EMD) does not measure optic flow. We propose a model for unbiased optic flow estimation that relies on comparing the output of multiple EMDs pointed in varying viewing directions. Our model is of interest to both engineers and biologists because it is computationally more efficient than other optic flow estimation algorithms, and because it represents a biologically plausible model for optic flow extraction in insect neural systems. We then focus on insect flight control strategies in the presence of obstacles. By recording the trajectories of bumblebees (Bombus terrestris), and by comparing them to simulated flights, we show that bumblebees rely primarily on the frontal part of their field of view, and that they pool optic flow in two different manners for the control of flight speed and of lateral position. For the control of lateral position, our results suggest that bumblebees selectively react to the portions of the visual field where optic flow is highest, which correspond to the closest obstacles. Finally, we tackle goal-oriented navigation with a novel algorithm that combines aspects of insect perception and flight control presented in this thesis, such as the detection of the fastest-moving objects in the frontal visual field, with other aspects of insect flight known from the literature, such as the saccadic flight pattern. Through simulations, we demonstrate autonomous navigation in forest-like environments using only local optic flow information, assuming knowledge of the direction to the navigation goal.
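The finding that bumblebees react selectively to the visual-field regions with the highest optic flow suggests a very compact positioning rule. A hedged sketch of such a rule (the max pooling and the sign convention are assumptions for illustration, not the study's fitted controller):

```python
def lateral_command(flow_left, flow_right, gain=1.0):
    """Steer away from the hemifield containing the strongest optic flow.

    flow_left / flow_right are optic-flow magnitudes sampled across the
    frontal left and right visual fields. Because flow magnitude scales
    inversely with obstacle distance, the maximum in each hemifield
    marks the closest obstacle on that side; a positive command here
    means 'move right', away from a near obstacle on the left.
    """
    return gain * (max(flow_left) - max(flow_right))
```

Pooling with a maximum rather than a mean makes the controller react to the single closest obstacle, matching the selective reaction described above.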

    Biologically Inspired Visual Control of Flying Robots

    Insects possess an incredible ability to navigate their environment at high speed, despite having small brains and limited visual acuity. Through selective pressure they have evolved computationally efficient means for simultaneously performing navigation tasks and generating instantaneous control responses. The insect's main source of information is visual, and through a hierarchy of processes this information is used for perception: at the lowest level are local neurons for detecting image motion and edges; at the higher level are interneurons that spatially integrate the output of previous stages. These higher-level processes can be considered models of the insect's environment, reducing the amount of information to only that which evolution has determined relevant. The scope of this thesis is experimenting with biologically inspired visual control of flying robots through information processing, models of the environment, and flight behaviour. In order to test these ideas I developed a custom quadrotor robot and experimental platform, the 'wasp' system. All algorithms ran on the robot, in real time or better, and hypotheses were always verified with flight experiments. I developed a new optical flow algorithm that is computationally efficient and can be applied in a regular pattern to the image. This technique is used later in my work when considering patterns in the image motion field. Using optical flow in the log-polar coordinate system I developed attitude estimation and time-to-contact algorithms. I find that the log-polar domain is useful for analysing global image motion, and in many ways equivalent to the retinotopic arrangement of neurons in the optic lobe of insects, used for the same task. I investigated the role of depth in insect flight using two experiments. In the first experiment, to study how concurrent visual control processes might be combined, I developed a control system using the combined output of two algorithms. The first algorithm was a wide-field optical flow balance strategy, and the second an obstacle avoidance strategy which used inertial information to estimate the depth to objects in the environment, objects whose depth was significantly different from their surroundings. In the second experiment I created an altitude control system which used a model of the environment in Hough space, and a biologically inspired sampling strategy, to efficiently detect the ground. Both control systems were used to control the flight of a quadrotor in an indoor environment. The methods that insects use to perceive edges and control their flight in response had not previously been applied to artificial systems. I developed a quadrotor control system that used the distribution of edges in the environment to regulate the robot's height and avoid obstacles. I also developed a model that predicted the distribution of edges in a static scene, and using this prediction I was able to estimate the quadrotor's altitude.
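The time-to-contact estimation mentioned above exploits the fact that, during pure approach, every image feature expands radially at a rate proportional to its eccentricity. A minimal sketch of that relation (averaging over features is an illustrative choice, not the thesis's log-polar implementation):

```python
def time_to_contact(radii, radial_flows):
    """Estimate time-to-contact from radial (expansion) optic flow.

    During approach toward a frontal surface at constant speed, a
    feature at radial image position r moves outward at rate r / tau,
    where tau is the time-to-contact. Averaging r / r_dot over the
    expanding features gives a simple estimate of tau.
    """
    ratios = [r / v for r, v in zip(radii, radial_flows) if v > 0]
    return sum(ratios) / len(ratios)
```

For example, features at eccentricities 1, 2 and 3 pixels expanding at 0.5, 1.0 and 1.5 pixels per second all imply the same two seconds to contact.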

    Evolution in 3D

    This thesis explores the mechanisms underlying motion vision in the praying mantis (Sphodromantis lineola) and how this visual predator perceives camouflaged prey. By recording the mantis optomotor response to wide-field motion I was able to define the mantis Dmax, the point where a pattern is displaced by such a distance that coherent motion is no longer perceived. This allowed me to investigate the spatial characteristics of the insect wide-field motion processing pathway. The insect Dmax was found to be very similar to that observed in humans, which suggests similar underlying motion processing mechanisms, whereby low spatial frequency local motion is pooled over a larger visual area than higher spatial frequency motion. By recording the mantis tracking response to computer-generated targets, I was able to investigate whether there are any benefits of background matching when prey are moving, and whether pattern influences the predatory response of the mantis towards prey. I found that only prey with large pattern elements benefit from background matching during movement; above all, prey which remain un-patterned but match the mean luminance of the background receive the greatest survival advantage. Additionally, I examined the effects of background motion on the tracking response of the mantis towards moving prey. By using a computer-generated target as prey, I investigated the benefits associated with matching background motion as a protective strategy to reduce the risk of detection by predators. I found the mantis was able to successfully track a moving target in the presence of background motion. My results suggest that although there are no overall benefits for prey in matching background motion, it is costly to move out of phase with the background motion. Finally, I examined the contrast sensitivity of the mantis wide-field and small-target motion detection pathways. Using the mantis tracking response to small targets and the optomotor response to wide-field motion, I measured the distinct temporal and spatial signatures of each pathway. I found the mantis wide-field and small-target movement detecting pathways are each tuned to a different set of spatial and temporal frequencies. The wide-field motion detecting pathway has high sensitivity to a broad range of spatio-temporal frequencies, making it sensitive to a broad range of velocities, whereas the small-target motion detection pathway has high sensitivity to a narrow set of spatio-temporal combinations, with optimal sensitivity to targets with a low spatial frequency.

    Biologically Inspired Vision and Control for an Autonomous Flying Vehicle

    This thesis makes a number of new contributions to control and sensing for unmanned vehicles. I begin by developing a non-linear simulation of a small unmanned helicopter and then proceed to develop new algorithms for control and sensing using the simulation. The work is field-tested in successful flight trials of biologically inspired vision and neural network control for an unstable rotorcraft. The techniques are more robust and more easily implemented on a small flying vehicle than previously attempted methods.

    Proceedings of the International Micro Air Vehicles Conference and Flight Competition 2017 (IMAV 2017)

    The IMAV 2017 conference was held at ISAE-SUPAERO, Toulouse, France, from Sept. 18 to Sept. 21, 2017. More than 250 participants from 30 different countries presented their latest research activities in the field of drones. 38 papers were presented during the conference, covering topics such as Aerodynamics, Aeroacoustics, Propulsion, Autopilots, Sensors, Communication systems, Mission planning techniques, Artificial Intelligence, and Human-machine cooperation as applied to drones.

    Aerial Vehicles

    This book contains 35 chapters written by experts in developing techniques for making aerial vehicles more intelligent, more reliable, more flexible in use, and safer in operation. It will also serve as an inspiration for further improvement of the design and application of aerial vehicles. The advanced techniques and research described here may also be applicable to other high-tech areas such as robotics, avionics, vetronics, and space.