Towards Computational Models and Applications of Insect Visual Systems for Motion Perception: A Review
Motion perception is a critical capability determining many aspects of insects' lives, including avoiding predators and foraging. A good number of motion detectors have been identified in insect visual pathways. Computational modelling of these motion detectors has not only provided effective solutions for artificial intelligence, but has also benefited the understanding of complex biological visual systems. These biological mechanisms, shaped over millions of years of evolution, form solid modules for constructing dynamic vision systems for future intelligent machines. This article reviews the computational motion perception models in the literature that originate from biological research on insect visual systems. These motion perception models or neural networks comprise the looming-sensitive neuronal models of lobula giant movement detectors (LGMDs) in locusts, the translation-sensitive neural systems of direction selective neurons (DSNs) in fruit flies, bees and locusts, as well as the small target motion detectors (STMDs) in dragonflies and hoverflies. We also review the applications of these models to robots and vehicles. Through these modelling studies, we summarise the methodologies that generate different direction and size selectivity in motion perception. Finally, we discuss multiple-systems integration and hardware realisation of these bio-inspired motion perception models.
Analysis of the neural circuit underlying the detection of visual motion in Drosophila melanogaster.
Descending premotor target tracking systems in flying insects
The control of behaviour in all animals requires efficient transformation of sensory signals into the task-specific activation of muscles. Predation offers an advantageous model behaviour to study the computational organisation underlying sensorimotor control. Predators are optimised through diverse evolutionary arms races to outperform their prey in terms of sensorimotor coordination, leading to highly specialised anatomical adaptations and hunting behaviours, which are often innate and highly stereotyped. Predatory flying insects present an extreme example, performing complex visually-guided pursuits of small, often fast flying prey over extremely small timescales. Furthermore, this behaviour is controlled by a tiny nervous system, leading to pressure on neuronal organisation to be optimised for coding efficiency.
In dragonflies, a population of eight pairs of bilaterally symmetric Target Selective Descending Neurons (TSDNs) relay visual information about small moving objects from the brain to the thoracic motor centres. These neurons encode the movement of small moving objects across the dorsal fovea region of the eye which is fixated on prey during predatory pursuit, and are thought to constitute the commands necessary for actuating an interception flight path. TSDNs are characterised by their receptive fields, with responses of each TSDN type spatially confined to a specific portion of the dorsal fovea visual field and tuned to a specific direction of object motion. To date, little is known about the descending representations mediating target tracking in other insects. This dissertation presents a comparative report of descending neurons in a variety of flying insects. The results are organised into three chapters:
Chapter 3 identifies TSDNs in demoiselle damselflies and compares their response properties to those previously described in dragonflies. Demoiselle TSDNs are also found to integrate binocular information, which is further elaborated with prism and eyepatch experiments.
Chapter 4 describes TSDNs in two dipteran species, the robberfly Holcocephala fusca and the killerfly Coenosia attenuata.
Chapter 5 describes an interaction between small- and wide-field visual features in TSDNs of both predatory and nonpredatory dipterans, finding functional similarity of these neurons for prey capture and conspecific pursuit. Dipteran TSDN responses are suppressed by background motion in a direction-dependent manner, suggesting a control architecture in which target tracking and optomotor stabilization pathways operate in parallel during pursuit. Funded by the Biotechnology and Biological Sciences Research Council (BB/M011194/1).
Insect-Inspired Visual Perception for Flight Control and Collision Avoidance
Flying robots are increasingly used for tasks such as aerial mapping, fast exploration, capturing video footage, and monitoring of buildings.
Autonomous flight at low altitude in cluttered and unknown environments is an active research topic because it poses challenging perception and control problems.
Traditional methods for collision-free navigation at low altitude require heavy resources to deal with the complexity of natural environments, something that limits the autonomy and the payload of flying robots.
Flying insects, however, are able to navigate safely and efficiently using vision as the main sensory modality.
Flying insects rely on low resolution, high refresh rate, and wide-angle compound eyes to extract angular image motion and move in unstructured environments.
These strategies result in systems that are physically and computationally lighter than those based on high-definition stereo vision.
Taking inspiration from insects offers great potential for building small flying robots capable of navigating in cluttered environments using lightweight vision sensors.
In this thesis, we investigate insect perception of visual motion and insect-vision-based flight control in cluttered environments.
We use the knowledge gained through the modelling of neural circuits and behavioural experiments to develop flying robots with insect-inspired control strategies for goal-oriented navigation in complex environments.
We start by exploring insect perception of visual motion.
We present a study that reconciles an apparent contradiction in the literature for insect visual control: current models developed to explain insect flight behaviour rely on the measurement of optic flow, however the most prominent neural model for visual motion extraction (the Elementary Motion Detector, or EMD) does not measure optic flow.
We propose a model for unbiased optic flow estimation that relies on comparing the output of multiple EMDs pointed in varying viewing directions.
Our model is of interest to both engineers and biologists because it is computationally more efficient than other optic flow estimation algorithms, and because it represents a biologically plausible model for optic flow extraction in insect neural systems.
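The correlation-type detector discussed above can be illustrated with a minimal sketch of a Hassenstein-Reichardt EMD, the classic model the thesis refers to. The filter constant, signal shapes, and function names below are illustrative assumptions, not the thesis's implementation:

```python
# Minimal Hassenstein-Reichardt Elementary Motion Detector (EMD) sketch.
# Two neighbouring photoreceptor signals are compared; a first-order
# low-pass filter serves as the delay element of each detector arm.

def lowpass(signal, alpha=0.3):
    """First-order low-pass filter acting as the EMD delay element."""
    out, y = [], 0.0
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

def emd_response(a, b, alpha=0.3):
    """Correlation-type EMD: delayed A correlates with B, minus the
    mirror-symmetric arm. Positive output indicates motion from A to B."""
    da, db = lowpass(a, alpha), lowpass(b, alpha)
    return [da_t * b_t - db_t * a_t
            for da_t, b_t, db_t, a_t in zip(da, b, db, a)]

# A bright edge passing photoreceptor A, then B one time step later:
a = [0, 1, 1, 0, 0, 0]
b = [0, 0, 1, 1, 0, 0]
resp = emd_response(a, b)
print(sum(resp) > 0)  # net positive response: motion in the A -> B direction
```

Note that the summed EMD output depends on pattern contrast and temporal frequency rather than velocity alone, which is exactly the property that makes it a biased optic flow estimator and motivates comparing multiple EMDs with different viewing directions.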
We then focus on insect flight control strategies in the presence of obstacles.
By recording the trajectories of bumblebees (Bombus terrestris), and by comparing them to simulated flights, we show that bumblebees rely primarily on the frontal part of their field of view, and that they pool optic flow in two different manners for the control of flight speed and of lateral position.
For the control of lateral position, our results suggest that bumblebees selectively react to the portions of the visual field where optic flow is the highest, which correspond to the closest obstacles.
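The two pooling strategies described above can be sketched as simple control laws; the function names, gains, and set point are illustrative assumptions, not the authors' implementation:

```python
# Hedged sketch of two ways to pool optic flow, as suggested by the
# bumblebee experiments: averaging for speed control, and reacting to
# the per-side maximum (the closest obstacle) for lateral position.

def speed_command(frontal_flow, target_flow=1.0, gain=0.5):
    """Speed control: pool flow by averaging over the frontal field,
    decelerating when the average exceeds a set point."""
    avg = sum(frontal_flow) / len(frontal_flow)
    return gain * (target_flow - avg)

def lateral_command(flow_left, flow_right, gain=0.5):
    """Lateral-position control: react to the maximum flow on each side,
    steering away from the side with the higher peak (closest obstacle).
    Positive output means steer left."""
    return gain * (max(flow_right) - max(flow_left))

# Obstacle closer on the right (higher peak flow) -> steer left:
steer = lateral_command([0.2, 0.3], [0.9, 0.4])
```

The key contrast is that speed control uses an average (robust to individual obstacles), while lateral control uses a maximum (dominated by the nearest obstacle), matching the selective reaction reported in the abstract.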
Finally, we tackle goal-oriented navigation with a novel algorithm that combines aspects of insect perception and flight control presented in this thesis -- such as the detection of the fastest-moving objects in the frontal visual field -- with other aspects of insect flight known from the literature, such as the saccadic flight pattern.
Through simulations, we demonstrate autonomous navigation in forest-like environments using only local optic flow information and assuming knowledge of the direction to the navigation goal.
Biologically-Inspired Low-Light Vision Systems for Micro-Air Vehicle Applications
Various insect species, such as Megalopta genalis, are able to visually stabilize and navigate at light levels at which individual photoreceptors may receive fewer than ten photons per second. They do so in cluttered forest environments with astonishing success while relying heavily on optic flow estimation. Such capabilities are far from being met by current technology, in large part due to the limitations of low-light vision systems.
This dissertation presents a body of work that enhances the capabilities of visual sensing in photon-limited environments, with an emphasis on low-light optic flow detection. We discuss the design and characterization of two optical sensors fabricated using complementary metal-oxide-semiconductor (CMOS) very large scale integration (VLSI) technology. The first is a frame-based, low-light, photon-counting camera module with which we demonstrate 1-D non-directional optic flow detection with fewer than 100 photons/pixel/frame. The second utilizes adaptive analog circuits to improve room-temperature short-wave infrared sensing capabilities. This work demonstrates a reduction in dark current of nearly two orders of magnitude and an improvement in signal-to-noise ratio of nearly 40 dB when compared to similar, non-adaptive circuits. This dissertation also presents a novel simulation-based framework that enables benchmarking of optic flow algorithms in photon-limited environments. Using this framework, we compare the performance of traditional optic flow processing algorithms to biologically-inspired algorithms thought to be used by flying insects such as Megalopta genalis. This work serves to provide an understanding of what may be ultimately possible with optic flow sensors in low-light environments and informs the design of future low-light optic flow hardware.
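The core of a photon-limited benchmarking framework like the one described above is the imaging model: treating pixel intensities as mean photon arrival rates and drawing each frame as a Poisson sample of those rates. The following sketch shows that idea; the function names and the fixed seed are illustrative assumptions, not the dissertation's code:

```python
import math
import random

# Hedged sketch of a photon-limited imaging model: a normalized image
# (values in [0, 1]) is scaled to a mean photon budget per pixel, and
# each frame is a Poisson sample of the resulting rates. Frames produced
# this way can feed any optic flow algorithm under test.

def poisson_sample(lam, rng):
    """Draw one Poisson-distributed photon count (Knuth's algorithm;
    adequate for the small means typical of low-light imaging)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def photon_limited_frame(intensities, photons_per_pixel, rng=None):
    """Return an integer photon-count frame for one noisy exposure."""
    rng = rng or random.Random(0)
    return [poisson_sample(v * photons_per_pixel, rng) for v in intensities]

# Three pixels (dark, mid-grey, bright) at a budget of 50 photons/pixel:
frame = photon_limited_frame([0.0, 0.5, 1.0], photons_per_pixel=50)
```

Running an optic flow algorithm on sequences of such frames, while sweeping `photons_per_pixel` downward, gives the kind of photons-versus-accuracy curve that lets traditional and insect-inspired algorithms be compared on equal footing.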