
    A Model for an Angular Velocity-Tuned Motion Detector Accounting for Deviations in the Corridor-Centering Response of the Bee

    We present a novel neurally based model for estimating angular velocity (AV) in the bee brain, capable of quantitatively reproducing experimental observations of visual odometry and corridor centering in free-flying honeybees, including previously unaccounted-for behavioural manipulations. The model is fitted using electrophysiological data and tested using behavioural data. Based on our model, we suggest that the AV response can be considered an evolutionary extension of the optomotor response. The detector is tested behaviourally in silico with the corridor-centering paradigm, in which bees navigate down a corridor with gratings (square-wave or sinusoidal) on the walls. When combined with an existing flight control algorithm, the detector reproduces the invariance of the average flight path to the spatial frequency and contrast of the gratings, including the deviations from perfect centering found in the real bee's behaviour. In addition, the summed response of the detector to a unit distance moved along the corridor is constant over a large range of grating spatial frequencies, demonstrating that the detector can be used as a visual odometer.
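
    As an illustration of the flight-control idea above, the sketch below balances the angular velocities seen by the two eyes with a simple proportional rule. The function names, gain, and one-dimensional dynamics are illustrative assumptions, not the paper's algorithm.

        # Corridor centering by balancing left and right angular velocity (AV).
        # The nearer wall produces the larger image AV, so steering against the
        # AV difference pushes the agent back toward the midline.

        def centering_step(y, av_left, av_right, gain=0.05):
            """One control step; y is lateral position (0 = corridor midline)."""
            return y + gain * (av_left - av_right)

        # Toy usage: at forward speed v, a wall at distance d yields AV ~ v / d.
        v, half_width, y = 0.5, 0.2, 0.05      # m/s, m, m (agent right of centre)
        for _ in range(100):
            av_l = v / (half_width + y)        # left wall is farther -> slower
            av_r = v / (half_width - y)        # right wall is nearer -> faster
            y = centering_step(y, av_l, av_r)
        print(round(y, 4))                     # settles near the midline (~0.0)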

    Bio-inspired Neural Networks for Angular Velocity Estimation in Visually Guided Flights

    Executing delicate flight maneuvers using visual information is a huge challenge for future robotic vision systems. As a source of inspiration, insects are adept at navigating through woods and landing on surfaces, behaviors that require delicate visual perception and flight control. The exquisite sensitivity of insects to image motion speed, as revealed recently, comes from a class of specific neurons called descending neurons. Some of these descending neurons demonstrate angular velocity selectivity as the image motion speed varies on the retina. Building a quantitative angular velocity detection model is the first step not only towards further understanding of the biological visual system, but also towards providing robust and economical solutions for visual motion perception in artificial visual systems. This thesis aims to explore biological image-processing methods for motion speed detection in visually guided flight. The major contributions are summarized as follows.

    We have presented an angular velocity decoding model (AVDM), which estimates the visual motion speed by combining both textural and temporal information from input signals. The model consists of three parts: elementary motion detection circuits, a wide-field texture estimation pathway and an angular velocity decoding layer. When first tested with moving sinusoidal gratings, the model estimates the angular velocity very well, with improved spatial-frequency independence compared to state-of-the-art angular velocity detecting models. This spatial independence is vital to account for honeybees' flight behaviors. We have also investigated the spatial and temporal resolutions of honeybees to obtain a bio-plausible parameter setting for explaining these behaviors. To investigate whether the model can account for observations of honeybee tunnel-centering behaviors, the model has been implemented in a virtual bee simulated in the Unity game engine. The simulation results of a series of experiments show that the agent can adjust its position to fly through patterned tunnels by balancing the angular velocities estimated by the two eyes under several circumstances. All tunnel simulations reproduce behaviors similar to those of real bees, which indicates that our model provides a possible explanation of image velocity estimation and can be used for regulating a micro aerial vehicle's flight course in tunnels.

    Furthermore, to verify the robustness of the model, visually guided terrain-following simulations have been carried out with a closed-loop control scheme that restores a preset angular velocity during flight. The simulation results of successfully flying over undulating terrain verify the feasibility and robustness of the AVDM in various application scenarios, showing its potential for micro aerial vehicle terrain following. In addition, we have applied the AVDM to grazing landing using only visual information. An LGMD neuron is also introduced to avoid collision and to trigger the hover phase, ensuring a safe landing. Applying the honeybee's landing strategy of keeping the angular velocity constant, we have designed a closed-loop control scheme with an adaptive gain that controls the landing dynamics using the AVDM response as input. A series of controlled trials has been designed in the Unity platform to demonstrate the effectiveness of the proposed model and control scheme for visual landing under various conditions. The proposed model could be implemented on small real robots to investigate its robustness in real landing scenarios in the near future.
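
    The three-part AVDM structure described above can be sketched as follows. The decoding rule used here (normalizing the mean correlator output by a squared texture estimate) is an illustrative stand-in for the published decoding layer, and all names are assumptions.

        import numpy as np

        # (1) correlation-type elementary motion detectors, (2) a wide-field
        # texture estimate, (3) a decoding stage that normalizes motion output
        # by texture so the speed estimate depends less on spatial frequency.

        def emd_layer(prev, curr):
            """Correlate each pixel's delayed signal with its neighbour."""
            return prev[:-1] * curr[1:] - curr[:-1] * prev[1:]

        def texture_estimate(frame):
            """Wide-field texture measure: mean absolute spatial contrast."""
            return np.mean(np.abs(np.diff(frame))) + 1e-9

        def decode_av(prev, curr):
            """Decoding layer: motion response normalized by texture."""
            return np.mean(emd_layer(prev, curr)) / texture_estimate(curr) ** 2

        # Moving sinusoidal gratings: same displacement, two spatial frequencies.
        x = np.linspace(0, 2 * np.pi, 200)
        for freq in (2, 4):
            f0 = np.sin(freq * x)
            f1 = np.sin(freq * (x - 0.05))   # grating shifted by the same amount
            print(freq, round(decode_av(f0, f1), 3))   # similar estimates for both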

    A bioinspired angular velocity decoding neural network model for visually guided flights

    Efficient and robust motion perception systems are important prerequisites for achieving visually guided flight in future micro air vehicles. As a source of inspiration, the visual neural networks of flying insects such as the honeybee and Drosophila provide ideal examples on which to base artificial motion perception models. In this paper, we have used this approach to develop a novel method that solves the fundamental problem of estimating angular velocity for visually guided flight. Compared with previous models, our elementary motion detector (EMD) based model uses a separate texture estimation pathway to effectively decode angular velocity, and demonstrates considerable independence from the spatial frequency and contrast of the gratings. Using the Unity development platform, the model is further tested in tunnel-centering and terrain-following paradigms in order to reproduce the visually guided flight behaviors of honeybees. In a series of controlled trials, the virtual bee utilizes the proposed angular velocity control schemes to accurately navigate through a patterned tunnel and to maintain a suitable distance from the undulating textured terrain. The results are consistent with both neuron spike recordings and behavioral path recordings of real honeybees, demonstrating the model's potential for implementation in micro air vehicles that carry only visual sensors.
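
    The elementary motion detector referred to above is, in its textbook form, a Hassenstein-Reichardt correlator. Below is a minimal sketch, assuming a first-order low-pass filter as the delay stage; all parameters are illustrative.

        import numpy as np

        def lowpass(signal, dt=0.01, tau=0.05):
            """First-order low-pass filter standing in for the EMD delay line."""
            out, y, alpha = np.empty_like(signal), 0.0, dt / (dt + tau)
            for i, x in enumerate(signal):
                y += alpha * (x - y)
                out[i] = y
            return out

        def emd_response(left, right):
            """Opponent correlator: positive output for left-to-right motion."""
            return lowpass(left) * right - lowpass(right) * left

        # A sinusoid sweeping past the two inputs; the right input lags,
        # i.e. the pattern moves left to right (the preferred direction).
        t = np.arange(0, 2, 0.01)
        left = np.sin(2 * np.pi * 2 * t)
        right = np.sin(2 * np.pi * 2 * t - 0.3)
        print(round(emd_response(left, right).mean(), 4))   # positive mean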

    Constant Angular Velocity Regulation for Visually Guided Terrain Following

    Insects use visual cues to control their flight behaviours. By estimating the angular velocity of the visual stimuli and regulating it to a constant value, honeybees can perform a terrain-following task that keeps a certain height above undulating ground. To mimic this behaviour in a bio-plausible computational structure, this paper presents a new angular velocity decoding model based on the honeybee's behavioural experiments. The model consists of three parts: a texture estimation layer for spatial information extraction, a motion detection layer for temporal information extraction, and a decoding layer combining information from the previous layers to estimate the angular velocity. Compared to previous methods in this field, the proposed model produces responses largely independent of the spatial frequency and contrast in grating experiments. An angular velocity based control scheme is proposed to implement the model in a bee simulated in the Unity game engine. The perfect terrain following above patterned ground and the successful flight over irregularly textured terrain show its potential for micro unmanned aerial vehicles' terrain following.
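
    A minimal sketch of the constant angular velocity regulation idea above: the ventral angular velocity scales roughly as forward speed over height, so holding it at a set-point makes the agent track the terrain. The gain, set-point, and dynamics are illustrative assumptions, not the paper's control law.

        import math

        def altitude_step(h, ground, v, av_setpoint=2.0, gain=0.1):
            """Climb when the ventral AV is above the set-point, else descend."""
            clearance = max(h - ground, 0.05)    # height above the local terrain
            av = v / clearance                   # measured ventral angular velocity
            return h + gain * (av - av_setpoint)

        h, v = 1.5, 2.0                          # altitude (m), forward speed (m/s)
        for step in range(300):
            ground = 0.5 * math.sin(0.05 * step) # undulating terrain profile
            h = altitude_step(h, ground, v)
        # Clearance settles close to v / av_setpoint = 1.0 m above the terrain.
        print(round(h - 0.5 * math.sin(0.05 * 299), 2))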

    Finding the Gap: Neuromorphic Motion Vision in Cluttered Environments

    Many animals meander through their environments and avoid collisions. How the underlying neuronal machinery can yield robust behaviour in a variety of environments remains unclear. In the fly brain, motion-sensitive neurons indicate the presence of nearby objects, and directional cues are integrated within an area known as the central complex. Such neuronal machinery, in contrast with the traditional stream-based approach to signal processing, uses an event-based approach, with events occurring when changes are sensed by the animal. In contrast to von Neumann computing architectures, event-based neuromorphic hardware is designed to process information in an asynchronous and distributed manner. Inspired by the fly brain, we model, for the first time, a neuromorphic closed-loop system mimicking essential behaviours observed in flying insects, such as meandering in clutter and gap crossing, which are highly relevant for autonomous vehicles. We implemented our system both in software and on neuromorphic hardware. While moving through an environment, our agent perceives changes in its surroundings and uses this information for collision avoidance. The agent's manoeuvres result from a closed action-perception loop implementing probabilistic decision-making processes. This loop-closure is thought to have driven the development of neural circuitry in biological agents since the Cambrian explosion. In the fundamental quest to understand neural computation in artificial agents, we come closer to understanding and modelling biological intelligence by also closing the loop in neuromorphic systems. Our closed-loop system thus deepens our understanding of processing in neural networks and of computations in biological and artificial systems. With these investigations, we aim to set the foundations for neuromorphic intelligence in the future, moving towards leveraging the full potential of neuromorphic systems.
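
    The event-based processing described above can be caricatured in a few lines: events fire where brightness changes exceed a threshold, and the agent steers toward the half of the visual field generating fewer events. This is a toy sketch of the principle, not the neuromorphic implementation.

        import numpy as np

        def events_from_frames(prev, curr, threshold=0.1):
            """Binary event image: True where log-brightness changed enough."""
            return np.abs(np.log(curr + 1e-6) - np.log(prev + 1e-6)) > threshold

        def steering_command(events):
            """Turn away from the side generating more events (nearer clutter)."""
            half = events.shape[1] // 2
            return float(events[:, half:].sum()) - float(events[:, :half].sum())

        # Toy frames: clutter looming in the right half of a tiny 'retina'.
        prev = np.full((2, 4), 0.5)
        curr = prev.copy()
        curr[:, 2:] = 0.9
        print(steering_command(events_from_frames(prev, curr)))   # > 0: steer left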

    An Inexpensive Flying Robot Design for Embodied Robotics Research

    Flying insects are capable of a wide range of flight and cognitive behaviors which are not currently understood. Replicating these capabilities is of interest for miniaturized robotics, because such robots share similar size, weight, and energy constraints. Currently, embodiment of insect behavior is primarily done on ground robots, which use simplistic sensors and have different constraints from flying insects. This limits how much progress can be made on understanding how biological systems fundamentally work. To address this gap, we have developed an inexpensive robotic solution in the form of a quadcopter aptly named BeeBot. Our work shows that BeeBot can support the payload necessary to replicate the sensing capabilities vital to bees' flight navigation, including chemical sensing and a wide visual field of view. BeeBot is controlled wirelessly so that its sensor data can be processed off-board, for example in neural networks. Our results demonstrate the suitability of the proposed approach for further study of navigation algorithms and of the embodiment of insect cognition.

    A Model for Detection of Angular Velocity of Image Motion Based on the Temporal Tuning of the Drosophila

    We propose a new bio-plausible model based on the visual system of Drosophila for estimating the angular velocity of image motion in insect eyes. The model implements both preferred-direction motion enhancement and non-preferred-direction motion suppression, recently discovered in Drosophila's visual neural circuits, to give stronger directional selectivity. In addition, the angular velocity detecting model (AVDM) produces a response largely independent of the spatial frequency in grating experiments, which enables insects to estimate flight speed in cluttered environments. This also agrees with behavioural experiments in which honeybees fly through tunnels with stripes of different spatial frequencies.
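
    One common way to combine preferred-direction enhancement with null-direction suppression is a three-input detector in which a delayed flanking input multiplies the centre signal while the opposite flank divisively suppresses it. The sketch below illustrates that motif only; it is not the paper's exact circuit, and all names are assumptions.

        import numpy as np

        def pd_nd_detector(left_delayed, centre, right_delayed):
            """PD enhancement: delayed left arm multiplies the centre.
            ND suppression: the delayed right arm divides the product."""
            return (left_delayed * centre) / (1.0 + right_delayed)

        def delayed(s):
            """One-sample delay standing in for the circuit's temporal filter."""
            return np.concatenate(([0.0], s[:-1]))

        # Rightward (preferred) motion: an edge reaches a, then b, then c.
        a = np.array([0.0, 1.0, 1.0, 1.0, 1.0])
        b = np.array([0.0, 0.0, 1.0, 1.0, 1.0])
        c = np.array([0.0, 0.0, 0.0, 1.0, 1.0])
        pref = pd_nd_detector(delayed(a), b, delayed(c)).sum()
        # Leftward (null) motion: mirror the detector, which is equivalent
        # to mirroring the stimulus.
        null = pd_nd_detector(delayed(c), b, delayed(a)).sum()
        print(pref, null)   # preferred response exceeds the suppressed null one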

    A lightweight, inexpensive robotic system for insect vision

    Designing hardware for miniaturized robotics that mimics the capabilities of flying insects is of interest because such robots share similar constraints (i.e. small size, low weight, and low energy consumption). Research in this area aims to enable robots with similarly efficient flight and cognitive abilities. Visual processing is important to flying insects' impressive flight capabilities, but at present the embodiment of insect-like visual systems is limited by the hardware available. Suitable hardware is either prohibitively expensive, difficult to reproduce, unable to accurately simulate insect vision characteristics, or too heavy for small robotic platforms. These limitations hamper the development of platforms for embodiment, which in turn hampers progress on understanding how biological systems fundamentally work. To address this gap, this paper proposes an inexpensive, lightweight robotic system for modelling insect vision. The system is mounted and tested on a robotic platform for mobile applications, and the camera and insect vision models are then evaluated. We analyse the potential of the system for use in embodying higher-level visual processes (e.g. motion detection) and for developing vision-based navigation for robotics in general. Optic flow calculated from sample camera data is compared to a perfect, simulated bee world and shows an excellent resemblance.
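
    For the optic-flow comparison mentioned above, a dense flow field can be computed from two consecutive camera frames. The sketch below uses OpenCV's Farneback method as one possible choice; the paper does not specify which algorithm was used, and the frame data here are synthetic.

        import numpy as np
        import cv2

        rng = np.random.default_rng(0)
        noise = (rng.random((120, 160)) * 255).astype(np.uint8)
        frame0 = cv2.GaussianBlur(noise, (9, 9), 0)   # smooth textured scene
        frame1 = np.roll(frame0, 3, axis=1)           # scene shifts 3 px right

        # Arguments: prev, next, flow, pyr_scale, levels, winsize,
        # iterations, poly_n, poly_sigma, flags.
        flow = cv2.calcOpticalFlowFarneback(frame0, frame1, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)

        # Mean horizontal flow should recover the imposed 3-pixel shift.
        print(round(float(flow[..., 0].mean()), 2))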

    From insects to robots


    A computational model of the integration of landmarks and motion in the insect central complex.

    The insect central complex (CX) is an enigmatic structure whose computational function has evaded inquiry, but which has been implicated in a wide range of behaviours. Recent experimental evidence from the fruit fly (Drosophila melanogaster) and the cockroach (Blaberus discoidalis) has demonstrated the existence of neural activity corresponding to the animal's orientation within a virtual arena (a neural 'compass'), and this provides an insight into one component of the CX structure. There are two key features of the compass activity: an offset between the angle represented by the compass and the true angular position of visual features in the arena, and the remapping of the 270° visual arena onto an entire circle of neurons in the compass. Here we present a computational model which can reproduce this experimental evidence in detail, and which predicts the computational mechanisms that underlie the data. We predict that both the offset and the remapping of the fly's orientation onto the neural compass can be explained by plasticity in the synaptic weights between segments of the visual field and the neurons representing orientation. Furthermore, we predict that this learning relies on the existence of neural pathways that detect rotational motion across the whole visual field and use this rotation signal to drive the rotation of activity in a neural ring attractor. Our model also reproduces the 'transitioning' between visual landmarks seen when rotationally symmetric landmarks are presented. This model can provide the basis for further investigation into the role of the central complex, which promises to be a key structure for understanding insect behaviour, as well as suggesting approaches towards creating fully autonomous robotic agents.
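
    The predicted mechanism, in which a wide-field rotation signal drives a bump of activity around a ring attractor, can be sketched in a few lines. The connectivity, gains, and shift rule below are illustrative assumptions rather than the model's fitted parameters.

        import numpy as np

        N = 16                                   # compass neurons on the ring
        angles = np.arange(N) * 2 * np.pi / N
        # Local cosine excitation plus uniform global inhibition holds a bump.
        W = 0.8 * np.cos(angles[:, None] - angles[None, :]) - 0.3

        def step(rate, rotation=0.0):
            """One update; 'rotation' asymmetrically nudges the bump around."""
            drive = W @ rate + rotation * (np.roll(rate, 1) - np.roll(rate, -1))
            return np.clip(drive, 0.0, 1.0)

        rate = np.exp(-((angles - np.pi) ** 2))  # initial bump at 180 degrees
        for _ in range(20):                      # settle into a stable bump
            rate = step(rate)
        start = int(rate.argmax())
        for _ in range(40):                      # apply a rotation signal
            rate = step(rate, rotation=0.4)
        print(start, int(rate.argmax()))         # the bump has moved on the ring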