
    A model of ant route navigation driven by scene familiarity

    In this paper we propose a model of visually guided route navigation in ants that captures the known properties of real behaviour whilst retaining mechanistic simplicity and thus biological plausibility. For an ant, the coupling of movement and viewing direction means that a familiar view specifies a familiar direction of movement. Since the views experienced along a habitual route will be more familiar, route navigation can be re-cast as a search for familiar views. This search can be performed with a simple scanning routine, a behaviour that ants have been observed to perform. We test this proposed route navigation strategy in simulation, by learning a series of routes through visually cluttered environments consisting of objects that are only distinguishable as silhouettes against the sky. In the first instance we determine view familiarity by exhaustive comparison with the set of views experienced during training. In further experiments we train an artificial neural network to perform familiarity discrimination using the training views. Our results indicate not only that the approach is successful, but also that the learnt routes show many of the characteristics of the routes of desert ants. As such, we believe the model represents the only detailed and complete model of insect route guidance to date. What is more, the model provides a general demonstration that visually guided routes can be produced with parsimonious mechanisms that do not specify when or what to learn, nor separate routes into sequences of waypoints.
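
    The core loop described here (scan, score familiarity, move) is compact enough to sketch. Below is a minimal illustration of the exhaustive-comparison variant; `get_view(position, heading)` is a hypothetical hook into a simulated world, and the parameter values are placeholders rather than the paper's settings.

```python
import numpy as np

def familiarity(view, training_views):
    # Familiarity of a view: the (negated) smallest sum-of-squared
    # pixel difference to any view stored while training the route.
    return -min(np.sum((view - tv) ** 2) for tv in training_views)

def scan_for_heading(position, get_view, training_views, n_dirs=36):
    # Scanning routine: sample candidate headings on the spot, score the
    # view in each direction, and face the most familiar one.
    headings = np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)
    scores = [familiarity(get_view(position, h), training_views)
              for h in headings]
    return headings[int(np.argmax(scores))]

def follow_route(start, get_view, training_views, step=0.1, n_steps=200):
    # Recapitulate a route by repeatedly moving a small step in the
    # currently most familiar viewing direction; no waypoints are stored.
    pos = np.asarray(start, dtype=float)
    path = [pos.copy()]
    for _ in range(n_steps):
        h = scan_for_heading(pos, get_view, training_views)
        pos = pos + step * np.array([np.cos(h), np.sin(h)])
        path.append(pos.copy())
    return np.array(path)
```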

    Redundant neural vision systems: competing for collision recognition roles

    The ability to detect collisions is vital for future robots that interact with humans in complex visual environments. Lobula giant movement detectors (LGMD) and directionally selective neurons (DSNs) are two types of identified neurons found in the visual pathways of insects such as locusts. Recent modelling studies showed that the LGMD or grouped DSNs could each be tuned for collision recognition. In both biological and artificial vision systems, however, it is not clear which one should play the collision recognition role, or how the two types of specialised visual neurons might function together. In this modelling study, we compared the competence of the LGMD and the DSNs, and also investigated, via artificial evolution, how the two neural vision systems can cooperate for collision recognition. We implemented three types of collision recognition neural subsystem in each individual agent: the LGMD, the DSNs, and a hybrid system combining the two. A switch gene determines which of the three redundant neural subsystems plays the collision recognition role. We found that, in both robotic and driving environments, the LGMD built up its collision recognition ability quickly and robustly, thereby reducing the chance that the other types of neural network would take on the same role. The results suggest that the LGMD neural network could be the ideal model to be realised in hardware for collision recognition.
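
    LGMD models of this kind are typically layered networks: photoreceptors respond to luminance change, a delayed and laterally spread inhibition layer competes with that excitation, and a summing cell signals collision when its potential crosses a threshold. The sketch below follows that standard scheme, not the paper's evolved networks; the weights and threshold are illustrative only.

```python
import numpy as np
from scipy.ndimage import uniform_filter

class LGMDModel:
    """Minimal LGMD-style looming detector (illustrative parameters)."""

    def __init__(self, shape, w_inh=0.6, threshold=40.0):
        self.prev_frame = np.zeros(shape)  # last grey-level frame
        self.prev_exc = np.zeros(shape)    # excitation, delayed one frame
        self.w_inh = w_inh                 # lateral inhibition weight
        self.threshold = threshold         # spike threshold on summed output

    def step(self, frame):
        # Photoreceptor layer: absolute luminance change between frames.
        exc = np.abs(frame - self.prev_frame)
        # Inhibition layer: the previous frame's excitation, spread to
        # neighbouring cells (a blur stands in for lateral connections).
        inh = uniform_filter(self.prev_exc, size=3)
        # Summing layer: rectified excitation minus delayed inhibition.
        s = np.maximum(exc - self.w_inh * inh, 0.0)
        self.prev_frame, self.prev_exc = frame, exc
        potential = float(s.sum())
        return potential, potential > self.threshold
```

    Fed a sequence of frames, an approaching (expanding) object drives the summed potential up much faster than lateral translation does, which is what suits this architecture to collision recognition.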

    The Evolution of Diversity

    Since the beginning of time, the pre-biological and biological worlds have seen a steady increase in complexity of form and function based on a process of combination and re-combination. The current modern synthesis of evolution, known as the neo-Darwinian theory, emphasises population genetics and does not satisfactorily explain other sources of evolutionary novelty. The authors suggest that symbiosis and hybridisation, along with more obscure processes such as polyploidy, chimerism and lateral transfer, are largely overlooked and insufficiently featured within evolutionary theory. They therefore propose a revision of the existing theory, including its language, to accommodate the scientific findings of recent decades.

    Neuromimetic Robots inspired by Insect Vision

    Equipped with a less-than-one-milligram brain, insects fly autonomously in complex environments without resorting to radar, lidar, sonar or GPS. The knowledge gained during the last decades on insects' sensory-motor abilities and the neuronal substrates involved provides us with a rich source of inspiration for designing tomorrow's self-guided vehicles and micro-vehicles, which must cope with unforeseen events on the ground, in the air, under water or in space. Insects have been in the business of sensory-motor integration for several hundred million years and can therefore teach us useful tricks for designing agile autonomous vehicles at various scales. Constructing a "biorobot" first requires formulating precisely the signal-processing principles at work in the animal. In return, it gives us a unique opportunity to check the soundness and robustness of those principles by bringing them face to face with the real physical world. Here we describe some of the visually guided terrestrial and aerial robots we have developed on the basis of our biological findings. These robots (Robot Fly, SCANIA, FANIA, OSCAR, OCTAVE and LORA) all react to the optic flow (i.e., the angular speed of the retinal image). Optic flow is sensed onboard the robots by miniature vision sensors called Elementary Motion Detectors (EMDs). The principle of these electro-optical velocity sensors was derived from optical/electrophysiological studies in which we recorded the responses of single neurons to optical microstimulation of single photoreceptor cells in a model visual system: the fly's compound eye. Optic-flow-based sensors rely solely on the contrast provided by reflected (or scattered) sunlight from any kind of celestial body in a given spectral range. These non-emissive, power-lean sensors offer potential applications to manned or unmanned aircraft. Applications can also be envisaged for spacecraft, from robotic landers and rovers to asteroid explorers and space-station dockers, with interesting prospects for reducing weight and power consumption.
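
    The EMDs described here are the authors' own electro-optical velocity sensors, whose circuit details are not given in this abstract. As a generic illustration of how two neighbouring photoreceptor signals can yield a direction-selective motion signal, the sketch below implements the classic Hassenstein-Reichardt correlation model, a standard EMD formulation that may differ from the authors' scheme; `tau` and `dt` are illustrative values.

```python
import numpy as np

def lowpass(x, tau=0.05, dt=0.001):
    # First-order low-pass filter; its lag provides the EMD's delay.
    y = np.zeros_like(x, dtype=float)
    alpha = dt / (tau + dt)
    for t in range(1, len(x)):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y

def emd_response(left, right, tau=0.05, dt=0.001):
    # Correlate each photoreceptor signal with a delayed copy of its
    # neighbour's, then subtract the mirror-image correlation: the
    # opponent output is positive for left-to-right image motion over
    # the two receptors and negative for the reverse direction.
    return lowpass(left, tau, dt) * right - lowpass(right, tau, dt) * left
```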