
    Insect inspired visual motion sensing and flying robots

    Flying insects are masters of visual motion sensing, using dedicated motion-processing circuits at low energy and computational cost. Drawing on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots for flying robots. Optic-flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed via a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes, driving the eye's angular position relative to the robot and the robot's body angular position relative to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
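    The two interdependent loops described above can be illustrated with a toy simulation. This is only a minimal sketch with hypothetical gains, not the OSCAR II controller: a fast oculomotor loop nulls the retinal error, the body heading is slaved to the eye-in-body angle, and the eye counter-rotates as the body turns, so only the retinal error and the eye-in-body angle are fed back (no global orientation).

```python
# Hypothetical "steering-by-gazing" sketch: the controller never sees the
# body's world-frame heading theta_b; it only uses the retinal error and
# the eye-in-body angle theta_e. Gains and time step are illustrative.

def steering_by_gazing(target_bearing, steps=400, dt=0.01,
                       k_eye=40.0, k_body=8.0):
    """Simulate gaze and heading converging on a fixed target bearing (rad).

    Returns the final (body_heading, eye_in_body) angles.
    """
    theta_b = 0.0   # body heading in the world (unknown to the controller)
    theta_e = 0.0   # eye angle relative to the body
    for _ in range(steps):
        gaze = theta_b + theta_e                # gaze direction in the world
        retinal_error = target_bearing - gaze   # what the retina measures
        theta_e += k_eye * retinal_error * dt   # fast oculomotor loop
        theta_b += k_body * theta_e * dt        # body steers to re-centre the eye
        theta_e -= k_body * theta_e * dt        # eye counter-rotates as body turns
    return theta_b, theta_e
```

    At convergence the retinal error and the eye-in-body angle both vanish, so the body heading ends up aligned with the target even though the controller never measured it directly.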

    Insect inspired behaviours for the autonomous control of mobile robots

    Animals navigate through various uncontrolled environments with seemingly little effort. Flying insects, especially, are quite adept at manoeuvring in complex, unpredictable and possibly hostile environments. Through both simulation and real-world experiments, we demonstrate the feasibility of equipping a mobile robot with the ability to navigate a corridor environment, in real time, using principles of insect visual guidance. In particular, we have used the bees' navigational strategy of measuring object range in terms of image velocity. We have also shown the viability and usefulness of various other insect behaviours: (i) keeping walls equidistant, (ii) slowing down when approaching an object, (iii) regulating speed according to tunnel width, and (iv) using visual motion as a measure of distance travelled.
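    Behaviours (i) and (iii) both follow from the fact that translational optic flow scales as speed over distance. A minimal sketch (with hypothetical gains and function names, not the paper's controller): steer away from the side whose image velocity is larger, and pick a forward speed that holds the summed lateral flow at a set-point, so the craft slows as the tunnel narrows.

```python
# Bee-inspired corridor rules. Translational optic flow from a wall at
# distance d, at forward speed v, is roughly v / d (rad/s). Gains are
# illustrative, not taken from the paper.

def centering_command(v_forward, d_left, d_right, k_turn=0.5):
    """Yaw-rate command (positive = turn left) from the left/right flow imbalance."""
    of_left = v_forward / d_left     # image velocity seen by the left eye
    of_right = v_forward / d_right   # image velocity seen by the right eye
    return k_turn * (of_right - of_left)  # steer away from the faster side

def speed_command(d_left, d_right, of_setpoint=2.0):
    """Forward speed that keeps the summed lateral flow at the set-point."""
    return of_setpoint / (1.0 / d_left + 1.0 / d_right)
```

    Balancing the two flows centres the robot; holding their sum constant makes speed shrink proportionally with tunnel width, reproducing behaviours (i) and (iii) with no explicit range measurement.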

    A lightweight, inexpensive robotic system for insect vision

    Designing hardware for miniaturized robotics that mimics the capabilities of flying insects is of interest because the two share similar constraints (i.e. small size, low weight, and low energy consumption). Research in this area aims to enable robots with similarly efficient flight and cognitive abilities. Visual processing is central to flying insects' impressive flight capabilities, but embodiment of insect-like visual systems is currently limited by the available hardware: suitable systems are either prohibitively expensive, difficult to reproduce, unable to accurately reproduce insect vision characteristics, and/or too heavy for small robotic platforms. These limitations hamper the development of platforms for embodiment, which in turn hampers progress in understanding how biological systems fundamentally work. To address this gap, this paper proposes an inexpensive, lightweight robotic system for modelling insect vision. The system is mounted and tested on a robotic platform for mobile applications, and the camera and insect vision models are then evaluated. We analyse the potential of the system for embodying higher-level visual processes (e.g. motion detection) and for developing vision-based navigation for robotics in general. Optic flow computed from sample camera data is compared to a perfect, simulated bee world, showing an excellent resemblance.
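    Benchmarking measured optic flow against a simulated world of known geometry, as above, needs a flow estimator. A one-dimensional gradient-based estimate is the simplest instance (an illustration only, not the paper's pipeline): under the brightness-constancy assumption, image motion u satisfies I_x u + I_t = 0, giving a least-squares estimate u = -Σ(I_x I_t) / Σ(I_x²).

```python
import numpy as np

# 1-D gradient-based optic flow (a textbook Lucas-Kanade-style estimate,
# used here only to illustrate how camera data can be checked against a
# simulated scene whose true motion is known).

def flow_1d(frame_a, frame_b):
    """Least-squares 1-D optic flow (pixels/frame) between two intensity rows."""
    ix = np.gradient(frame_a)   # spatial derivative of the first frame
    it = frame_b - frame_a      # temporal derivative between frames
    return -np.sum(ix * it) / np.sum(ix * ix)
```

    On a smooth, sub-pixel shift of a synthetic signal the estimate recovers the true displacement closely; large shifts or strong aliasing violate the linearization and need pyramidal or matching-based methods instead.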

    Towards Computational Models and Applications of Insect Visual Systems for Motion Perception: A Review

    Motion perception is a critical capability determining many aspects of insects' lives, including avoiding predators and foraging. A good number of motion detectors have been identified in insects' visual pathways. Computational modelling of these motion detectors has not only provided effective solutions for artificial intelligence, but has also benefited the understanding of complicated biological visual systems. These biological mechanisms, shaped by millions of years of evolution, form solid modules for constructing dynamic vision systems for future intelligent machines. This article reviews the computational motion perception models originating from biological research on insects' visual systems. These models and neural networks comprise the looming-sensitive neuronal models of lobula giant movement detectors (LGMDs) in locusts, the translation-sensitive neural systems of direction-selective neurons (DSNs) in fruit flies, bees and locusts, and the small target motion detectors (STMDs) in dragonflies and hoverflies. We also review the applications of these models to robots and vehicles. Through these modelling studies, we summarise the methodologies that generate different direction and size selectivity in motion perception. Finally, we discuss multiple-system integration and the hardware realisation of these bio-inspired motion perception models.
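    The canonical mechanism behind many of the direction-selective models reviewed here is the Hassenstein-Reichardt elementary motion detector (EMD): two neighbouring photoreceptor signals are cross-correlated after one arm is delayed, and the mirror-image arm is subtracted, yielding a signed, direction-selective output. A minimal discrete-time sketch (the delay is modelled as a first-order low-pass; parameters are illustrative):

```python
import numpy as np

# Hassenstein-Reichardt correlator: delay-and-correlate in each direction,
# subtract the mirrored arm. The output is antisymmetric in its inputs, so
# reversing the stimulus direction flips the sign of the response.

def hr_emd(left, right, tau=0.7):
    """Summed EMD response for two photoreceptor time series (1-D arrays)."""
    def lowpass(x):
        y, out = 0.0, []
        for v in x:
            y += (1.0 - tau) * (v - y)   # first-order low-pass acts as the delay arm
            out.append(y)
        return np.array(out)
    # preferred direction: delayed left correlates with undelayed right
    return float(np.sum(lowpass(left) * right - lowpass(right) * left))
```

    Feeding it a drifting grating sampled at two nearby points gives a positive response for motion in the preferred direction and a negative one for the reverse, which is the direction selectivity that the DSN-type models build on.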

    Biomimetic visual navigation in a corridor: to centre or not to centre?

    As a first step toward an Automatic Flight Control System (AFCS) for Micro-Air Vehicle (MAV) obstacle avoidance, we introduce a vision-based autopilot (LORA: Lateral Optic flow Regulation Autopilot), which is able to make a hovercraft automatically follow a wall or centre between the two walls of a corridor. A hovercraft is endowed with natural stabilization in pitch and roll while keeping two translational degrees of freedom (X and Y) and one rotational degree of freedom (yaw Ψ). We show the feasibility of an OF regulator that maintains the lateral Optic Flow (OF) on one wall equal to an OF set-point. The OF sensors used are Elementary Motion Detectors (EMDs), whose operation was directly inspired by the housefly's motion-detecting neurons. The properties of these neurons were previously analysed at our laboratory by performing electrophysiological recordings while applying optical microstimuli to single photoreceptor cells of the compound eye. The simulation results show that, depending on the OF set-point, the hovercraft either centres along the midline of the corridor or follows one of the two walls, even with a local lack of optical texture on one wall, such as that caused by an open door or a T-junction. All these navigational tasks are performed with one and the same feedback loop: a lateral OF regulation loop that permits relatively high-speed navigation (1 m/s, i.e. 3 body lengths per second). The passive visual sensors and simple processing system are suitable for MAVs with an avionic payload of only a few grams. The goal is to achieve automatic MAV guidance or to relieve a remote operator from guiding it in challenging environments such as urban canyons or indoor environments.
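    The wall-following-versus-centering behaviour can be reproduced in a toy simulation. This is a hypothetical sketch, not the LORA autopilot itself: the craft regulates the larger of the two lateral optic flows to a set-point, so the equilibrium clearance from the nearer wall is v / of_set. A high set-point yields wall-following; a set-point whose equilibrium clearance equals half the corridor width yields centering, from a single feedback loop.

```python
# Unilateral optic-flow regulation in a corridor with walls at y = 0 and
# y = width. Lateral flow from a wall at clearance d is v / d. Gains,
# speeds and set-points are illustrative.

def lora_sim(y0, v=1.0, of_set=2.0, width=2.0, k=0.2, steps=2000, dt=0.01):
    """Regulate the dominant lateral OF to of_set; return the final clearance y."""
    y = y0
    for _ in range(steps):
        of_left = v / (width - y)   # flow from the left wall
        of_right = v / y            # flow from the right wall
        if of_right >= of_left:     # nearer wall dominates the measurement
            y += k * (of_right - of_set) * dt   # steer away if flow too high
        else:
            y -= k * (of_left - of_set) * dt
        y = min(max(y, 1e-3), width - 1e-3)     # stay inside the corridor
    return y
```

    With v = 1 m/s and of_set = 2 rad/s the craft settles 0.5 m from whichever wall it starts nearer to (wall-following); with of_set = 1 rad/s the equilibrium clearance is 1 m, the corridor midline, so the same loop produces centering.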