
    Evidence for ventral optic flow regulation in honeybees

    To better grasp the visuomotor control system underlying insects' height and speed control, we attempted to interfere with this system by imposing a major perturbation on the freely flying insect and observing its effect. Honeybees were trained to fly along a high-roofed tunnel, part of which was equipped with a moving floor. The bees followed the stationary part of the floor at a given height. On encountering the moving part of the floor, which moved in the same direction as their flight, honeybees descended and flew at a lower height. In so doing, bees gradually restored their ventral optic flow (OF) to a value similar to the one they had perceived when flying over the stationary part of the floor. OF restoration therefore relied on lowering the ground height rather than increasing the ground speed. This result can be accounted for by the control system, called an optic flow regulator, that we proposed in previous studies. This visuomotor control scheme explains how honeybees can navigate safely along tunnels on the sole basis of OF measurements, without any need to measure their speed or their clearance from the ground, the roof or the surrounding walls.
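    The descent behaviour described above can be sketched numerically. The following toy model (all speeds, heights and gains are illustrative assumptions, not measured bee parameters) shows how servoing height to a ventral-OF set-point reproduces the descent over a forward-moving floor:

```python
# Minimal sketch (not the authors' exact model): a ventral optic-flow
# regulator that servoes ground height so the perceived OF matches a
# set-point.  All numeric values are illustrative assumptions.

def ventral_of(ground_speed, floor_speed, height):
    """Ventral optic flow (rad/s): relative retinal slip over the floor."""
    return (ground_speed - floor_speed) / height

def regulate_height(height, ground_speed, floor_speed, of_setpoint,
                    gain=0.5, dt=0.05, steps=400):
    """Integrate a proportional height controller driven by the OF error."""
    for _ in range(steps):
        error = of_setpoint - ventral_of(ground_speed, floor_speed, height)
        # A positive error (OF too low) commands a descent, as the bees do.
        height += -gain * error * dt
        height = max(height, 0.01)
    return height

# Bee cruises at 1 m/s, 0.3 m above a stationary floor -> OF set-point.
setpoint = ventral_of(1.0, 0.0, 0.3)
# Floor now moves at 0.5 m/s in the flight direction: OF drops, bee descends.
new_height = regulate_height(0.3, 1.0, 0.5, setpoint)
```

    The height halves (to about 0.15 m) because the relative speed over the moving floor halved, restoring the original OF without any speed or distance measurement.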

    A sighted aerial robot with fast gaze and heading stabilization

    Autonomous guidance of Micro-Air Vehicles (MAVs) in unknown environments is a challenging task because these artificial creatures have small aeromechanical time constants, which make them prone to be disturbed by gusts of wind. Flying insects are subject to quite similar disturbances, yet they navigate swiftly and deftly. They display high-performance visuomotor control systems that have stood the test of time, and can therefore teach us how vision can be used for immediate and vital actions. We built a 50-gram tethered aerial demonstrator, called OSCAR II, which manages to keep its gaze steadily fixated on a target (a dark edge) in spite of the sharp thumps we deliberately delivered to its body with a custom-made "slapping machine". The robot's agile yaw reactions are based on:
    - a mechanical decoupling of the eye from the body
    - an active coupling of the robot's heading with its gaze
    - a Visual Fixation Reflex (VFR)
    - a Vestibulo-Ocular Reflex (VOR)
    - an accurate and fast actuator (a Voice Coil Motor, VCM)
    The actuator is a 2.4-gram voice coil motor able to rotate the eye with a rise time as short as 12 ms, i.e., much shorter than that of human oculomotor saccades. In connection with a micro rate gyro, this actuator endows the robot with a high-performance vestibulo-ocular reflex that keeps the gaze locked onto the target whatever yaw perturbations affect the robot's body. Whenever the robot is destabilized (e.g., by a slap applied to one side), the gaze keeps fixating the target while serving as the reference to which the robot's heading is servoed. The robot then takes only 0.6 s to realign its heading with its gaze.
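    A minimal simulation of this gaze-then-heading architecture (gains, time step and slap amplitude are assumptions, not OSCAR II's actual parameters) illustrates how the VOR/VFR combination keeps the gaze on target while the heading is re-servoed to it:

```python
# Hedged sketch: the eye is mechanically decoupled from the body; a
# gyro-based VOR counter-rotates the eye, a visual fixation reflex (VFR)
# nulls the residual retinal error, and the heading is servoed to the gaze.

def simulate_slap(steps=300, dt=0.01):
    target = 0.0            # target bearing in the world frame (rad)
    body_yaw = 0.5          # a slap knocks the body 0.5 rad off target
    eye_in_body = 0.0       # eye angle relative to the body
    for _ in range(steps):
        gaze = body_yaw + eye_in_body          # eye orientation in space
        retinal_error = target - gaze          # what the eye actually sees
        body_rate = 4.0 * (gaze - body_yaw)    # heading servoed to the gaze
        # VOR: feed the gyro-measured body rate to the eye, sign-inverted,
        # so body rotations are cancelled; VFR: visual correction term.
        eye_rate = -body_rate + 8.0 * retinal_error
        body_yaw += body_rate * dt
        eye_in_body += eye_rate * dt
    return body_yaw, eye_in_body

body_yaw, eye_in_body = simulate_slap()
# Both the heading error and the eye-in-body angle relax back toward zero.
```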

    The VODKA sensor: a bio-inspired hyperacute optical position sensing device

    We have designed and built a simple optical sensor called VODKA (Vibrating Optical Device for the Kontrol of Autonomous robots), inspired by the "tremor" eye movements observed in many vertebrate and invertebrate animals. In the initial version presented here, the sensor relies on the repetitive micro-translation of a pair of photoreceptors set behind a small lens, and on processing designed to locate a target from the two photoreceptor signals. The VODKA sensor, in which the retinal micro-scanning movements are produced by a small piezo-bender actuator driven at a frequency of 40 Hz, was found to locate a contrasting edge with a resolution up to 900-fold finer than its static resolution (which is constrained by the interreceptor angle), regardless of the scanning law imposed on the retina. Hyperacuity is thus obtained at very low cost, opening new vistas for the accurate visuomotor control of robotic platforms. As an example, the sensor was mounted on a miniature aerial robot, which became able to track a moving target accurately by exploiting its own uncontrolled random vibrations as the source of the ocular microscanning movement. The simplicity, small size, low mass and low power consumption of this optical sensor make it highly suitable for many applications in metrology, astronomy, robotics, and automotive and aerospace engineering. The basic operating principle may also shed new light on the whys and wherefores of the tremor eye movements occurring in both animals and humans.
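    The microscanning principle can be illustrated with a toy model: two Gaussian-sensitivity photoreceptors are translated sinusoidally, each signal is demodulated at the scan frequency, and the normalized difference of the two amplitudes then locates an edge far more finely than the interreceptor angle. All parameter values below are assumptions for illustration, not VODKA's actual values:

```python
import math

SIGMA = 1.5          # acceptance-angle parameter (deg), assumed
DPHI = 2.0           # inter-receptor angle (deg), assumed
SCAN_AMP = 0.5       # microscan amplitude (deg), assumed
F_SCAN = 40.0        # scan frequency (Hz), as in the paper

def receptor(angle_edge, axis):
    """Response of a Gaussian-sensitivity photoreceptor to an edge."""
    return 0.5 * (1.0 + math.erf((angle_edge - axis) / (SIGMA * math.sqrt(2))))

def demodulate(angle_edge, axis0, periods=20, n=2000):
    """Lock-in amplitude of the receptor signal at the scan frequency."""
    acc = 0.0
    for k in range(n):
        t = k * periods / (F_SCAN * n)
        scan = SCAN_AMP * math.sin(2 * math.pi * F_SCAN * t)
        acc += receptor(angle_edge, axis0 + scan) * math.sin(2 * math.pi * F_SCAN * t)
    return abs(2.0 * acc / n)

def locate_edge(angle_edge):
    """Hyperacute position cue: normalized difference of the two amplitudes."""
    a1 = demodulate(angle_edge, -DPHI / 2)
    a2 = demodulate(angle_edge, +DPHI / 2)
    return (a2 - a1) / (a1 + a2)
```

    The cue varies smoothly and monotonically with edge position between the two optical axes, so localization well below the interreceptor angle becomes possible.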

    Steering by Gazing: An Efficient Biomimetic Control Strategy for Visually-guided Micro-Air Vehicles

    OSCAR 2 is a twin-engine aerial demonstrator equipped with a monocular visual system, which manages to keep its gaze and its heading steadily fixed on a target (a dark edge or a bar) in spite of severe random perturbations applied to its body via a ducted fan. The tethered robot stabilizes its gaze on the basis of two oculomotor reflexes (ORs) inspired by studies on animals:
    - a Visual Fixation Reflex (VFR)
    - a Vestibulo-Ocular Reflex (VOR)
    One of the key features of this robot is that the eye is mechanically decoupled from the body about the vertical (yaw) axis. To meet the conflicting requirements of high accuracy and fast ocular response, a miniature (2.4-gram) Voice Coil Motor (VCM) was used, which enables the eye to change orientation within an unusually short rise time (19 ms). The robot, equipped with a high-bandwidth (7 Hz) VOR based on an inertial micro rate gyro, is capable of accurate visual fixation as long as there is light, and can also pursue a moving target in the presence of erratic gusts of wind. Here we present the two interdependent control schemes that drive the eye in the robot and the robot in space, without any knowledge of the robot's angular position. The "steering by gazing" control strategy, implemented on this lightweight (100-gram) miniature aerial robot, demonstrates the effectiveness of biomimetic visual/inertial heading control.
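    The "steering by gazing" loop can be sketched in the pursuit case; the gains are illustrative, and the model retains only the abstract's key point that neither loop uses the robot's absolute angular position:

```python
# Hedged sketch (assumed gains): only the retinal error and the gyro rate
# are measured; the eye tracks the target, and the heading chases the gaze.

def pursue(target_rate=0.2, steps=500, dt=0.01):
    target = 0.0
    body_yaw = 0.0
    eye_in_body = 0.0
    errors = []
    for _ in range(steps):
        target += target_rate * dt             # target drifts steadily
        gaze = body_yaw + eye_in_body
        retinal_error = target - gaze          # the only visual measurement
        # VFR drives the gaze; subtracting the gyro-measured body rate is
        # the VOR, which keeps the gaze control independent of body motion.
        body_rate = 5.0 * eye_in_body          # heading servoed to the gaze
        eye_in_body += (10.0 * retinal_error - body_rate) * dt
        body_yaw += body_rate * dt
        errors.append(target - body_yaw)
    return errors

errors = pursue()
# After the transient, the heading tracks the moving target with a small,
# constant pursuit lag, with no absolute-angle sensor anywhere in the loop.
```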

    Biomimetic visual navigation in a corridor: to centre or not to centre?

    As a first step toward an Automatic Flight Control System (AFCS) for Micro-Air Vehicle (MAV) obstacle avoidance, we introduce a vision-based autopilot (LORA: Lateral Optic flow Regulation Autopilot), which is able to make a hovercraft automatically follow a wall or centre between the two walls of a corridor. A hovercraft is endowed with natural stabilization in pitch and roll while retaining two translational degrees of freedom (X and Y) and one rotational degree of freedom (yaw Ψ). We show the feasibility of an OF regulator that maintains the lateral optic flow (OF) on one wall equal to an OF set-point. The OF sensors used are Elementary Motion Detectors (EMDs), whose operation was directly inspired by the housefly's motion-detecting neurons. The properties of these neurons were previously analysed in our laboratory by performing electrophysiological recordings while applying optical microstimuli to single photoreceptor cells of the compound eye. The simulation results show that, depending on the OF set-point, the hovercraft either centres along the midline of the corridor or follows one of the two walls, even with a local lack of optical texture on one wall, caused, for instance, by an open door or a T-junction. All these navigational tasks are performed with one and the same feedback loop: a lateral OF regulation loop that permits relatively high-speed navigation (1 m/s, i.e., 3 body lengths per second). The passive visual sensors and simple processing system are suitable for use on MAVs with an avionic payload of only a few grams. The goal is to achieve automatic MAV guidance or to relieve a remote operator from guiding the vehicle in challenging environments such as urban canyons or indoor spaces.
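    A toy one-dimensional model (assumed speed, corridor width and gain, not the LORA autopilot itself) shows how regulating the larger of the two lateral OFs to a set-point yields either centring or wall-following, depending on the set-point:

```python
# Simplified lateral optic-flow regulator for a point craft in a corridor.
V = 1.0          # forward speed (m/s), assumed
D = 1.0          # corridor width (m), assumed

def settle(of_setpoint, y0=0.25, gain=0.05, dt=0.02, steps=4000):
    y = y0                                    # distance from the left wall
    for _ in range(steps):
        of_left, of_right = V / y, V / (D - y)
        # Steer away from whichever wall produces the larger OF, with an
        # effort proportional to the OF excess (zero when below set-point).
        if of_left >= of_right:
            y += gain * max(of_left - of_setpoint, 0.0) * dt
        else:
            y -= gain * max(of_right - of_setpoint, 0.0) * dt
        y = min(max(y, 0.05), D - 0.05)
    return y

# Low set-point (below 2V/D = 2 rad/s here): neither wall's OF can be
# brought down to it except by centring -> the craft centres.
centred = settle(of_setpoint=1.5)
# High set-point: the craft follows one wall at distance V/set-point.
follower = settle(of_setpoint=4.0, y0=0.15)
```

    A single feedback loop thus produces both behaviours, as in the abstract.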

    Optic Flow Based Autopilots: Speed Control and Obstacle Avoidance

    The explicit control schemes presented here explain how insects may navigate on the sole basis of optic flow (OF) cues, without requiring any distance or speed measurements: how they take off and land, follow the terrain, avoid the lateral walls of a corridor and control their forward speed automatically. The optic flow regulator, a feedback system controlling either the lift, the forward thrust or the lateral thrust, is described. Three OF regulators account for various insect flight patterns observed over the ground and over still water, under calm and windy conditions, and in straight and tapered corridors. These control schemes were tested in simulation and/or implemented onboard two types of aerial robots, a micro helicopter (MH) and a hovercraft (HO), which behaved much like insects when placed in similar environments. These robots were equipped with opto-electronic OF sensors inspired by our electrophysiological findings on the housefly's motion-sensitive visual neurons. The simple, parsimonious control schemes described here require no conventional avionic devices such as range finders, groundspeed sensors or GPS receivers. They are consistent with the neural repertoire of flying insects and meet the low avionic payload requirements of autonomous micro aerial and space vehicles.
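    The forward-speed side of the scheme can be sketched the same way: if the craft holds the sum of the two lateral OFs at a set-point, its speed settles in proportion to the local corridor width, which is the tapered-corridor behaviour described above. All numeric values below are assumptions for illustration:

```python
# Hedged sketch of a forward-speed OF regulator: thrust is adjusted so the
# summed lateral optic flow stays at a set-point, so the craft slows down
# as the corridor narrows and speeds up as it widens.

def speed_in_corridor(width, of_sum_setpoint=4.0, gain=0.2, dt=0.02,
                      v0=0.5, steps=2000):
    v = v0
    y = width / 2.0                    # assume the craft stays centred
    for _ in range(steps):
        of_sum = v / y + v / (width - y)
        v += gain * (of_sum_setpoint - of_sum) * dt
        v = max(v, 0.0)
    return v

wide = speed_in_corridor(width=1.0)
narrow = speed_in_corridor(width=0.5)
# Half the corridor width yields half the speed, for the same perceived OF.
```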

    Bionics of visuo-motor control


    Neuromimetic Robots inspired by Insect Vision

    Equipped with a less-than-one-milligram brain, insects fly autonomously in complex environments without resorting to any radar, ladar, sonar or GPS. The knowledge gained during recent decades about insects' sensorimotor abilities and the neuronal substrates involved provides us with a rich source of inspiration for designing tomorrow's self-guided vehicles and micro-vehicles, which will have to cope with unforeseen events on the ground, in the air, under water or in space. Insects have been in the business of sensorimotor integration for several hundred million years and can therefore teach us useful tricks for designing agile autonomous vehicles at various scales. Constructing a "biorobot" first requires formulating explicitly the signal-processing principles at work in the animal. In return, it gives us a unique opportunity to check the soundness and robustness of those principles by bringing them face to face with the real physical world. Here we describe some of the visually guided terrestrial and aerial robots we have developed on the basis of our biological findings. These robots (Robot Fly, SCANIA, FANIA, OSCAR, OCTAVE and LORA) all react to the optic flow (i.e., the angular speed of the retinal image). Optic flow is sensed onboard the robots by miniature vision sensors called Elementary Motion Detectors (EMDs). The principle of these electro-optical velocity sensors was derived from optical/electrophysiological studies in which we recorded the responses of single neurons to optical microstimulation of single photoreceptor cells in a model visual system: the fly's compound eye. Optic-flow-based sensors rely solely on the contrast provided by sunlight reflected (or scattered) from any kind of celestial body in a given spectral range. These non-emissive, power-lean sensors offer potential applications to manned or unmanned aircraft. Applications can also be envisaged for spacecraft, from robotic landers and rovers to asteroid explorers or space-station dockers, with interesting prospects regarding reductions in weight and power consumption.
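    A classic delay-and-correlate (Hassenstein-Reichardt) motion detector, given here purely as an illustration of the EMD idea (the actual EMD circuits and parameters used onboard these robots differ), produces a direction-selective output from two neighbouring photoreceptor signals:

```python
import math

def emd_response(direction, delay_steps=5, n=400):
    """Opponent correlator output for a drifting sinusoidal pattern."""
    # Photoreceptor signals: the same waveform reaches receptor B slightly
    # before or after receptor A depending on the motion direction.
    phase_shift = 0.4 * direction
    a = [math.sin(0.1 * k) for k in range(n)]
    b = [math.sin(0.1 * k + phase_shift) for k in range(n)]
    out = 0.0
    for k in range(delay_steps, n):
        # Delayed A correlated with B, minus the mirror term (delayed B
        # with A), giving a fully opponent, direction-selective output.
        out += a[k - delay_steps] * b[k] - b[k - delay_steps] * a[k]
    return out / (n - delay_steps)

# Opposite motion directions give outputs of opposite sign, which is the
# minimal property a robot needs to steer from optic flow.
rightward = emd_response(+1)
leftward = emd_response(-1)
```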