148 research outputs found

    Steering by Gazing: An Efficient Biomimetic Control Strategy for Visually-guided Micro-Air Vehicles

    OSCAR 2 is a twin-engine aerial demonstrator equipped with a monocular visual system, which manages to keep its gaze and its heading steadily fixed on a target (a dark edge or a bar) in spite of severe random perturbations applied to its body via a ducted fan. The tethered robot stabilizes its gaze on the basis of two Oculomotor Reflexes (ORs) inspired by studies on animals: a Visual Fixation Reflex (VFR) and a Vestibulo-Ocular Reflex (VOR). One of the key features of this robot is that the eye is mechanically decoupled from the body about the vertical (yaw) axis. To meet the conflicting requirements of high accuracy and fast ocular responses, a miniature (2.4-gram) Voice Coil Motor (VCM) was used, which enables the eye to change its orientation within an unusually short rise time (19 ms). The robot, whose high-bandwidth (7 Hz) VOR is based on an inertial micro rate gyro, is capable of accurate visual fixation as long as there is light, and can also pursue a moving target in the presence of erratic gusts of wind. Here we present the two interdependent control schemes driving the eye in the robot and the robot in space without any knowledge of the robot's angular position. This "steering by gazing" strategy, implemented on a lightweight (100-gram) miniature aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control.
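    The two interdependent loops described above can be sketched as a cascade: the eye is driven by a visual fixation term plus an inertial counter-rotation term, and the body heading is then servoed onto the gaze. All gains, the time step, and the signal names below are illustrative assumptions, not the parameters of the actual OSCAR 2 controller.

```python
# Hedged sketch of a "steering by gazing" cascade: VFR + VOR drive the
# eye, and the heading loop steers the body back into line with the
# gaze. Gains and dt are illustrative assumptions.

def vor_vfr_step(eye_angle, retinal_error, yaw_rate_gyro, dt,
                 k_vfr=8.0, k_vor=1.0):
    """One Euler update of the eye-in-body angle (rad).

    VFR term: drive the retinal position error of the target to zero.
    VOR term: counter-rotate the eye with the inertial yaw-rate signal
    so that body perturbations do not move the gaze.
    """
    eye_rate = k_vfr * retinal_error - k_vor * yaw_rate_gyro
    return eye_angle + eye_rate * dt

def heading_step(body_yaw, eye_angle, dt, k_heading=3.0):
    """Servo the heading onto the gaze: command a yaw rate that brings
    the eye-in-body angle back to zero, so the body realigns with the
    (stabilized) gaze without any knowledge of its angular position."""
    yaw_rate_cmd = k_heading * eye_angle
    return body_yaw + yaw_rate_cmd * dt
```

    With this structure, a slap that rotates the body shows up as a gyro signal that the VOR cancels at the eye, while the residual eye-in-body angle becomes the error signal for the heading loop.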

    A mouse sensor and a 2-pixel motion sensor exposed to continuous illuminance changes

    Considerable attention has been paid during the last decade to navigation systems based on visual optic flow cues, especially for guiding autonomous robots designed to travel under specific lighting conditions. In the present study, the performances of two visual motion sensors used to measure a local 1-D angular speed, namely (i) a bio-inspired 2-pixel motion sensor and (ii) an off-the-shelf mouse sensor, were tested for the first time over a wide range of illuminance levels. The sensors' characteristics were determined by recording their responses to a purely rotational optic flow, generated by rotating the sensors mechanically, and comparing these responses with the output signal of an accurate rate gyro. The refresh rate, a key parameter for future optic flow-based robotic applications, was also defined and measured for both sensors. The bio-inspired 2-pixel motion sensor was found to be more accurate indoors, whereas the mouse sensor was found to be more efficient outdoors.
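    As a rough illustration of how a 2-pixel motion sensor turns two photoreceptor signals into a local 1-D angular speed, the sketch below uses a simple time-of-travel rule; the inter-receptor angle is an assumed value, not that of the actual sensor.

```python
# Hedged sketch of a "time of travel" 1-D angular speed measurement:
# a contrast edge sweeps across two adjacent photoreceptors, and the
# angular speed is the (known) inter-receptor angle divided by the
# measured crossing delay. The 4 deg angle is an assumption.

def angular_speed_deg_per_s(t_on_pixel1, t_on_pixel2,
                            inter_receptor_angle_deg=4.0):
    """The edge crosses pixel 1 at t_on_pixel1 and pixel 2 at
    t_on_pixel2 (seconds); return the local angular speed in deg/s."""
    dt = t_on_pixel2 - t_on_pixel1
    if dt <= 0:
        raise ValueError("edge must reach pixel 1 before pixel 2")
    return inter_receptor_angle_deg / dt
```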

    The VODKA sensor: a bio-inspired hyperacute optical position sensing device

    We have designed and built a simple optical sensor called VODKA (Vibrating Optical Device for the Kontrol of Autonomous robots), inspired by the "tremor" eye movements observed in many vertebrate and invertebrate animals. In the initial version presented here, the sensor relies on the repetitive micro-translation of a pair of photoreceptors set behind a small lens, and on processing designed to locate a target from the two photoreceptor signals. The VODKA sensor, in which the retinal micro-scanning movements are produced by a small piezo-bender actuator driven at a frequency of 40 Hz, was found to locate a contrasting edge with a resolution 900-fold greater than its static resolution (which is constrained by the interreceptor angle), regardless of the scanning law imposed on the retina. Hyperacuity is thus obtained at very low cost, opening new vistas for the accurate visuo-motor control of robotic platforms. As an example, the sensor was mounted onto a miniature aerial robot, which became able to track a moving target accurately by exploiting the robot's uncontrolled random vibrations as the source of its ocular micro-scanning movement. The simplicity, small size, low mass and low power consumption of this optical sensor make it highly suitable for many applications in metrology, astronomy, robotics, and automotive and aerospace engineering. The basic operating principle may also shed new light on the whys and wherefores of the tremor eye movements occurring in both animals and humans.
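    The hyperacuity principle can be illustrated with a toy model: two photoreceptors with overlapping angular sensitivities view the same edge, and comparing their two outputs localizes the edge far more finely than the inter-receptor angle. The Gaussian sensitivity profile, the numerical values of the angles, and the log-ratio demodulation rule below are illustrative assumptions, not the actual VODKA processing.

```python
import math

# Toy model of sub-receptor ("hyperacute") edge localization from two
# overlapping photoreceptor outputs. SIGMA and DELTA_PHI are assumed
# values; for Gaussian sensitivities, the log-ratio of the two signals
# is exactly linear in the edge azimuth, so inverting it recovers the
# edge position with precision far below the inter-receptor angle.

SIGMA = 2.0          # receptor acceptance half-width (deg), assumed
DELTA_PHI = 4.0      # inter-receptor angle (deg), assumed

def receptor(angle_from_axis_deg):
    """Gaussian angular sensitivity of one photoreceptor."""
    return math.exp(-(angle_from_axis_deg ** 2) / (2 * SIGMA ** 2))

def locate_edge(edge_deg):
    """Estimate the edge azimuth from the two receptor outputs."""
    s1 = receptor(edge_deg + DELTA_PHI / 2)   # receptor aimed left
    s2 = receptor(edge_deg - DELTA_PHI / 2)   # receptor aimed right
    return (SIGMA ** 2 / DELTA_PHI) * math.log(s2 / s1)
```

    In the real sensor the two signals are modulated by the 40 Hz micro-scanning movement rather than read statically, but the localization idea, comparing two overlapping receptor outputs, is the same.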

    A sighted aerial robot with fast gaze and heading stabilization

    Autonomous guidance of Micro-Air Vehicles (MAVs) in unknown environments is a challenging task because these artificial creatures have small aeromechanical time constants, which make them prone to be disturbed by gusts of wind. Flying insects are subject to quite similar kinds of disturbances, yet they navigate swiftly and deftly. They display high-performance visuo-motor control systems that have stood the test of time, and can therefore teach us how vision can be used for immediate and vital actions. We built a 50-gram tethered aerial demonstrator, called OSCAR II, which manages to keep its gaze steadily fixated on a target (a dark edge), in spite of the nasty thumps that we deliberately gave to its body with a custom-made "slapping machine". The robot's agile yaw reactions are based on:
    - a mechanical decoupling of the eye from the body
    - an active coupling of the robot's heading with its gaze
    - a Visual Fixation Reflex (VFR)
    - a Vestibulo-Ocular Reflex (VOR)
    - an accurate and fast actuator (Voice Coil Motor, VCM)
    The actuator is a 2.4-gram voice coil motor able to rotate the eye with a rise time as small as 12 ms, that is, much shorter than the rise time of human oculomotor saccades. In connection with a micro rate gyro, this actuator endows the robot with a high-performance vestibulo-ocular reflex that keeps the gaze locked onto the target whatever yaw perturbations affect the robot's body. Whenever the robot is destabilized (e.g., by a slap applied to one side), the gaze keeps fixating the target while serving as the reference to which the robot's heading is servoed. It then takes the robot only 0.6 s to realign its heading with its gaze.

    Exercises in Aerial Biorobotics (Exercices de Biorobotique aérienne)

    Any autonomous vehicle endowed with vision, whether underwater, terrestrial, aerial or spatial, generates major perturbations of its visual system's line of gaze during locomotion. These perturbations are of two types: external, due to the surrounding medium (drafts, wind gusts, vibrations, swell, marine currents, sandstorms on Mars...), and internal, due to the vehicle's own movements (rotations, beating of wings or fins, deliberate changes of direction or attitude, vibrations...). Although gaze stabilization systems are still uncommon in autonomous robotics, the study of animals (vertebrates and invertebrates alike) shows that nature has developed marvels of ingenuity that allow, for example, a heron or a common fly to fly without ever crashing, while keeping its visual system perfectly stabilized over episodes of several hundred milliseconds. These animals thus obtain visual information that is only slightly perturbed and can therefore be exploited directly by the neurons dedicated to processing visual signals, in particular the translational optic flow. The efforts animals make to stabilize their line of gaze prompt us to reproduce such mechanisms on board sighted robotic platforms. One may then ask which elements are essential for high-performance gaze stabilization. In winged insects, which have served as a model for our team for several decades, there is a fundamental head-body decoupling which, via visuo-inertial reflexes, maintains the orientation of the head despite the roll movements of the thorax that produce lateral displacement, in the manner of a helicopter. These "oculomotor" reflexes are also found in vertebrates, in the form of the "vestibulo-ocular" reflex combined with the "visual fixation" reflex. It is precisely these types of reflexes that we have implemented on board an autonomous aerial demonstrator.

    Decoupling the Eye: A Key toward a Robust Hovering for Sighted Aerial Robots

    Inspired by natural visual systems, in which gaze stabilization is at a premium, we simulated an aerial robot with a decoupled eye to achieve more robust hovering above a ground target despite strong lateral and rotational disturbances. In this paper, two different robots are compared under the same disturbances and displacements. The first robot is equipped with a fixed eye featuring a large field of view (FOV), while the second is endowed with a decoupled eye featuring a small FOV (about ±5°). Although this mechanical decoupling increases the mechanical complexity of the robot, this study demonstrates that disturbances are rejected faster and that the computational complexity is clearly decreased. Thanks to bio-inspired visuo-motor reflexes, the decoupled-eye robot is able to hold its gaze locked onto a distant target and to reject strong disturbances by exploiting the small inertia of the decoupled eye.

    Neuromimetic Robots inspired by Insect Vision

    Equipped with a less-than-one-milligram brain, insects fly autonomously in complex environments without resorting to any radar, ladar, sonar or GPS. The knowledge gained during the last decades on insects' sensory-motor abilities and the neuronal substrates involved provides us with a rich source of inspiration for designing tomorrow's self-guided vehicles and micro-vehicles, which will have to cope with unforeseen events on the ground, in the air, under water or in space. Insects have been in the business of sensory-motor integration for several hundred million years and can therefore teach us useful tricks for designing agile autonomous vehicles at various scales. Constructing a "biorobot" first requires formulating exactly the signal-processing principles at work in the animal. In return, it gives us a unique opportunity to check the soundness and robustness of those principles by bringing them face to face with the real physical world. Here we describe some of the visually guided terrestrial and aerial robots we have developed on the basis of our biological findings. These robots (Robot Fly, SCANIA, FANIA, OSCAR, OCTAVE and LORA) all react to the optic flow (i.e., the angular speed of the retinal image). Optic flow is sensed onboard the robots by miniature vision sensors called Elementary Motion Detectors (EMDs). The principle of these electro-optical velocity sensors was derived from optical/electrophysiological studies in which we recorded the responses of single neurons to optical microstimulation of single photoreceptor cells in a model visual system: the fly's compound eye. Optic flow-based sensors rely solely on the contrast provided by sunlight reflected (or scattered) from any kind of celestial body in a given spectral range. These nonemissive, power-lean sensors offer potential applications to manned or unmanned aircraft. Applications can also be envisaged for spacecraft, from robotic landers and rovers to asteroid explorers or space-station dockers, with interesting prospects as regards reduction in weight and power consumption.

    Bio-inspired Landing Approaches and Their Potential Use On Extraterrestrial Bodies

    Automatic landing on extraterrestrial bodies is still a challenging and hazardous task. Here we propose a new type of autopilot designed to solve landing problems, based on neurophysiological, behavioral, and biorobotic findings on flying insects. Flying insects excel in optic flow sensing and cope with highly parallel data at low energy and computational cost using lightweight, dedicated motion-processing circuits. In the first part of this paper, we present our biomimetic approach in the context of a lunar landing scenario, assuming a 2-degree-of-freedom spacecraft approaching the Moon, simulated with the PANGU software. The autopilot we propose relies only on optic flow (OF) and inertial measurements, and aims at regulating the OF generated during the landing approach by means of a feedback control system whose sensor is an OF sensor. We put forward an estimation method based on a two-sensor setup to accurately estimate the orientation of the lander's velocity vector, which is mandatory to control the lander's pitch in a near-optimal way with respect to fuel consumption. In the second part, we present a lightweight Visual Motion Sensor (VMS) which draws on the results of neurophysiological studies of the insect visual system. The VMS was able to perform local 1-D angular speed measurements in the range 1.5°/s to 25°/s. The sensor was mounted on an 80 kg unmanned helicopter and test-flown outdoors over various fields. The OF measured onboard was shown to match the ground-truth optic flow despite the dramatic disturbances and vibrations experienced by the sensor.
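    The core of OF regulation during a descent can be sketched in a few lines: the ventral optic flow seen by a lander is its ground speed divided by its height (v/h, in rad/s), and holding this ratio constant via thrust feedback automatically makes the craft slow down as the ground approaches. The toy model below is a 1-D vertical descent, not the paper's 2-DoF PANGU scenario; the gain, mass, set-point and time step are illustrative assumptions.

```python
# Hedged toy model of optic-flow-regulated descent under lunar
# gravity: a feedback loop adjusts thrust to hold the ventral optic
# flow (v/h) at a constant set-point. All parameters are assumptions.

def simulate_of_regulated_descent(h0=100.0, v0=20.0, of_setpoint=0.2,
                                  k_p=40.0, mass=1.0, g=1.62,
                                  dt=0.01, steps=5000):
    """Vertical-plane toy lander; returns (heights, speeds) histories.
    v is the descent speed (m/s), h the height above ground (m)."""
    h, v = h0, v0
    hs, vs = [h], [v]
    for _ in range(steps):
        of = v / h                                     # ventral OF (rad/s)
        thrust = mass * g + k_p * (of - of_setpoint)   # OF feedback
        v += (mass * g - thrust) / mass * dt           # descent dynamics
        h -= v * dt
        if h <= 1.0:                                   # near touchdown
            break
        hs.append(h)
        vs.append(v)
    return hs, vs
```

    Because the regulator keeps v/h near the set-point, the descent speed decays roughly in proportion to the height, producing the smooth, insect-like final approach the text describes.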

    Novel Hyperacute Gimbal Eye for Implementing Precise Hovering and Target Tracking on Board a Quadrotor

    This paper presents a new minimalist bio-inspired artificial eye of only 24 pixels, able to accurately locate a target placed in its small field of view (10°). The eye is mounted on a very light, custom-made gimbal system which makes it able to track a moving target faithfully. We show here that our gimbal eye can be embedded onboard a small quadrotor to achieve accurate hovering with respect to a target placed on the ground. Our airborne oculomotor system was enhanced with a bio-inspired reflex in charge of locking the robot's gaze efficiently onto the target and compensating for the robot's rotations and disturbances. The use of very few pixels made it possible to run the visual processing algorithm at a refresh rate as high as 400 Hz. This high refresh rate, coupled with very fast control of the eye's orientation, allowed the robot to track efficiently a target moving at speeds of up to 200°/s.

    Toward a fully autonomous hovercraft visually guided thanks to its own bio-inspired motion sensors

    Based on a biorobotic approach developed in our laboratory over the past 25 years, we have designed and built several terrestrial and aerial vehicles that control their position and speed on the basis of optic flow cues. In particular, in our project on the autonomous guidance of Micro-Air Vehicles (MAVs) in confined indoor and outdoor environments, we have developed a vision-based autopilot called LORA III (Lateral Optic flow Regulation Autopilot, Mark III). This autopilot, based on dual optic flow regulation, allows an air vehicle to travel along a corridor by automatically controlling both its speed and its clearance from the walls. Optic flow regulation is a feedback control scheme based on an optic flow sensor, which strives to maintain the perceived optic flow at a constant set-point by adjusting a thrust. The LORA III autopilot consists of a dual optic flow regulator in which each regulator has its own optic flow set-point and controls the robot's translation along one degree of freedom: a bilateral optic flow regulator controls the robot's forward speed, while a unilateral optic flow regulator controls the side thrust, making the robot avoid the walls of the corridor. This autopilot draws on former studies which aimed to understand how a honeybee might be able to center along a corridor, to follow a single wall, and to adjust its speed according to the corridor width. Computer-simulated experiments have shown that a miniature hovercraft equipped with the LORA III autopilot can navigate along a straight or tapered corridor at a relatively high speed (up to 1 m/s). This minimalistic visual system (comprising only four pixels) may suffice for the hovercraft to control both its clearance from the walls and its forward speed jointly, without ever measuring speed or distance, in a manner similar to what honeybees are thought to be capable of.
    The LORA robot is equipped with two rear thrusters and two lateral thrusters, in addition to the lift fan used to inflate the skirt. The hovercraft can move freely without any umbilicus, which makes both its system identification and its locomotion easier. However, the dynamics of all five motors turned out to be highly sensitive to the drop in supply voltage of the onboard Lithium Polymer (Li-Po) batteries, a critical issue for the identification of the robot's dynamical parameters. To perform an efficient system identification of the hovercraft's dynamics, we therefore gave each motor a dedicated controller that makes the four thrusters and the lift fan robust to any variations in the battery supply voltage.
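    The dual optic flow regulation scheme described above can be sketched as two coupled set-point controllers: one holds the sum of the two lateral optic flows (speed control), the other holds the larger of the two (wall clearance). The set-points, gains, and corridor kinematics below are illustrative assumptions, not the LORA III parameters.

```python
# Hedged sketch of dual optic-flow regulation in a straight corridor.
# The optic flow from a wall at lateral distance d, while moving
# forward at speed v, is v/d (rad/s). All numbers are assumptions.

def lora_step(v_forward, y, corridor_width, dt=0.01,
              of_fwd_set=2.0, of_side_set=1.2,
              k_fwd=0.5, k_side=0.3):
    """One control step; y is the distance from the left wall (m)."""
    d_left, d_right = y, corridor_width - y
    of_left, of_right = v_forward / d_left, v_forward / d_right

    # Bilateral regulator: the SUM of both optic flows sets the
    # forward speed (too much total flow -> slow down).
    v_forward += -k_fwd * (of_left + of_right - of_fwd_set) * dt

    # Unilateral regulator: the LARGER optic flow sets the side
    # thrust, pushing the robot away from the nearer wall.
    err = max(of_left, of_right) - of_side_set
    y += k_side * err * dt * (1 if of_left > of_right else -1)
    return v_forward, y
```

    Note that neither speed nor distance is ever measured explicitly: both regulators act only on optic flow errors, yet the robot settles at a speed and a wall clearance fixed by the two set-points, which is the honeybee-like behavior the abstract describes.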