212 research outputs found

    Flying insect inspired vision for autonomous aerial robot maneuvers in near-earth environments

    Proceedings of the 2004 IEEE International Conference on Robotics & Automation. Retrieved April 2006 from http://prism2.mem.drexel.edu/~paul/papers/greenOhBarrowsIcra2004.pdf

    Safeguarding near-Earth environments is time-consuming, labor-intensive and potentially dangerous. Accomplishing tasks like bomb detection, search-and-rescue and reconnaissance with aerial robots could save these resources. This paper describes the adoption of insect behavior and flight patterns to develop a MAV sensor suite. A prototype called CQAR (Closed Quarter Aerial Robot), which is capable of flying in and around buildings, through tunnels and in and out of caves, will be used to validate the efficiency of such a method when equipped with optic flow microsensors.

    Taking Inspiration from Flying Insects to Navigate inside Buildings

    These days, flying insects are seen as genuinely agile micro air vehicles fitted with smart sensors that are also parsimonious in their use of brain resources. They are able to navigate visually in unpredictable, GPS-denied environments. Understanding how such tiny animals work would help engineers resolve several issues relating to drone miniaturization and navigation inside buildings. To turn a drone of ~1 kg into a robot, miniaturized conventional avionics can be employed; however, this comes at the cost of flight autonomy. Turning a drone with a mass between ~1 g (or less) and ~500 g into a robot requires a more innovative approach, taking inspiration from flying insects both in their flapping-wing propulsion system and in their sensory system, which is based mainly on motion vision, in order to avoid obstacles in three dimensions or to navigate on the basis of visual cues. This chapter provides a snapshot of the current state of the art in the field of bioinspired optic flow sensors and optic flow-based direct feedback loops applied to micro air vehicles flying inside buildings.
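The motion vision referred to above rests on the standard translational optic flow model: a surface point at distance D, seen at an angle θ from the flight direction, sweeps across the retina at an angular speed of (v/D)·sin θ. A minimal sketch (the function name and the numbers are ours, for illustration only):

```python
import math

def translational_optic_flow(speed, distance, gaze_angle_rad):
    """Angular speed (rad/s) of the image of a surface point seen at
    gaze_angle_rad from the flight direction, at range distance,
    while flying at speed. Standard translational optic flow model."""
    return (speed / distance) * math.sin(gaze_angle_rad)

# Flying at 1 m/s, 1 m above flat ground: the ventral flow
# (gaze 90 degrees down) is 1 rad/s.
ventral = translational_optic_flow(1.0, 1.0, math.pi / 2)
```

Because speed and distance enter only as a ratio, a single optic flow reading cannot separate them; this is why the feedback schemes in the following abstracts regulate the flow itself rather than estimating range or groundspeed.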

    The Vertical Optic Flow: An Additional Cue for Stabilizing Beerotor Robot’s Flight Without IMU

    Bio-inspired guidance principles involving no reference frame are presented here; they were implemented in a rotorcraft called Beerotor, which was equipped with a minimalistic panoramic optic flow sensor but no accelerometer and no inertial measurement unit (IMU) [9], as in flying insects (Dipteran flies use only rotation rates). In the present paper, the vertical optic flow is used as an additional cue, whereas the previously published Beerotor II visuo-motor system used only translational optic flow cues [9]. To test these guidance principles, we built a tethered 80-g tandem rotorcraft called Beerotor, which flies along a high-roofed tunnel. The aerial robot adjusts its pitch, and hence its speed, hugs the ground and lands safely without any need for an inertial reference frame. The rotorcraft's altitude and forward speed are adjusted via several optic flow feedback loops, piloting the lift and the pitch angle on the basis of the common-mode and differential rotor speeds, respectively, as well as an active reorientation system for a quasi-panoramic eye which constantly realigns its gaze, keeping it parallel to the nearest surface being followed. Safe automatic terrain following and landing were obtained with the active eye-reorientation system over rugged terrain, without any need for an inertial reference frame.
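The division of labor between the two loops (common-mode rotor speed for lift, differential rotor speed for pitch) can be caricatured as two proportional regulators driven by the ventral and dorsal flows. This is a toy sketch under our own assumptions; the gains, setpoint and mixing are illustrative, not the published Beerotor controller:

```python
def dual_of_regulator(of_ventral, of_dorsal,
                      of_setpoint=1.0, k_lift=0.5, k_pitch=0.3,
                      base_rpm=1000.0):
    """Toy two-loop optic-flow regulator (all constants illustrative).
    The larger of the ventral/dorsal flows (i.e. the nearest surface)
    drives the common-mode rotor speed (lift); the ventral/dorsal
    difference drives the differential rotor speed (pitch)."""
    of_nearest = max(of_ventral, of_dorsal)
    common_mode = base_rpm + k_lift * (of_setpoint - of_nearest) * base_rpm
    differential = k_pitch * (of_ventral - of_dorsal) * base_rpm
    # Front and rear rotor speed commands of the tandem rotorcraft.
    return common_mode + differential, common_mode - differential
```

When both flows sit at the setpoint, the two rotors receive the same base command; an excess of ventral flow (ground too close or too fast) tilts the commands to pitch the craft and shed speed.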

    Neuromimetic Robots inspired by Insect Vision

    Equipped with a less-than-one-milligram brain, insects fly autonomously in complex environments without resorting to any radars, ladars, sonars or GPS. The knowledge gained during recent decades about insects' sensory-motor abilities and the neuronal substrates involved provides a rich source of inspiration for designing tomorrow's self-guided vehicles and micro-vehicles, which will have to cope with unforeseen events on the ground, in the air, under water or in space. Insects have been in the business of sensory-motor integration for several hundred million years and can therefore teach us useful tricks for designing agile autonomous vehicles at various scales. Constructing a "biorobot" first requires an exact formulation of the signal-processing principles at work in the animal. In return, it gives us a unique opportunity to check the soundness and robustness of those principles by bringing them face to face with the real physical world. Here we describe some of the visually guided terrestrial and aerial robots we have developed on the basis of our biological findings. These robots (Robot Fly, SCANIA, FANIA, OSCAR, OCTAVE and LORA) all react to the optic flow (i.e., the angular speed of the retinal image). Optic flow is sensed onboard the robots by miniature vision sensors called Elementary Motion Detectors (EMDs). The principle of these electro-optical velocity sensors was derived from optical/electrophysiological studies in which we recorded the responses of single neurons to optical microstimulation of single photoreceptor cells in a model visual system: the fly's compound eye. Optic flow sensors rely solely on the contrast provided by sunlight reflected (or scattered) from any kind of celestial body in a given spectral range. These nonemissive, power-lean sensors offer potential applications to manned or unmanned aircraft. Applications can also be envisaged for spacecraft, from robotic landers and rovers to asteroid explorers or space-station dockers, with interesting prospects as regards reductions in weight and power consumption.
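The core measurement behind such an EMD, in its "time of travel" form used by this group's sensors, is simple: a moving contrast edge is detected first by one photoreceptor and, after a delay, by a neighbor an inter-receptor angle Δφ away; the angular speed is Δφ over that delay. A minimal sketch (names and numbers are ours):

```python
def time_of_travel_angular_speed(t1, t2, delta_phi_deg):
    """'Time of travel' EMD reading: a contrast edge crosses
    photoreceptor 1 at time t1 (s) and photoreceptor 2, located
    delta_phi_deg away, at time t2. The angular speed (deg/s) is the
    inter-receptor angle divided by the travel time."""
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("edge must reach receptor 2 after receptor 1")
    return delta_phi_deg / dt

# An edge taking 0.1 s to travel between receptors 4 degrees apart
# corresponds to an angular speed of 40 deg/s.
speed = time_of_travel_angular_speed(0.0, 0.1, 4.0)
```

Note the reciprocal relationship: the delay, not the speed, is what is measured, so slow motion gives long, easily timed delays while fast motion compresses them, which is one reason such sensors are specified over a bounded angular speed range.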

    Controlling docking, altitude and speed in a circular high-roofed tunnel thanks to the optic flow

    The new robot we have developed, called BeeRotor, is a tandem rotorcraft that mimics optic flow-based behaviors previously observed in flies and bees. This tethered 80-g miniature robot, which is autonomous in terms of its computational power requirements, is equipped with a 13.5-g quasi-panoramic visual system consisting of four individual visual motion sensors responding to the optic flow generated by photographs of natural scenes, thanks to the bio-inspired "time of travel" scheme. Based on recent findings on insects' sensing abilities and control strategies, the BeeRotor robot was designed to use optic flow to perform complex tasks such as ground and ceiling following, while also automatically driving its forward speed on the basis of the ventral or dorsal optic flow. In addition, the BeeRotor robot can perform tricky manoeuvres such as automatic ceiling docking simply by regulating its dorsal or ventral optic flow in a high-roofed tunnel depicting natural scenes. Although built as a proof of concept, the BeeRotor robot is one step further towards a fully autonomous micro-helicopter capable of navigating mainly on the basis of the optic flow.
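Why does merely regulating the dorsal optic flow produce ceiling docking? Since the dorsal flow is roughly v/d (forward speed over ceiling distance), holding it at a setpoint while the speed stays fixed drives the distance to v/setpoint: raising the setpoint pulls the craft toward the ceiling. A toy 1-D simulation under our own assumptions (gain, time step and setpoint are illustrative):

```python
def simulate_ceiling_docking(v_forward=1.0, of_setpoint=2.0,
                             d0=1.0, k=0.5, dt=0.01, steps=2000):
    """Toy dorsal optic-flow regulation: the climb rate is commanded in
    proportion to the dorsal-flow error, so the distance to the ceiling
    settles at v_forward / of_setpoint (all constants illustrative)."""
    d = d0  # current distance to the ceiling (m)
    for _ in range(steps):
        of_dorsal = v_forward / d                   # perceived dorsal flow
        climb_rate = k * (of_setpoint - of_dorsal)  # flow too low -> climb
        d -= climb_rate * dt                        # climbing shrinks d
        d = max(d, 1e-3)
    return d
```

With v_forward = 1 m/s and a 2 rad/s setpoint, the simulated craft converges from 1 m to about 0.5 m below the ceiling, mirroring the docking behavior described above.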

    Bio-inspired Landing Approaches and Their Potential Use On Extraterrestrial Bodies

    Automatic landing on extraterrestrial bodies is still a challenging and hazardous task. Here we propose a new type of autopilot designed to solve landing problems, based on neurophysiological, behavioral and biorobotic findings on flying insects. Flying insects excel in optic flow sensing techniques and cope with highly parallel data at a low energy and computational cost using lightweight, dedicated motion-processing circuits. In the first part of this paper, we present our biomimetic approach in the context of a lunar landing scenario, assuming a 2-degree-of-freedom spacecraft approaching the Moon, simulated with the PANGU software. The autopilot we propose relies only on optic flow (OF) and inertial measurements, and aims at regulating the OF generated during the landing approach by means of a feedback control system whose sensor is an OF sensor. We put forward an estimation method based on a two-sensor setup to accurately estimate the orientation of the lander's velocity vector, which is mandatory for controlling the lander's pitch in a near-optimal way with respect to fuel consumption. In the second part, we present a lightweight Visual Motion Sensor (VMS) which draws on the results of neurophysiological studies of the insect visual system. The VMS was able to perform local 1-D angular speed measurements in the range 1.5°/s to 25°/s. The sensor was mounted on an 80-kg unmanned helicopter and test-flown outdoors over various fields. The OF measured onboard was shown to match the ground-truth optic flow despite the considerable disturbances and vibrations experienced by the sensor.
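The attraction of regulating the OF during descent can be seen in a drastically simplified 1-D version of the scenario (our caricature, not the paper's 2-DOF PANGU setup): if the descent speed is commanded so that v/h stays at a constant setpoint ω, the altitude follows h(t) = h0·e^(-ωt), so the lander slows automatically as the ground approaches, with touchdown speed tending to zero:

```python
import math

def constant_of_descent(h0=100.0, of_setpoint=0.5, dt=0.01, t_end=5.0):
    """1-D sketch of optic-flow-regulated descent (constants illustrative):
    commanding the descent speed v = of_setpoint * h keeps the ventral
    flow v/h constant and yields an exponential altitude profile."""
    h = h0
    for _ in range(int(t_end / dt)):
        v = of_setpoint * h  # speed command from the regulated OF
        h -= v * dt
    return h

h_final = constant_of_descent()
# close to the analytic value 100 * exp(-0.5 * 5) = 100 * exp(-2.5)
```

No altimeter or velocimeter appears anywhere in the loop: the single ratio v/h, which is exactly what an OF sensor measures, suffices.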

    Optic Flow Based Autopilots: Speed Control and Obstacle Avoidance

    The explicit control schemes presented here explain how insects may navigate on the sole basis of optic flow (OF) cues, without requiring any distance or speed measurements: how they take off and land, follow the terrain, avoid the lateral walls of a corridor and control their forward speed automatically. The optic flow regulator, a feedback system controlling either the lift, the forward thrust or the lateral thrust, is described. Three OF regulators account for various insect flight patterns observed over the ground and over still water, under calm and windy conditions, and in straight and tapered corridors. These control schemes were simulated experimentally and/or implemented onboard two types of aerial robots, a micro-helicopter (MH) and a hovercraft (HO), which behaved much like insects when placed in similar environments. These robots were equipped with opto-electronic OF sensors inspired by our electrophysiological findings on houseflies' motion-sensitive visual neurons. The simple, parsimonious control schemes described here require no conventional avionic devices such as range finders, groundspeed sensors or GPS receivers. They are consistent with the neural repertoire of flying insects and meet the low avionic payload requirements of autonomous micro aerial and space vehicles.
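For the corridor case, one classic bee-inspired scheme (our illustrative sketch, not necessarily the exact regulator structure of this paper) couples two loops to the two lateral flows: their sum throttles the forward speed, so the craft automatically slows in a narrowing (tapered) corridor, while their imbalance steers the craft away from the nearer wall:

```python
def corridor_of_control(d_left, d_right, v_forward,
                        of_sum_setpoint=2.0, k_speed=0.2, k_side=0.1):
    """Toy corridor controller (gains and setpoint illustrative).
    Lateral flows are modeled as v/d on each side; returns a forward
    acceleration command and a sideways command (positive = rightward)."""
    of_left = v_forward / d_left
    of_right = v_forward / d_right
    # Sum of flows too high (walls close / flying fast) -> slow down.
    dv_forward = k_speed * (of_sum_setpoint - (of_left + of_right))
    # Nearer wall produces the larger flow -> steer away from it.
    dv_side = k_side * (of_left - of_right)
    return dv_forward, dv_side
```

Flying centered in a corridor whose half-width matches the setpoint yields zero commands; drifting toward the left wall raises the left flow, producing a rightward command and, because the flow sum rises, a braking command as well.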

    CQAR: Closed quarter aerial robot design for reconnaissance, surveillance and target acquisition tasks in urban areas

    International Journal of Computational Intelligence, Volume 1, Number 4, 2004. Retrieved April 2006 from http://prism2.mem.drexel.edu/~paul/papers/ohIjci2004.pdf

    This paper describes a prototype aircraft that can fly slowly and safely and transmit wireless video for tasks like reconnaissance, surveillance and target acquisition. The aircraft is designed to fly in close quarters like forests, buildings, caves and tunnels, which are often spacious but where GPS reception is poor. Envisioned is a small, safe and slow-flying vehicle that can assist in performing dull, dangerous and dirty tasks like disaster mitigation, search-and-rescue and structural damage assessment.