    The Vertical Optic Flow: An Additional Cue for Stabilizing Beerotor Robot’s Flight Without IMU

    Bio-inspired guidance principles involving no reference frame are presented here and were implemented in a rotorcraft called Beerotor, which was equipped with a minimalistic panoramic optic flow sensor and no accelerometer or inertial measurement unit (IMU) [9], as in flying insects (Dipteran flies sense only rotation rates). In the present paper, the vertical optic flow was used as an additional cue, whereas the previously published Beerotor II visuo-motor system used only translational optic flow cues [9]. To test these guidance principles, we built a tethered tandem rotorcraft called Beerotor (80 g), which flies along a high-roofed tunnel. The aerial robot adjusts its pitch and hence its speed, hugs the ground, and lands safely without any need for an inertial reference frame. The rotorcraft's altitude and forward speed are adjusted via several optic flow feedback loops that pilot the lift and the pitch angle via the common-mode and differential rotor speeds, respectively, as well as by an active reorientation system that constantly realigns the gaze of a quasi-panoramic eye, keeping it parallel to the nearest surface followed. Safe automatic terrain following and landing were obtained with the active eye-reorientation system over rugged terrain, without any need for an inertial reference frame.
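
    A minimal sketch of such optic flow feedback loops is given below. It is an illustration of the regulation principle described in the abstract, not the published Beerotor II controller; all setpoints and gains (OF_SET_*, K_*) and the state variable names are hypothetical.

        # Sketch of optic flow regulation loops, assuming proportional control.
        OF_SET_ALT = 2.5    # surface-loop optic flow setpoint (rad/s), hypothetical
        OF_SET_SPEED = 1.5  # speed-loop optic flow setpoint (rad/s), hypothetical
        K_ALT, K_SPEED = 0.1, 0.05  # hypothetical proportional gains

        def regulate(of_measured, of_setpoint, gain):
            """Proportional correction that holds an optic flow at its setpoint."""
            return gain * (of_setpoint - of_measured)

        def beerotor_like_step(of_ventral, of_dorsal, of_forward, state):
            # Surface-following loop: hold the optic flow generated by the
            # nearest surface (ventral or dorsal) constant by adjusting the
            # common-mode rotor speed, i.e. the lift.
            of_nearest = max(of_ventral, of_dorsal)
            state["omega_common"] += regulate(of_nearest, OF_SET_ALT, K_ALT)
            # Speed loop: hold the forward optic flow constant by adjusting the
            # differential rotor speed, which sets pitch and hence airspeed.
            state["omega_diff"] += regulate(of_forward, OF_SET_SPEED, K_SPEED)
            return state

    Because both loops regulate optic flow (a ratio of speed to distance), neither requires an absolute attitude or altitude estimate, which is the point of the IMU-free design.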

    A simple algorithm for distance estimation without radar and stereo vision based on the bionic principle of bee eyes

    Simple navigation algorithms are needed for small autonomous unmanned aerial vehicles (UAVs). Such algorithms can be implemented on a small microprocessor with low power consumption, which helps to reduce the weight of the UAV's computing equipment and to increase the flight range. The proposed algorithm uses only the number of opaque channels (ommatidia in bees) through which a target can be seen as the observer moves from location 1 to location 2 toward the target. The distance estimate is given relative to the distance between locations 1 and 2. A simple scheme of an apposition compound eye is proposed for deriving the calculation formula. Error analysis shows that the distance estimation error decreases as the total number of opaque channels increases, up to a certain limit. An acceptable error of about 2 % is achieved for angles of view from 3 to 10° when the total number of opaque channels is 21,600.
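
    The abstract does not reproduce the formula itself, so the sketch below is a plausible small-angle reconstruction rather than the paper's exact derivation. It assumes the target subtends n1 ommatidial channels at location 1 and n2 > n1 channels at location 2, after moving a baseline b straight toward it; the ommatidial pitch then cancels out of the estimate.

        # Hedged reconstruction under a small-angle model: target width
        # w ~ n1*dphi*d1 ~ n2*dphi*d2 with d1 = d2 + b, hence
        # d2 / b = n1 / (n2 - n1), independent of the channel pitch dphi.

        def relative_distance(n1: int, n2: int) -> float:
            """Distance from location 2 to the target, in baseline units."""
            if n2 <= n1:
                raise ValueError("moving toward the target must increase n")
            return n1 / (n2 - n1)

        # Example: 20 channels at location 1, 24 at location 2 means the
        # target lies 5 baselines beyond location 2.
        print(relative_distance(20, 24))  # 5.0
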

    Insect inspired visual motion sensing and flying robots

    Flying insects have mastered visual motion sensing: they use dedicated motion processing circuits at low energy and computational cost. Building on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed by a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes that drive the eye's angular position in the robot and the robot's body angular position with respect to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
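
    A minimal sketch of the two interdependent loops is shown below. It illustrates the "steering-by-gazing" idea as described in the abstract, not the OSCAR II implementation; the gains and signal names are hypothetical.

        # Two coupled loops, assuming proportional control with gyro damping.
        K_EYE = 8.0      # eye loop gain (hypothetical)
        K_HEADING = 2.0  # heading loop gain (hypothetical)

        def steering_by_gazing_step(retinal_error, eye_angle, yaw_rate):
            """One control step.

            retinal_error: target offset from the gaze axis (rad)
            eye_angle:     eye orientation relative to the body (rad)
            yaw_rate:      body yaw rate from a rate gyro (rad/s)
            Returns (eye_rate_cmd, yaw_torque_cmd).
            """
            # Fast inner loop: rotate the eye to keep the target foveated.
            eye_rate_cmd = K_EYE * retinal_error
            # Slower outer loop: steer the body so the eye re-centres in the
            # head, i.e. the robot ends up heading where it is gazing. Only
            # relative angles and rates are used; no global orientation.
            yaw_torque_cmd = K_HEADING * eye_angle - 0.5 * yaw_rate
            return eye_rate_cmd, yaw_torque_cmd
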

    Vision systems for autonomous aircraft guidance


    Vision-based control of near-obstacle flight

    This paper presents a novel control strategy, which we call optiPilot, for autonomous flight in the vicinity of obstacles. Most existing autopilots rely on a complete 6-degree-of-freedom state estimation using a GPS and an Inertial Measurement Unit (IMU) and are unable to detect and avoid obstacles. This is a limitation for missions such as surveillance and environment monitoring that may require near-obstacle flight in urban areas or mountainous environments. OptiPilot instead uses optic flow to estimate the proximity of obstacles and avoid them. Our approach takes advantage of the fact that, for most platforms in translational flight (as opposed to near-hover flight), the translatory motion is essentially aligned with the aircraft's main axis. This property allows us to interpret optic flow measurements directly as proximity indications. We take inspiration from the neural and behavioural strategies of flying insects to propose a simple mapping of optic flow measurements into control signals that requires only a lightweight, power-efficient sensor suite and minimal processing power. In this paper, we first describe results obtained in simulation before presenting the implementation of optiPilot on a real flying platform equipped only with lightweight and inexpensive optical mouse sensors, MEMS rate gyroscopes and a pressure-based airspeed sensor. We show that the proposed control strategy not only allows collision-free flight in the vicinity of obstacles, but is also able to stabilise both attitude and altitude over flat terrain. These results shed new light on flight control by suggesting that the complex sensors and processing required for 6-degree-of-freedom state estimation may not be necessary for autonomous flight, and pave the way toward the integration of autonomy into current and upcoming gram-scale flying platforms.
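
    The geometric fact the abstract relies on can be sketched as follows: for pure translation at airspeed v along the main axis, a surface point at distance D and eccentricity theta from that axis produces a translational optic flow p = (v / D) * sin(theta), so proximity 1/D is recoverable as p / (v * sin(theta)). The sketch below maps a ring of such proximity estimates to pitch and roll commands; the published optiPilot weighting is not reproduced here, and the gains and sensor geometry are hypothetical.

        import math

        # Assumes translational flight aligned with the main axis, a ring of
        # optic flow sensors at a common eccentricity ECC, and azimuth 0 = up.
        ECC = math.radians(45)          # hypothetical sensor eccentricity
        K_PITCH, K_ROLL = 0.8, 0.8      # hypothetical control gains

        def proximity(flow, airspeed):
            """1/distance (1/m) from translational optic flow (rad/s)."""
            return flow / (airspeed * math.sin(ECC))

        def control_commands(flows, azimuths, airspeed):
            """Weighted-sum mapping of proximities to pitch/roll commands."""
            mu = [proximity(f, airspeed) for f in flows]
            # Vertical components of nearby obstacles push the pitch command,
            # lateral components push the roll command.
            pitch_cmd = K_PITCH * sum(m * math.cos(a) for m, a in zip(mu, azimuths))
            roll_cmd = K_ROLL * sum(m * math.sin(a) for m, a in zip(mu, azimuths))
            return pitch_cmd, roll_cmd

    Because proximity scales with measured flow divided by airspeed, only rate gyroscopes (to cancel rotational flow) and an airspeed sensor are needed alongside the flow sensors, matching the minimal sensor suite described above.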