47 research outputs found

    The Vertical Optic Flow: An Additional Cue for Stabilizing Beerotor Robot’s Flight Without IMU

    Bio-inspired guidance principles involving no reference frame are presented here and were implemented in a rotorcraft called Beerotor, which was equipped with a minimalistic panoramic optic flow sensor but no accelerometer and no inertial measurement unit (IMU) [9], as in flying insects (Diptera use only rotation rates). In the present paper, the vertical optic flow was used as an additional cue, whereas the previously published Beerotor II visuo-motor system used only translational optic flow cues [9]. To test these guidance principles, we built a tethered tandem rotorcraft called Beerotor (80 g), which flies along a high-roofed tunnel. The aerial robot adjusts its pitch and hence its speed, hugs the ground, and lands safely without any need for an inertial reference frame. The rotorcraft's altitude and forward speed are adjusted via several optic flow feedback loops, which pilot the lift and the pitch angle on the basis of the common-mode and differential rotor speeds, respectively, together with an active system that reorients a quasi-panoramic eye, constantly realigning its gaze to keep it parallel to the nearest surface followed. Safe automatic terrain following and landing were obtained with the active eye-reorientation system over rugged terrain, without any need for an inertial reference frame.
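The feedback loops described above all regulate a translational optic flow, which for a surface at distance h and speed v is simply v/h. A minimal sketch of one such loop, assuming a 1-D model with an illustrative gain and function names not taken from the Beerotor papers:

```python
# Minimal sketch of an optic-flow regulation loop (assumptions: 1-D model,
# proportional control; gains and names are illustrative, not the paper's).

def ventral_optic_flow(forward_speed, height):
    """Translational optic flow (rad/s) seen by a downward-looking eye."""
    return forward_speed / height

def of_feedback_step(height, forward_speed, of_setpoint, dt=0.01, gain=2.0):
    """One step of a proportional loop: climb or descend so that the
    measured ventral optic flow tracks the setpoint."""
    error = of_setpoint - ventral_optic_flow(forward_speed, height)
    # Flow too low (positive error) -> descend to increase it; too high -> climb.
    climb_rate = -gain * error
    return max(height + climb_rate * dt, 0.1)
```

Because the regulated quantity is a ratio v/h, the loop settles at a height proportional to the forward speed, which is why such a regulator produces terrain following and, as speed is reduced, automatic landing.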

    Optic-flow-based steering and altitude control for ultra-light indoor aircraft

    Our goal is to demonstrate the ability of bio-inspired techniques to solve the problem of piloting an autonomous 40-gram aircraft within textured indoor environments. Because of severe weight and energy constraints, inspiration has been taken from the fly, and only visual and vestibular-like sensors are employed. This paper describes the models and algorithms that will be used for altitude control and frontal obstacle avoidance, relying mainly on optic flow. For experimental convenience, both mechanisms have first been implemented and tested on a small wheeled robot featuring the same hardware as the targeted aircraft.
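Frontal obstacle avoidance of this kind typically exploits image expansion: the inverse of the expansion rate equals the time to contact, with no need to know distance or speed separately. A hedged sketch, with threshold and names chosen for illustration rather than taken from the paper:

```python
# Illustrative sketch (assumed names and threshold): frontal obstacle
# detection from optic-flow divergence. The image expansion rate of an
# approaching surface equals v/D, so its inverse is the time to contact.

def time_to_contact(distance, approach_speed):
    """tau = D / v; equal to the inverse of the image expansion rate."""
    return distance / approach_speed

def should_turn(distance, approach_speed, tau_threshold=1.0):
    """Trigger an avoidance turn when time to contact drops below threshold."""
    return time_to_contact(distance, approach_speed) < tau_threshold
```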

    Toward 30-gram Autonomous Indoor Aircraft: Vision-based Obstacle Avoidance and Altitude Control

    We aim to develop autonomous micro-flyers capable of navigating within houses or small built environments. The severe weight and energy constraints of indoor flying platforms led us to take inspiration from flying insects for the selection of sensors, signal processing, and behaviors. This paper presents the control strategies enabling obstacle avoidance and altitude control using only optic flow and gyroscopic information. For experimental convenience, the control strategies are first implemented and tested separately on a small wheeled robot featuring the same hardware as the targeted aircraft. The obstacle avoidance system is then transferred to a 30-gram aircraft, which demonstrates autonomous steering within a square textured arena.
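Steering inside an arena with lateral optic flow is often implemented as a centering response: turn away from the side where flow is stronger, since higher flow means a nearer surface. A minimal sketch, assuming a simple proportional law and illustrative names not specified by the paper:

```python
# Sketch of a centering response (assumption: proportional steering law;
# the gain and sign convention are illustrative, not from the paper).

def steering_command(left_of, right_of, gain=0.5):
    """Yaw command that turns away from the side with higher optic flow.
    Positive output = turn right (away from a near left wall)."""
    return gain * (left_of - right_of)
```

When both flows are equal the command is zero, so the robot settles onto a path midway between the two walls.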

    Taking Inspiration from Flying Insects to Navigate inside Buildings

    Flying insects can today be seen as genuinely agile micro air vehicles fitted with smart sensors, parsimonious in their use of brain resources, and able to navigate visually in unpredictable, GPS-denied environments. Understanding how such tiny animals work would help engineers address several issues relating to drone miniaturization and navigation inside buildings. A drone of ~1 kg can be turned into a robot using miniaturized conventional avionics; however, this comes at the cost of reduced flight endurance. Turning a drone with a mass between ~1 g (or less) and ~500 g into a robot, by contrast, requires an innovative approach inspired by flying insects, both in their flapping-wing propulsion system and in their sensory system, which relies mainly on motion vision to avoid obstacles in three dimensions or to navigate on the basis of visual cues. This chapter provides a snapshot of the current state of the art in the field of bioinspired optic flow sensors and optic-flow-based direct feedback loops applied to micro air vehicles flying inside buildings.

    Optic-flow-based Altitude and Forward Speed Regulation using Stereotypical Lateral Movements

    We propose a novel optic-flow-based flight control strategy, inspired by recent observations and a hypothesis by Baird (unpublished), to regulate forward speed and altitude independently. Unlike previous approaches, in which longitudinal ventral optic flow was used to regulate both forward speed and altitude, we suggest using the transversal ventral optic flow generated by a stereotyped lateral oscillation to regulate altitude, while longitudinal ventral optic flow is still used to regulate forward speed. The main advantage of this strategy is that it allows any combination of forward speed and altitude, which is not possible using longitudinal ventral optic flow alone. We show that this control strategy can drive a helicopter- or insect-like simulated agent at any combination of forward speed and altitude. Moreover, by modulating the open-loop oscillatory drive of the roll behaviour, this strategy also achieves roll stabilisation.

    Science, technology and the future of small autonomous drones

    We are witnessing the advent of a new era of robots, drones, that can autonomously fly in natural and man-made environments. These robots, often associated with defence applications, could have a major impact on civilian tasks, including transportation, communication, agriculture, disaster mitigation and environment preservation. Autonomous flight in confined spaces presents great scientific and technical challenges, owing to the energetic cost of staying airborne and to the perceptual intelligence required to negotiate complex environments. We identify scientific and technological advances that are expected to translate, within appropriate regulatory frameworks, into pervasive use of autonomous drones for civilian applications.

    The roles of visual parallax and edge attraction in the foraging behaviour of the butterfly Papilio xuthus.

    Several examples of insects using visual motion to measure distance have been documented, from locusts peering to gauge the proximity of prey, to honeybees performing visual odometry en route between the hive and a flower patch. However, whether the use of parallax information is confined to specialised behaviours like these or represents a more general-purpose sensory capability is an open question. We investigate this issue in the foraging swallowtail butterfly Papilio xuthus, which we trained to associate a target presented on a monitor with a food reward. We then tracked the animal's flight in real time, allowing us to manipulate the size and/or position of the target in a closed-loop manner to create the illusion that it is situated either above or below the monitor surface. We found that butterflies are less attracted to (i.e. slower to approach) targets that appear, based on motion parallax, to be more distant. Furthermore, we found that the number of abortive descent manoeuvres performed prior to the first successful target approach varies according to the depth of the virtual target, with expansion and parallax cues having effects of opposing polarity. However, we found no evidence that Papilio modulate the kinematic parameters of their descents according to the apparent distance of the target. Thus, we argue that motion parallax is used to identify a proximal target object, but that the subsequent process of approaching it is based on stabilising its edge in the 2D space of the retina, without estimating its distance.

    Optical-Flow Based Strategies for Landing VTOL UAVs in Cluttered Environments

    This paper considers the question of landing an Unmanned Aerial Vehicle (UAV) using a single monocular camera as the primary exteroceptive sensing modality. The proposed control law is based on tracking a single point feature, representing the desired landing point on a ground plane, along with optical flow computed over the full image. The bearing of the desired landing point is used as a driving term to force convergence, while the optical flow provides a damping force that guarantees obstacle avoidance and damps the vehicle's convergence to the ground plane, ensuring a soft touchdown. A detailed analysis of the closed-loop system dynamics is undertaken, and the response of the system is verified in simulation.
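A common way to see why optic-flow damping yields a soft touchdown (a simplified sketch, not the paper's actual control law): if the descent is regulated so that the ventral flow v/h stays constant, then v shrinks in proportion to h, giving an exponential height profile with vertical speed vanishing at contact.

```python
# Simplified sketch (assumption: constant-optic-flow descent, not the
# paper's bearing-plus-flow control law; names and values are illustrative).

def constant_flow_descent(h0, of_setpoint, dt=0.01, steps=1000):
    """Descend while holding ventral optic flow v/h at the setpoint.
    Returns the resulting height trajectory."""
    h = h0
    trajectory = []
    for _ in range(steps):
        descent_speed = of_setpoint * h  # v = w * h keeps v/h = w constant
        h = max(h - descent_speed * dt, 0.0)
        trajectory.append(h)
    return trajectory
```

The descent speed decays together with height, so the vehicle arrives at the ground with near-zero vertical velocity.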