13 research outputs found

    A contribution to vision-based autonomous helicopter flight in urban environments

    A navigation strategy that exploits optic flow and inertial information to continuously avoid collisions with both lateral and frontal obstacles has been used to control a simulated helicopter flying autonomously in a textured urban environment. Experimental results demonstrate that the corresponding controller generates cautious behavior, whereby the helicopter tends to stay in the middle of narrow corridors while its forward velocity is automatically reduced when the obstacle density increases. When confronted with a frontal obstacle, the controller is also able to generate a tight U-turn that ensures the UAV's survival. The paper provides comparisons with related work and discusses the applicability of the approach to real platforms.

    Biomimetic visual navigation in a corridor: to centre or not to centre?

    As a first step toward an Automatic Flight Control System (AFCS) for Micro-Air Vehicle (MAV) obstacle avoidance, we introduce a vision-based autopilot (LORA: Lateral Optic flow Regulation Autopilot), which is able to make a hovercraft automatically follow a wall or centre between the two walls of a corridor. A hovercraft is endowed with natural stabilization in pitch and roll while keeping two translational degrees of freedom (X and Y) and one rotational degree of freedom (yaw Ψ). We show the feasibility of an OF regulator that maintains the lateral Optic Flow (OF) on one wall equal to an OF set-point. The OF sensors used are Elementary Motion Detectors (EMDs), whose operation was directly inspired by the housefly's motion-detecting neurons. The properties of these neurons were previously analysed at our laboratory by performing electrophysiological recordings while applying optical microstimuli to single photoreceptor cells of the compound eye. The simulation results show that, depending on the OF set-point, the hovercraft either centres along the midline of the corridor or follows one of the two walls, even with a local lack of optical texture on one wall, such as that caused, for instance, by an open door or a T-junction. All these navigational tasks are performed with one and the same feedback loop, which consists of a lateral OF regulation loop that permits relatively high-speed navigation (1 m/s, i.e. 3 body lengths per second). The passive visual sensors and the simple processing system are suitable for use with MAVs with an avionic payload of only a few grams. The goal is to achieve MAV automatic guidance or to relieve a remote operator from guiding it in challenging environments such as urban canyons or indoor environments.
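    The single OF feedback loop described above lends itself to a compact illustration. The sketch below simulates a lateral OF regulator of this kind in discrete time; the gain, set-points, corridor geometry, and all function names are illustrative assumptions, not values from the paper.

    ```python
    # Minimal sketch of a LORA-style lateral optic-flow (OF) regulator.
    # All gains, set-points, and the corridor geometry are illustrative
    # assumptions, not values from the paper.

    def lateral_of(speed, distance):
        """Translational OF (rad/s) generated by a wall at `distance`
        while moving forward at `speed`."""
        return speed / distance

    def regulate_lateral_position(y, speed, corridor_width, of_setpoint, gain=0.2):
        """One control step: steer so that the larger of the two lateral
        OFs converges to `of_setpoint`. Returns the updated position y."""
        of_left = lateral_of(speed, y)
        of_right = lateral_of(speed, corridor_width - y)
        if of_left >= of_right:
            # Left wall is nearer: a positive error means it is too close,
            # so drift right, away from it.
            y += gain * (of_left - of_setpoint)
        else:
            y -= gain * (of_right - of_setpoint)
        return y

    # A set-point matching the midline OF yields centring behaviour...
    y_centre = 0.3
    for _ in range(200):
        y_centre = regulate_lateral_position(y_centre, speed=1.0,
                                             corridor_width=2.0, of_setpoint=1.0)
    # y_centre settles near 1.0 (the midline)

    # ...while a higher set-point yields wall following at speed / set-point.
    y_wall = 0.3
    for _ in range(200):
        y_wall = regulate_lateral_position(y_wall, speed=1.0,
                                           corridor_width=2.0, of_setpoint=2.0)
    # y_wall settles near 0.5 (following the left wall)
    ```

    Both behaviours fall out of one and the same loop, which is the point the abstract makes: only the OF set-point changes.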

    Evolving Spiking Neurons from Wheels to Wings

    We give an overview of the EPFL indoor flying project, whose goal is to evolve autonomous, adaptive, indoor micro-flyers. Indoor flight is still a challenge because it requires miniaturization, energy efficiency, and smart control. This ongoing project consists of developing an autonomous, vision-based flying micro-robot, a bio-inspired controller composed of adaptive spiking neurons directly mapped into digital micro-controllers, and a method to evolve such a network without human intervention. This document describes the motivation and methodology used to reach our goal, as well as the results of a number of experiments on vision-based wheeled and flying robots.

    Evolving Vision-based Flying Robots

    We describe a new experimental approach whereby an indoor flying robot evolves the ability to navigate in a textured room using only visual information and neuromorphic control. The architecture of a spiking neural circuit, which is connected to the vision system and to the motors, is genetically encoded and evolved on the physical robot without human intervention. The flying robot consists of a small wireless airship equipped with a linear camera and a set of sensors used to measure its performance. Evolved spiking circuits manage to fly the robot around the room by exploiting a combination of visual features, robot morphology, and interaction dynamics.

    Toward Indoor Flying Robots

    Developing an autonomous research plane for flying in a laboratory space is a challenge that forces one to understand the specific aerodynamic, power, and construction constraints. To obtain very slow flight while maintaining high maneuverability, ultra-light structures and adequate components are required. In this paper we analyze the wing, propeller, and motor characteristics and propose a methodology to optimize the motor/gear/propeller system. The C4 model plane (50 g, 1.5 m/s) demonstrates the feasibility of such a laboratory flying test-bed.

    Optic Flow Based Autopilots: Speed Control and Obstacle Avoidance

    The explicit control schemes presented here explain how insects may navigate on the sole basis of optic flow (OF) cues, without requiring any distance or speed measurements: how they take off and land, follow the terrain, avoid the lateral walls in a corridor, and control their forward speed automatically. The optic flow regulator, a feedback system controlling either the lift, the forward thrust, or the lateral thrust, is described. Three OF regulators account for various insect flight patterns observed over the ground and over still water, under calm and windy conditions, and in straight and tapered corridors. These control schemes were simulated experimentally and/or implemented onboard two types of aerial robots, a micro-helicopter (MH) and a hovercraft (HO), which behaved much like insects when placed in similar environments. These robots were equipped with opto-electronic OF sensors inspired by our electrophysiological findings on houseflies' motion-sensitive visual neurons. The simple, parsimonious control schemes described here require no conventional avionic devices such as range finders, groundspeed sensors, or GPS receivers. They are consistent with the neural repertoire of flying insects and meet the low avionic payload requirements of autonomous micro aerial and space vehicles.
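    As an illustration of the principle, the ventral member of such a trio of regulators (the one governing terrain following and landing) can be sketched as a one-line feedback law. The gain, set-point, speeds, and function names below are assumptions made for the sketch, not values from the paper.

    ```python
    # Minimal sketch of a ventral optic-flow (OF) regulator: adjust height
    # so that the ventral OF (forward speed / height) holds a set-point.
    # Gain, set-point, and speeds are illustrative assumptions.

    def step_height(h, ground_speed, of_setpoint, gain=0.1):
        """One control step. OF above the set-point means the ground is
        too close, so climb; below it, descend. Height is floored to
        avoid a division by zero."""
        of = ground_speed / h
        return max(h + gain * (of - of_setpoint), 0.01)

    # At constant speed the regulated height settles at speed / set-point:
    # terrain following at a constant ventral OF.
    h = 5.0
    for _ in range(300):
        h = step_height(h, ground_speed=2.0, of_setpoint=1.0)
    h_cruise = h  # close to 2.0

    # Gently reducing the forward speed makes the regulated height fall
    # with it: the essence of an OF-based landing, with no height or
    # groundspeed sensor involved.
    for i in range(300):
        h = step_height(h, ground_speed=2.0 - 1.5 * i / 300, of_setpoint=1.0)
    # h has descended to roughly the final speed / set-point
    ```

    The same loop thus produces terrain following at constant speed and a smooth descent when the speed is cut, without ever measuring distance or speed directly.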

    Frequency-Domain Characterization of Optic Flow and Vision-Based Ocellar Sensing for Rotational Motion

    The structure of an animal's eye is determined by the tasks it must perform. While vertebrates rely on their two eyes for all visual functions, insects have evolved a wide range of specialized visual organs to support behaviors such as prey capture, predator evasion, mate pursuit, flight stabilization, and navigation. Compound eyes and ocelli constitute the vision-forming and sensing mechanisms of some flying insects, providing signals useful for flight stabilization and navigation. In contrast to the well-studied compound eye, the ocelli, regarded as a second visual system, sense fast luminance changes and allow for fast visual processing. Using a luminance-based sensor that mimics the insect ocelli and a camera-based motion detection system, we present a frequency-domain characterization of an ocellar sensor and of optic flow due to rotational motion. Inspired by the insect neurons that make use of signals from both vision-sensing mechanisms, the advantages, disadvantages, and complementary properties of ocellar and optic flow estimates are discussed.

    From Insects to Robots: Observe, Reconstruct, Innovate, and Understand Better

    Winged insects have solved difficult problems such as flight stabilization, 3D obstacle avoidance, target pursuit, odometry, landing without a prepared runway, and landing on moving targets, problems on which contemporary autonomous robotics still stumbles. Natural principles, proven over millions of years, can today bring innovative ideas to robotics. We have known for 70 years that winged insects react visually to the relative motion of the ground caused by their own movement [Kennedy, 1939]. Surprisingly, this natural visual cue, more recently named "optic flow" [Gibson, 1950], has not yet pervaded the field of aeronautics, even though the sensors and the processing carried out by an insect's nervous system in the service of its visuomotor behavior are beginning to be clearly identified [Kennedy, 1951; Reichardt, 1969; Hausen, 1984; Pichon et al., 1989; Franceschini et al., 1989; Collett et al., 1993; Srinivasan et al., 1996, 2000; Serres et al., 2008b; Portelli et al., 2010a]. Granting some flight authority to a micro air vehicle is a particularly difficult task, especially during take-off, during landing, or in the presence of wind. Building an aircraft of a few grams or tens of grams equipped with an autopilot therefore calls for an innovative approach. I have thus chosen a bio-inspired approach resolutely focused on winged insects in order to try to solve the problems inherent in take-off, speed control, obstacle avoidance, reaction to wind, and landing, by means of optic flow measurement.

    Insect-Inspired Visual Perception for Flight Control and Collision Avoidance

    Flying robots are increasingly used for tasks such as aerial mapping, fast exploration, video footage, and monitoring of buildings. Autonomous flight at low altitude in cluttered and unknown environments is an active research topic because it poses challenging perception and control problems. Traditional methods for collision-free navigation at low altitude require heavy resources to deal with the complexity of natural environments, which limits the autonomy and the payload of flying robots. Flying insects, however, are able to navigate safely and efficiently using vision as the main sensory modality. Flying insects rely on low-resolution, high-refresh-rate, wide-angle compound eyes to extract angular image motion and move in unstructured environments. These strategies result in systems that are physically and computationally lighter than those often found in high-definition stereovision. Taking inspiration from insects offers great potential for building small flying robots capable of navigating in cluttered environments using lightweight vision sensors. In this thesis, we investigate insect perception of visual motion and insect vision-based flight control in cluttered environments. We use the knowledge gained through the modelling of neural circuits and behavioural experiments to develop flying robots with insect-inspired control strategies for goal-oriented navigation in complex environments. We start by exploring insect perception of visual motion. We present a study that reconciles an apparent contradiction in the literature on insect visual control: current models developed to explain insect flight behaviour rely on the measurement of optic flow, yet the most prominent neural model for visual motion extraction (the Elementary Motion Detector, or EMD) does not measure optic flow. We propose a model for unbiased optic flow estimation that relies on comparing the output of multiple EMDs pointed in varying viewing directions. Our model is of interest to both engineers and biologists because it is computationally more efficient than other optic flow estimation algorithms, and because it represents a biologically plausible model for optic flow extraction in insect neural systems. We then focus on insect flight control strategies in the presence of obstacles. By recording the trajectories of bumblebees (Bombus terrestris) and comparing them to simulated flights, we show that bumblebees rely primarily on the frontal part of their field of view, and that they pool optic flow in two different manners for the control of flight speed and of lateral position. For the control of lateral position, our results suggest that bumblebees selectively react to the portions of the visual field where optic flow is highest, which correspond to the closest obstacles. Finally, we tackle goal-oriented navigation with a novel algorithm that combines aspects of insect perception and flight control presented in this thesis, such as the detection of the fastest-moving objects in the frontal visual field, with other aspects of insect flight known from the literature, such as the saccadic flight pattern. Through simulations, we demonstrate autonomous navigation in forest-like environments using only local optic flow information, assuming knowledge of the direction to the navigation goal.
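    The EMD discussed in the abstract is the Hassenstein-Reichardt correlator; its direction selectivity, and the reason its output is not a clean optic flow measurement, can be shown in a few lines. The sketch below is a generic textbook form of the detector with illustrative time constants and stimuli; it is not the thesis's implementation.

    ```python
    # Minimal sketch of a Hassenstein-Reichardt Elementary Motion Detector
    # (EMD): correlate one photoreceptor's low-pass-filtered (delayed)
    # signal with its neighbour's raw signal, in an opponent arrangement.
    # The sign of the output encodes direction, but its magnitude also
    # depends on contrast and spatial frequency, which is why a single
    # EMD is not a true optic flow estimator. Time constants and stimuli
    # below are illustrative.
    import math

    def emd_response(signal_a, signal_b, dt=0.01, tau=0.05):
        """Mean opponent EMD output for two photoreceptor time series."""
        alpha = dt / (tau + dt)          # first-order low-pass coefficient
        lp_a = lp_b = 0.0
        total = 0.0
        for a, b in zip(signal_a, signal_b):
            lp_a += alpha * (a - lp_a)   # delayed copy of receptor A
            lp_b += alpha * (b - lp_b)   # delayed copy of receptor B
            total += lp_a * b - lp_b * a # half-detector difference
        return total / len(signal_a)

    # A grating drifting from A toward B (B lags A) gives a positive
    # response; the opposite direction gives a negative one.
    t = [i * 0.01 for i in range(1000)]
    phase = 0.5                          # angular receptor spacing, assumed
    a = [math.sin(4 * x) for x in t]
    rightward = emd_response(a, [math.sin(4 * x - phase) for x in t])
    leftward = emd_response(a, [math.sin(4 * x + phase) for x in t])
    ```

    Because the output is a product of filtered luminance signals, doubling the stimulus contrast quadruples the response at unchanged velocity; removing such biases by combining several detectors is the role of the unbiased estimator the thesis proposes.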