139 research outputs found

    Biologically Inspired Visual Control of Flying Robots

    Get PDF
    Insects possess an incredible ability to navigate their environment at high speed, despite having small brains and limited visual acuity. Through selective pressure they have evolved computationally efficient means for simultaneously performing navigation tasks and instantaneous control responses. The insect's main source of information is visual, and through a hierarchy of processes this information is used for perception: at the lowest level are local neurons for detecting image motion and edges; at higher levels are interneurons that spatially integrate the output of previous stages. These higher-level processes can be considered models of the insect's environment, reducing the amount of information to only that which evolution has determined relevant. The scope of this thesis is experimenting with biologically inspired visual control of flying robots through information processing, models of the environment, and flight behaviour. In order to test these ideas I developed a custom quadrotor robot and experimental platform, the 'wasp' system. All algorithms ran on the robot, in real-time or better, and hypotheses were always verified with flight experiments. I developed a new optical flow algorithm that is computationally efficient and can be applied in a regular pattern across the image. This technique is used later in my work when considering patterns in the image motion field. Using optical flow in the log-polar coordinate system I developed attitude estimation and time-to-contact algorithms. I find that the log-polar domain is useful for analysing global image motion, and in many ways equivalent to the retinotopic arrangement of neurons in the optic lobe of insects, which is used for the same task. I investigated the role of depth in insect flight using two experiments. In the first experiment, to study how concurrent visual control processes might be combined, I developed a control system using the combined output of two algorithms. The first algorithm was a wide-field optical flow balance strategy and the second an obstacle avoidance strategy which used inertial information to estimate the depth to objects in the environment - objects whose depth was significantly different from their surroundings. In the second experiment I created an altitude control system which used a model of the environment in the Hough space, and a biologically inspired sampling strategy, to efficiently detect the ground. Both control systems were used to control the flight of a quadrotor in an indoor environment. The methods that insects use to perceive edges and control their flight in response had not been applied to artificial systems before. I developed a quadrotor control system that used the distribution of edges in the environment to regulate the robot's height and avoid obstacles. I also developed a model that predicted the distribution of edges in a static scene, and using this prediction was able to estimate the quadrotor's altitude.
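    To make the time-to-contact idea concrete: in log-polar coordinates, pure expansion of the image (approach towards a frontal surface) becomes a uniform shift along the log-radial axis, and the size of that shift is the inverse of the time to contact. The sketch below illustrates only this relation; the array layout and the use of a median are illustrative assumptions, not the thesis' 'wasp' implementation.

```python
# Illustrative sketch: time-to-contact from optic flow in log-polar coordinates.
# Assumes a flow field `flow_logpolar` of shape (H, W, 2) whose first channel
# is the component along the log-radial axis xi = log(r), in log-units per
# second. Under pure expansion r_dot = r / TTC, so xi_dot = r_dot / r = 1 / TTC
# is constant across the image.
import numpy as np

def time_to_contact(flow_logpolar: np.ndarray) -> float:
    """Estimate time-to-contact (seconds) from log-polar optic flow."""
    xi_dot = flow_logpolar[..., 0]          # log-radial flow component
    divergence = float(np.median(xi_dot))   # robust estimate of the uniform shift
    if divergence <= 0.0:                   # receding or lateral motion: no contact
        return float("inf")
    return 1.0 / divergence
```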

    Insect inspired visual motion sensing and flying robots

    Get PDF
    Flying insects have mastered visual motion sensing, using dedicated motion-processing circuits at low energy and computational cost. Building on observations of insect visual guidance, we have developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed via a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes driving the eye-in-robot angular position and the robot's body angular position with respect to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
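    As a concrete illustration of the kind of local motion sensing described here (though not the authors' actual sensor), the classic insect-inspired circuit is the Hassenstein-Reichardt elementary motion detector: two neighbouring photoreceptor signals are each delayed by a low-pass filter and cross-multiplied, and the difference of the two branches gives a direction-selective response. A minimal sketch, with an assumed time constant:

```python
# Minimal Hassenstein-Reichardt elementary motion detector (EMD) sketch.
# `left` and `right` are time series from two neighbouring photoreceptors,
# sampled every `dt` seconds; `tau` is an assumed low-pass time constant.
import numpy as np

def emd_response(left, right, dt, tau=0.04):
    """Correlation-type EMD output; positive for preferred-direction motion."""
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    alpha = dt / (tau + dt)                    # first-order low-pass coefficient
    d_left = np.zeros_like(left)               # delayed (filtered) copies
    d_right = np.zeros_like(right)
    for t in range(1, len(left)):
        d_left[t] = d_left[t - 1] + alpha * (left[t] - d_left[t - 1])
        d_right[t] = d_right[t - 1] + alpha * (right[t] - d_right[t - 1])
    return d_left * right - d_right * left     # opponent correlation
```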

    Taking Inspiration from Flying Insects to Navigate inside Buildings

    Get PDF
    These days, flying insects are seen as genuinely agile micro air vehicles fitted with smart sensors and parsimonious in their use of brain resources. They are able to navigate visually in unpredictable and GPS-denied environments. Understanding how such tiny animals work would help engineers to address several issues relating to drone miniaturization and navigation inside buildings. To turn a drone of ~1 kg into a robot, miniaturized conventional avionics can be employed; however, this results in a loss of flight autonomy. On the other hand, turning a drone with a mass between ~1 g (or less) and ~500 g into a robot requires an innovative approach that takes inspiration from flying insects, both with regard to their flapping-wing propulsion system and to their sensory system, based mainly on motion vision, in order to avoid obstacles in three dimensions or to navigate on the basis of visual cues. This chapter provides a snapshot of the current state of the art in the field of bio-inspired optic flow sensors and optic flow-based direct feedback loops applied to micro air vehicles flying inside buildings.
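    One widely cited example of such a direct feedback loop (offered here as a hedged sketch, not as a specific autopilot from the chapter) is a ventral optic flow regulator: the flow seen looking straight down equals ground speed divided by height, so holding it at a setpoint by commanding climb rate couples altitude to speed automatically. The setpoint and gain below are illustrative values.

```python
# Hypothetical ventral optic flow regulator (proportional form).
OMEGA_SET = 2.0   # desired ventral optic flow (rad/s); illustrative value
K_P = 0.5         # gain, (m/s) of climb command per (rad/s) of flow error

def climb_rate_command(measured_ventral_flow: float) -> float:
    """Flow too high means the ground is too close for the current speed: climb."""
    return K_P * (measured_ventral_flow - OMEGA_SET)
```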

    Aerial Vehicles

    Get PDF
    This book contains 35 chapters written by experts in developing techniques for making aerial vehicles more intelligent, more reliable, more flexible in use, and safer in operation. It will also serve as an inspiration for further improvement of the design and application of aerial vehicles. The advanced techniques and research described here may also be applicable to other high-tech areas such as robotics, avionics, vetronics, and space.

    Bio-Inspired Hovering Control for an Aerial Robot Equipped with a Decoupled Eye and a Rate Gyro

    No full text
    This work provides a hovering control strategy for a sighted robot whose eye is decoupled from the body and controlled by means of a tiny rotary piezo motor. The main purpose of this paper is to show the effectiveness and efficiency of this fundamental bio-inspired mechanical decoupling. Indeed, it exhibits several benefits: it enables the robot's gaze to be stabilized on the basis of three bio-inspired oculomotor reflexes (ORs), namely a visual fixation reflex (VFR) and translational and rotational vestibulo-ocular reflexes (tVOR and rVOR); the eye can compensate more quickly and accurately for sudden, untoward disturbances caused by the vagaries of the supporting head or body; it yields a reference visual signal that can be used to unbias the rate gyro used to implement the VORs and to stabilize the hovering robot; and it increases tracking accuracy with moving targets compared to the case without ORs. This paper also shows that lateral disturbances are rejected twice as fast with the decoupled-eye robot, and that roll perturbations induce a retinal error 20 times smaller. The oculomotor reflexes cancel retinal error 6 times faster, with retinal error peaks 5 times lower. The conclusion of the paper is that a decoupled eye must be considered an efficient solution for autonomous flight.
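    The gist of these oculomotor reflexes can be sketched as a single rate command for the decoupled eye: a visual fixation term that drives the retinal error to zero, plus a vestibulo-ocular term that counter-rotates the eye against the body rotation measured by the rate gyro. The gain and the one-degree-of-freedom geometry below are assumptions for illustration, not the controller of the paper.

```python
# Hypothetical 1-DoF gaze stabilization sketch combining a visual fixation
# reflex (VFR) with a rotational vestibulo-ocular reflex (rVOR).
K_VFR = 3.0  # fixation gain (1/s); illustrative value

def eye_rate_command(retinal_error_rad: float, body_rate_gyro_rad_s: float) -> float:
    """Angular rate command (rad/s) for the eye actuator."""
    vfr = K_VFR * retinal_error_rad    # pull the target back onto the fovea
    rvor = -body_rate_gyro_rad_s       # cancel measured body rotation
    return vfr + rvor
```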

    Feature Papers of Drones - Volume I

    Get PDF
    The present book is divided into two volumes (Volume I: articles 1–23, and Volume II: articles 24–54), which compile the articles and communications submitted to the Topical Collection "Feature Papers of Drones" during the years 2020 to 2022, describing novel or cutting-edge designs, developments, and/or applications of unmanned vehicles (drones). Articles 1–8 are devoted to developments in drone design, where new concepts and modeling strategies as well as effective designs that improve drone stability and autonomy are introduced. Articles 9–16 focus on the communication aspects of drones, as effective strategies for smooth deployment and efficient functioning are required; therefore, several developments that aim to optimize performance and security are presented. In this regard, one of the most directly related topics is drone swarms, not only in terms of communication but also human-swarm interaction and their applications for science missions, surveillance, and disaster rescue operations. Concluding Volume I on drone improvements, articles 17–23 discuss the advancements associated with autonomous navigation, obstacle avoidance, and enhanced flight planning.

    Optic Flow for Obstacle Avoidance and Navigation: A Practical Approach

    Full text link
    This thesis offers contributions and innovations to the development of vision-based autonomous flight control systems for small unmanned aerial vehicles operating in cluttered urban environments. Although many optic flow algorithms have been reported, almost none have addressed the critical issue of accuracy and reliability over a wide dynamic range of optic flow. My aim is to rigorously develop improved optic flow sensing to meet realistic mission requirements for autonomous navigation and collision avoidance. A review of related work enabled the development of a new hybrid optic flow algorithm concept combining the best properties of image correlation and interpolation with additional innovations to enhance accuracy, computational speed and reliability. Key analytical work yielded a methodology for determining optic flow dynamic range requirements from system and sensor design parameters, and a technique enabling a video sensor to operate as a passive ranging system for closed-loop flight control. Detailed testing led to the development of the hybrid image interpolation algorithm (HI2A), which uses improved correlation search strategies, sparse images to reduce processing loads, a solution tracking loop to bypass the more intensive initial estimation process, a frame look-back method to improve accuracy at low optic flow, a modified interpolation technique to improve robustness, and an extensive error checking system for validating outputs. A realistic simulation system was developed incorporating independent, precision ground truthing to assess algorithm accuracy. Comparison testing of the HI2A against the commonly used Lucas-Kanade algorithm demonstrates a major improvement in accuracy over a greatly expanded dynamic range. A reactive flight controller using ranging data from a monocular, forward-looking video sensor and rules-based logic was developed and tested in Monte Carlo simulations of a hundred flights. At higher flight speeds than reported in similar tests, collision-free results were obtained in a realistic urban canyon environment. On a common PC, the HI2A algorithm and flight controller software ran up to eight times faster than real time for outputs of 250 measurements at 50 Hz. The feasibility of real-time terrain mapping was demonstrated using 3D ranging data from optic flow in an overflight of the urban simulation environment, indicating its potential for use in path-planning approaches to navigation and collision avoidance.
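    The passive ranging relation invoked above is standard: for a camera translating at speed V, a static feature at bearing theta from the velocity vector and range R produces a translational optic flow of V*sin(theta)/R, so range follows from the flow once the rotation-induced component has been removed with gyro data. A minimal sketch of that relation (not the HI2A implementation itself):

```python
# Range from derotated optic flow: R = V * sin(theta) / omega.
import math

def range_from_flow(speed_mps: float, bearing_rad: float, derotated_flow_rad_s: float) -> float:
    """Range (m) to a static feature from its derotated optic flow magnitude."""
    if derotated_flow_rad_s <= 1e-6:
        return math.inf                 # negligible flow: feature effectively at infinity
    return speed_mps * math.sin(bearing_rad) / derotated_flow_rad_s
```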

    Bio-Inspired Information Extraction In 3-D Environments Using Wide-Field Integration Of Optic Flow

    Get PDF
    A control-theoretic framework is introduced to analyze an information extraction approach from patterns of optic flow, based on analogues to wide-field motion-sensitive interneurons in the insect visuomotor system. An algebraic model of optic flow is developed, based on a parameterization of simple 3-D environments. It is shown that estimates of proximity and speed, relative to these environments, can be extracted using weighted summations of the instantaneous patterns of optic flow. Small-perturbation techniques are used to link weighting patterns to outputs, which are applied as feedback to facilitate stability augmentation and perform local obstacle avoidance and terrain following. Weighting patterns that provide direct linear mappings between the sensor array and actuator commands can be derived by casting the problem as a combined static state estimation and linear feedback control problem. Additive noise and environment uncertainties are incorporated into an offline procedure for determining optimal weighting patterns. Several applications of the method are provided, with differing spatial measurement domains. Non-linear stability analysis and an experimental demonstration are presented for a wheeled robot measuring optic flow in a planar ring. Local stability analysis and simulation are used to show robustness over a range of urban-like environments for a fixed-wing UAV measuring in orthogonal rings and a micro helicopter measuring over the full spherical viewing arena. Finally, the framework is used to analyze insect tangential cells with respect to the information they encode and to demonstrate how cell outputs can be appropriately amplified and combined to generate motor commands to achieve reflexive navigation behavior.
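    The core operation, a weighted summation of the instantaneous optic flow pattern, can be sketched as an inner product of a ring of flow measurements with fixed weighting functions over the viewing angle. The cosine/sine patterns below are illustrative stand-ins for the derived optimal weightings, chosen only because they respond to lateral proximity asymmetry and fore-aft speed in a planar ring of sensors.

```python
# Minimal wide-field integration sketch: y_j = (1/N) * sum_i F_j(gamma_i) * flow(gamma_i).
import numpy as np

N = 60                                           # flow measurements around the ring
gamma = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
patterns = {
    "lateral_offset": np.cos(gamma),             # assumed weighting function
    "fore_aft_speed": np.sin(gamma),             # assumed weighting function
}

def wide_field_outputs(flow_ring: np.ndarray) -> dict:
    """Weighted summations of a planar ring of optic flow measurements."""
    return {name: float(np.dot(w, flow_ring)) / N for name, w in patterns.items()}
```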

    Combining omnidirectional vision with polarization vision for robot navigation

    Get PDF
    Polarization is the phenomenon that describes the orientation of the oscillations of light waves, which are restricted in direction. Polarized light has multiple uses in the animal kingdom, ranging from foraging, defense, and communication to orientation and navigation. Chapter (1) briefly covers some important aspects of polarization and explains our research problem. We aim to use a polarimetric-catadioptric sensor, since many applications in computer vision and robotics can benefit from such a combination, especially robot orientation (attitude estimation) and navigation applications. Chapter (2) mainly covers the state of the art of vision-based attitude estimation. As unpolarized sunlight enters the Earth's atmosphere, it is Rayleigh-scattered by air and becomes partially linearly polarized. This skylight polarization provides a significant clue to understanding the environment: its state conveys the information needed to obtain the sun's orientation. Robot navigation, sensor planning, and many other applications may benefit from this navigation clue. Chapter (3) covers the state of the art in capturing skylight polarization patterns using omnidirectional sensors (e.g. fisheye and catadioptric sensors). It also explains the characteristics of skylight polarization and gives a new theoretical derivation of the skylight angle of polarization pattern. Our aim is to obtain an omnidirectional 360° view combined with polarization characteristics. Hence, this work is based on catadioptric sensors, which are composed of reflective surfaces and lenses. Usually the reflective surface is metallic, and hence the incident skylight polarization state, which is mostly partially linearly polarized, becomes elliptically polarized after reflection. Given the measured reflected polarization state, we want to obtain the incident polarization state. Chapter (4) proposes a method to measure the light polarization parameters using a catadioptric sensor; the possibility of measuring the incident Stokes vector is proved given three of the four reflected Stokes components. Once the incident polarization patterns are available, the solar angles can be directly estimated from these patterns. Chapter (5) discusses polarization-based robot orientation and navigation and proposes new algorithms to estimate these solar angles; to the best of our knowledge, this work is the first to estimate the sun zenith angle from these incident polarization patterns. We also propose to estimate a vehicle's orientation from these polarization patterns. Finally, the work is concluded and possible future research directions are discussed in chapter (6). More examples of skylight polarization patterns, their calibration, and the proposed applications are given in appendix (B). Our work may pave the way from the conventional polarization vision world to the omnidirectional one. It enables bio-inspired robot orientation and navigation applications and possible outdoor localization based on skylight polarization patterns, where the solar angles at a given date and time of day may be used to infer the vehicle's current geographical location.
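    The Stokes-vector relations that underpin this kind of measurement are textbook polarimetry: from (S0, S1, S2) the degree of linear polarization and the angle of polarization follow directly. The sketch below states only those standard relations, not the thesis' catadioptric measurement procedure.

```python
# Degree and angle of linear polarization from the first three Stokes parameters.
import math

def dolp_aop(s0: float, s1: float, s2: float) -> tuple:
    """Return (degree of linear polarization in [0, 1], angle of polarization in rad)."""
    dolp = math.sqrt(s1 * s1 + s2 * s2) / s0
    aop = 0.5 * math.atan2(s2, s1)      # orientation of the polarization plane
    return dolp, aop
```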