
    A 10-gram Vision-based Flying Robot

    We aim to develop ultralight autonomous microflyers capable of flying freely within houses or small built environments while avoiding collisions. Our latest prototype is a fixed-wing aircraft weighing a mere 10 g, flying at around 1.5 m/s and carrying the electronics necessary for airspeed regulation and lateral collision avoidance. This microflyer is equipped with two tiny camera modules, two rate gyroscopes, an anemometer, a small microcontroller, and a Bluetooth radio module. In-flight tests are carried out in a new experimentation room specifically designed for easy changing of the surrounding textures.
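
    As a hedged illustration of the airspeed-regulation loop mentioned above: a minimal proportional controller mapping the anemometer reading to a throttle command. The 1.5 m/s set-point matches the flight speed in the abstract, but the gain and throttle trim are invented values, not from the paper.

```python
# Hypothetical proportional airspeed regulator of the kind such a
# microflyer might run; the gain and throttle trim are invented values.
def airspeed_controller(measured_airspeed_mps, setpoint_mps=1.5,
                        kp=0.8, throttle_trim=0.5):
    """Map the anemometer reading to a throttle command in [0, 1]."""
    error = setpoint_mps - measured_airspeed_mps
    return min(1.0, max(0.0, throttle_trim + kp * error))
```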

    Fast, Accurate Thin-Structure Obstacle Detection for Autonomous Mobile Robots

    Safety is paramount for mobile robotic platforms such as self-driving cars and unmanned aerial vehicles. This work is devoted to a task that is indispensable for safety yet has been largely overlooked in the past: detecting obstacles with very thin structures, such as wires, cables and tree branches. This is a challenging problem, as thin objects can be problematic for active sensors such as lidar and sonar, and even for stereo cameras. In this work, we propose to use video sequences for thin obstacle detection. We represent obstacles with edges in the video frames and reconstruct them in 3D using efficient edge-based visual odometry techniques. We provide both a monocular camera solution and a stereo camera solution. The former incorporates Inertial Measurement Unit (IMU) data to resolve scale ambiguity, while the latter enjoys a novel, purely vision-based solution. Experiments demonstrate that the proposed methods are fast and able to detect thin obstacles robustly and accurately under various conditions. Comment: Appeared at the IEEE CVPR 2017 Workshop on Embedded Vision.
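
    For the stereo solution, the core geometric step can be sketched as an illustrative depth-from-disparity computation; this is not the paper's implementation, and the focal length, baseline, and distance threshold are made-up values.

```python
# Depth of a matched edge pixel from a calibrated stereo pair,
# then a nearby-obstacle test. All constants are invented.
def edge_depth(x_left, x_right, focal_px=600.0, baseline_m=0.12):
    """Depth of a matched edge pixel: Z = f * B / disparity."""
    disparity = x_left - x_right
    if disparity <= 0:
        return float("inf")  # point at infinity, or a bad edge match
    return focal_px * baseline_m / disparity

def is_close_obstacle(x_left, x_right, threshold_m=3.0):
    """Treat a reconstructed edge point as a nearby (thin) obstacle."""
    return edge_depth(x_left, x_right) < threshold_m
```

For example, a wire edge seen at column 320 in the left image and 290 in the right image has disparity 30 px and depth 600 * 0.12 / 30 = 2.4 m, inside the 3 m threshold.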

    3D Navigation With An Insect-Inspired Autopilot

    ISBN: 978-2-9532965-0-1. Using computer-simulation experiments, we developed a vision-based autopilot that enables a 'simulated bee' to travel along a tunnel by controlling both its speed and its clearance from the right wall, the left wall, the ground, and the ceiling. The flying agent can translate along three directions (surge, sway, and heave): the agent is therefore fully actuated. The visuo-motor control system, called ALIS (AutopiLot using an Insect-based vision System), is a dual OF regulator consisting of two interdependent feedback loops, each of which has its own OF set-point. The experiments show that the simulated bee navigates safely along a straight tunnel while reacting sensibly to the major OF perturbation caused by the presence of a tapered tunnel. The visual system is minimalistic (only eight pixels), yet it suffices to control the clearance from the four walls and the forward speed jointly, without the need to measure any speeds or distances. The OF sensors and the simple visuo-motor control system developed here are suitable for use on MAVs with avionic payloads as small as a few grams. Moreover, the ALIS autopilot accounts remarkably well for the quantitative results of ethological experiments performed on honeybees flying freely in straight or tapered corridors.
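
    The dual OF regulator idea can be sketched in a heavily simplified 2-D form: optic flow from a wall scales as forward speed divided by clearance, one loop holds the sum of the two lateral OFs at a set-point by adjusting speed, and a second loop holds the larger OF at its own set-point by steering. The geometry, gains, and set-points below are invented for illustration and are much cruder than ALIS itself.

```python
# One control step for an agent at lateral position y (clearance to the
# left wall) in a tunnel of the given width. All constants are invented.
def of_step(speed, y, width=2.0,
            of_sum_sp=4.0, of_side_sp=2.5, k_v=0.1, k_y=0.05):
    of_left = speed / y              # lateral OF ~ speed / wall clearance
    of_right = speed / (width - y)
    # speed loop: hold the sum of the two lateral OFs at its set-point
    speed += k_v * (of_sum_sp - (of_left + of_right))
    # positioning loop: hold the larger OF at its set-point by adjusting
    # the clearance to the nearer wall
    if of_left >= of_right:
        y += k_y * (of_left - of_side_sp)
    else:
        y -= k_y * (of_right - of_side_sp)
    return speed, y
```

Iterating this step from, e.g., speed 1.0 and left clearance 0.5 in a 2 m wide tunnel drives this toy agent to the joint equilibrium (speed 1.875, clearance 0.75) where both set-points are met, with no speed or distance ever measured explicitly.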

    Obstacle Detection and Avoidance System Based on Monocular Camera and Size Expansion Algorithm for UAVs

    One of the most challenging problems in the domain of autonomous aerial vehicles is the design of a robust real-time obstacle detection and avoidance system. The problem is especially complex for micro and small aerial vehicles because of their Size, Weight and Power (SWaP) constraints. Lightweight sensors such as digital cameras are therefore the best choice compared with sensors such as laser or radar. For real-time applications, many existing works rely on stereo cameras to obtain a 3D model of the obstacles or to estimate their depth. Instead, in this paper, a method that mimics the human behavior of judging the collision state of approaching obstacles using a monocular camera is proposed. The key of the proposed algorithm is to analyze the size changes of the detected feature points, combined with the expansion ratios of the convex hull constructed around the detected feature points from consecutive frames. During Unmanned Aerial Vehicle (UAV) motion, the detection algorithm estimates the change in the apparent size of approaching obstacles. First, the method detects the feature points of the obstacles and extracts those obstacles that are likely to be approaching the UAV. Secondly, by comparing the area ratio of the obstacle and the position of the UAV, the method decides whether the detected obstacle may cause a collision. Finally, by estimating the obstacle's 2D position in the image and combining it with the tracked waypoints, the UAV performs the avoidance maneuver. The proposed algorithm was evaluated in real indoor and outdoor flights, and the obtained results show its accuracy compared with other related works. Research supported by the Spanish Government through the CICYT project ADAS ROAD-EYE (TRA2013-48314-C3-1-R).
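
    The convex-hull expansion test described above can be sketched as follows: build the convex hull around the tracked feature points in two consecutive frames and compare the enclosed areas. The ratio threshold is an invented placeholder, and the per-feature size-change cue the paper combines with this test is omitted here.

```python
# Convex-hull expansion-ratio test; the 1.2 threshold is invented.
def _cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and _cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and _cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def hull_area(points):
    """Area of the convex hull via the shoelace formula."""
    h = convex_hull(points)
    n = len(h)
    return 0.5 * abs(sum(h[i][0] * h[(i + 1) % n][1] -
                         h[(i + 1) % n][0] * h[i][1] for i in range(n)))

def is_approaching(prev_points, curr_points, ratio_threshold=1.2):
    """Flag an obstacle whose tracked hull grows too fast between frames."""
    return hull_area(curr_points) / hull_area(prev_points) > ratio_threshold
```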

    Detecting and avoiding frontal obstacles from monocular camera for micro unmanned aerial vehicles

    In the literature, several approaches try to make UAVs fly autonomously, e.g., by extracting perspective cues such as straight lines. However, such cues are only available in well-defined man-made environments, and many other cues require sufficient texture information. Our main target is to detect and avoid frontal obstacles from a monocular camera on an AR.Drone 2 quadrotor by exploiting optical flow as motion parallax; the drone is permitted to fly at a speed of 1 m/s and an altitude ranging from 1 to 4 m above ground level. In general, detecting and avoiding frontal obstacles is quite challenging because optical flow has limitations that must be taken into account, e.g., lighting conditions and the aperture problem.
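
    One common way to turn optical flow into a frontal-collision cue, in the spirit of (but much simpler than) the method above, is time-to-contact: tau = r / (dr/dt), the image distance of a feature from the focus of expansion divided by its outward flow rate. The 2 s threshold below is an illustrative assumption.

```python
# Time-to-contact from expanding optical flow; threshold is invented.
def time_to_contact(radial_dist_px, radial_flow_px_per_s):
    """Seconds until contact implied by one expanding feature."""
    if radial_flow_px_per_s <= 0:
        return float("inf")  # not expanding, hence not approaching
    return radial_dist_px / radial_flow_px_per_s

def frontal_obstacle(flows, tau_threshold_s=2.0):
    """flows: (distance-from-FoE, outward flow) pairs near the image centre."""
    return min(time_to_contact(r, dr) for r, dr in flows) < tau_threshold_s
```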

    Obstacle avoidance based-visual navigation for micro aerial vehicles

    This paper describes an obstacle avoidance system for low-cost Unmanned Aerial Vehicles (UAVs) using vision, through the monocular onboard camera, as the principal source of information. To detect obstacles, the proposed system compares the image obtained in real time from the UAV with a database of obstacles that must be avoided. Our proposal includes the feature point detector Speeded Up Robust Features (SURF) for fast obstacle detection and a control law to avoid the detected obstacles. Furthermore, our research includes a path recovery algorithm. Our method is attractive for compact MAVs on which additional sensors cannot be mounted. The system was tested in real time on a Micro Aerial Vehicle (MAV) to detect and avoid obstacles in an unknown controlled environment, and we compared our approach with related works.
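
    The database lookup can be sketched independently of the particular detector (SURF itself is not reimplemented here): descriptors are treated as plain vectors and matched by nearest neighbour with a ratio test. The descriptors, ratio, and match count below are invented values.

```python
# Nearest-neighbour descriptor matching with a ratio test; all numbers
# here are invented stand-ins for real SURF descriptors.
def l2(a, b):
    """Euclidean distance between two descriptor vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def match_descriptor(desc, database, ratio=0.75):
    """Index of the best database match passing the ratio test, else None."""
    dists = sorted((l2(desc, d), i) for i, d in enumerate(database))
    if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
        return dists[0][1]
    return None

def obstacle_detected(frame_descs, database, min_matches=10):
    """Declare a database obstacle present if enough descriptors match."""
    matches = [m for d in frame_descs
               if (m := match_descriptor(d, database)) is not None]
    return len(matches) >= min_matches
```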

    A 3D insect-inspired visual autopilot for corridor-following

    Motivated by the results of behavioral studies performed on bees over the last two decades, we have attempted to decipher the logic behind the bee's autopilot, with specific reference to their use of optic flow (OF). Using computer-simulation experiments, we developed a vision-based autopilot that enables a 'simulated bee' to travel along a tunnel by controlling both its speed and its clearance from the walls, the ground, and the ceiling. The flying agent is fully actuated and can translate along three directions: surge, sway, and heave. The visuo-motor control system, called ALIS (AutopiLot using an Insect-based vision System), is a dual OF regulator consisting of intertwined feedback loops, each of which has its own OF set-point. The experiments show that the simulated bee navigates safely along a straight or tapered tunnel and reacts sensibly to major OF perturbations caused, e.g., by the lack of texture on one wall or by the presence of a tapered tunnel. The agent is equipped with a minimalistic visual system (comprising only eight pixels) that suffices to control the clearance from the four walls and the forward speed jointly, without the need to measure any speeds or distances. The OF sensors and the simple visuo-motor control system developed here are suitable for use on MAVs with avionic payloads as small as a few grams. Moreover, the ALIS autopilot accounts remarkably well for the quantitative results of ethological experiments performed on honeybees flying freely in straight or tapered corridors.
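
    A single local motion sensor of the kind such an eight-pixel eye relies on can be sketched with the time-of-travel principle: the optic flow magnitude is the inter-receptor angle divided by the delay with which a contrast edge crosses two neighbouring photoreceptors. The 4 degree angle below is an illustrative assumption, not the paper's value.

```python
# Time-of-travel optic flow sensor; inter-receptor angle is invented.
def optic_flow_deg_per_s(t_pixel_a, t_pixel_b, inter_receptor_angle_deg=4.0):
    """Edge crosses pixel A at t_pixel_a seconds and pixel B at t_pixel_b."""
    dt = t_pixel_b - t_pixel_a
    if dt <= 0:
        raise ValueError("edge must cross pixel A before pixel B")
    return inter_receptor_angle_deg / dt
```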

    Flying over the reality gap: From simulated to real indoor airships

    Because of their ability to naturally float in the air, indoor airships (often called blimps) constitute an appealing platform for research in aerial robotics. However, when confronted with long-lasting experiments such as those involving learning or evolutionary techniques, blimps have the disadvantage that they cannot be linked to external power sources and tend to have little mechanical resistance due to their low weight budget. One solution to this problem is to use a realistic flight simulator, which can also significantly reduce experimental duration by running faster than real time. This requires an efficient physics-based dynamic model and a parameter identification procedure, which are complicated to develop and usually rely on costly facilities such as wind tunnels. In this paper, we present a simple and efficient physics-based dynamic model of indoor airships, including a pragmatic methodology for parameter identification without the need for complex or costly test facilities. Our approach is tested with an existing blimp in a vision-based navigation task. Neural controllers are evolved in simulation to map visual input into motor commands in order to steer the flying robot forward as fast as possible while avoiding collisions. After evolution, the best individuals are successfully transferred to the physical blimp, which experimentally demonstrates the efficiency of the proposed approach.
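
    The evolutionary part can be caricatured as a toy (1+1) evolution strategy tuning two controller gains against a stub fitness function standing in for the flight simulator. Everything here (the genome, the fitness landscape, the constants) is an invented stand-in for the blimp experiments.

```python
import random

# Toy (1+1) evolution strategy; the stub "simulator" just rewards gains
# near the invented optimum (1.0, 0.2).
def fitness(genome):
    """Stub simulator: best controller has gains (1.0, 0.2)."""
    k_forward, k_turn = genome
    return -((k_forward - 1.0) ** 2 + (k_turn - 0.2) ** 2)

def evolve(generations=500, sigma=0.1, seed=0):
    """Hill-climb: mutate the parent, keep the child if it is no worse."""
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    for _ in range(generations):
        child = [g + rng.gauss(0, sigma) for g in best]
        if fitness(child) >= fitness(best):
            best = child
    return best
```

In the paper's setting the fitness would come from the physics-based simulator, and the evolved controller would then be transferred to the physical blimp.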