
    Perceptual aliasing in vision-based robot navigation

    In order to create intelligent robots that react to their environment through computer vision, it is of interest to study how humans and animals receive and process visual information. Flying animals, such as birds and bats, navigate their environment using a visual processing technique called optical flow. The key to making use of optical flow for feedback control is the idea of time-to-transit, a measure of how long it will take an observer to pass an object in its field of view. Using only optical flow data, this time-to-transit (tau) can be calculated without knowing either the distance to the object or the size of the object itself. Tau can be computed in real time and used as input to autonomous vehicle control laws. Vision-based navigation of autonomous robotic vehicles can support applications in both the military and civilian sectors. In this work, a series of feedback control laws for autonomous robot control, whose inputs are the frames of a video sequence in real time, are developed. Two control laws, coined motion primitives, are developed based on tau balancing and tau difference maximizing, and protocol switching logic is established to determine when each should be employed. The tau-balancing law uses information from both the right and left sides of the path environment, when available, and attempts to balance between them. The tau-difference-maximizing primitive, by contrast, aligns the vehicle's motion with features on one side or the other. A third navigation strategy is also implemented, in which the stages of sensing, perceiving, and acting are separated. A simulation environment is developed as a test-bed for studying the effects of changing control law parameters and the decision variables for protocol switches. In some cases, it may appear as though one strategy can be used when the other is actually required. Such situations are referred to as occurrences of perceptual aliasing: the misinterpretation of perceptual cues, leading to the execution of an unsuitable action. Such misunderstanding of the environment can lead to dangerous motions of the vehicle, as would occur when the controller attempts to steer the vehicle between features on the left and right sides of a solid obstacle or wall in the vehicle's path. Without safeguards to prevent this misinterpretation, perceptual aliasing could cause a robot to collide with obstacles in its environment. Perceptual aliasing can occur whenever the most intuitive control strategy will not result in successful navigation. The problem is overcome through studies of human and animal perception, together with a statistical analysis of the structure of optical flow and time-to-transit, to intelligently select which control strategy to implement. These control laws are composed to allow a robot to autonomously navigate a corridor environment with both straight and turning sections.
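    To make the tau idea concrete, here is a minimal Python sketch of computing time-to-transit from tracked image features and of a tau-balancing steering rule; the gain k, the sign convention, and the feature bookkeeping are illustrative assumptions, not the control laws developed in this work.

```python
# Minimal sketch: tau from image data alone, then a tau-balancing
# steering command. Gains and sign conventions are assumed.
import numpy as np

def time_to_transit(x, x_dot, eps=1e-6):
    """Tau = x / x_dot: time until a feature at image coordinate x,
    moving at image velocity x_dot, transits past the camera. Needs
    no depth or object-size information."""
    return x / (x_dot + np.copysign(eps, x_dot))  # guard divide-by-zero

def tau_balancing_steer(tau_left, tau_right, k=1.0):
    """Tau-balancing motion primitive: steer toward the side whose
    features will take longer to pass, equalizing average tau on the
    left and right walls of a corridor."""
    return k * (np.mean(tau_left) - np.mean(tau_right))
```

    The tau-difference-maximizing primitive would instead servo on features from a single side, which is why the switching logic described above must decide which primitive the scene actually calls for.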

    Bioinspired engineering of exploration systems for NASA and DoD

    A new approach called bioinspired engineering of exploration systems (BEES) and its value for solving pressing NASA and DoD needs are described. Insects (for example, honeybees and dragonflies) cope remarkably well with their world, despite possessing brains containing fewer than 0.01% as many neurons as the human brain. Although most insects have immobile eyes with fixed-focus optics and lack stereo vision, they use a number of ingenious, computationally simple strategies for perceiving their world in three dimensions and navigating successfully within it. We are distilling selected insect-inspired strategies to obtain novel solutions for navigation, hazard avoidance, altitude hold, stable flight, terrain following, and gentle deployment of payload. Such functionality offers potential solutions for future autonomous robotic space and planetary explorers. A BEES approach to developing lightweight, low-power autonomous flight systems should be useful for flight control of such biomorphic flyers for both NASA and DoD needs. Recent biological studies of mammalian retinas confirm that representations of multiple features of the visual world are systematically parsed and processed in parallel, with the features mapped to a stack of cellular strata within the retina. Each of these representations can be efficiently modeled in semiconductor cellular nonlinear network (CNN) chips. We describe recent breakthroughs in exploring the feasibility of blending insect navigation strategies with mammalian visual search, pattern recognition, and image understanding in hybrid biomorphic flyers for future planetary and terrestrial applications, and we outline a few future Mars exploration mission scenarios uniquely enabled by these newly developed biomorphic flyers.
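    As a rough illustration of the cellular nonlinear network model mentioned above, the sketch below performs one Euler step of the classic Chua-Yang CNN dynamics on a cell grid; the templates A and B and the bias z are generic placeholders, not the retina-stratum models this abstract refers to.

```python
# Toy discrete-time step of Chua-Yang CNN dynamics: each cell's state
# evolves under 3x3 feedback (A) and feedforward (B) templates.
# Template values below are placeholders, not retinal models.
import numpy as np
from scipy.signal import convolve2d

def cnn_step(x, u, A, B, z, dt=0.1):
    """One Euler step of x' = -x + A*y + B*u + z, where * denotes 2-D
    convolution over each cell's neighborhood and y is the standard
    piecewise-linear CNN output."""
    y = 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))
    dx = (-x + convolve2d(y, A, mode="same")
              + convolve2d(u, B, mode="same") + z)
    return x + dt * dx

# Example: a generic edge-enhancing feedforward template on an image u.
A = np.zeros((3, 3))
B = np.array([[-1.0, -1.0, -1.0],
              [-1.0,  8.0, -1.0],
              [-1.0, -1.0, -1.0]])
```

    Because every cell runs the same local update in parallel, such networks map naturally onto the massively parallel CNN chips described above.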

    Detection and estimation of moving obstacles for a UAV

    In recent years, research interest in Unmanned Aerial Vehicles (UAVs) has grown rapidly because of their potential use in a wide range of applications. In this paper, we propose vision-based detection and position/velocity estimation of a moving obstacle for a UAV. Knowledge of a moving obstacle's state, i.e., its position and velocity, is essential for good performance of an intelligent UAV system, especially in autonomous navigation and landing tasks. The novelties are: (1) the design and implementation of a localization method built on a sensor fusion methodology that fuses Inertial Measurement Unit (IMU) signals with Pozyx positioning signals; (2) the development of a moving-obstacle detection and estimation method based on an on-board vision system. Experimental results validate the effectiveness of the proposed approach. (C) 2019, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.
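    The abstract does not spell out the fusion filter, so the following is only a minimal one-dimensional Kalman-filter sketch of the IMU/Pozyx idea, with IMU acceleration driving the prediction and the Pozyx position fix correcting it; the matrices, time step, and noise levels are all assumed for illustration.

```python
# Hypothetical 1-D IMU/Pozyx fusion: predict with IMU acceleration,
# correct with a Pozyx position measurement. All values are assumed.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity transition
G = np.array([0.5 * dt**2, dt])          # injects IMU acceleration
H = np.array([[1.0, 0.0]])               # Pozyx observes position only
Q = 0.01 * np.eye(2)                     # process noise (assumed)
R = np.array([[0.04]])                   # Pozyx noise, ~20 cm std (assumed)

def fuse_step(x, P, imu_accel, pozyx_pos):
    """One predict/update cycle for state x = [position, velocity]."""
    x = F @ x + G * imu_accel                     # IMU-driven prediction
    P = F @ P @ F.T + Q
    y = np.array([pozyx_pos]) - H @ x             # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    return x + K @ y, (np.eye(2) - K @ H) @ P
```

    A moving obstacle's position and velocity could be tracked with the same filter structure, using the on-board camera detections as the measurement stream.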

    Correlation Flow: Robust Optical Flow Using Kernel Cross-Correlators

    Robust velocity and position estimation is crucial for autonomous robot navigation. Optical-flow-based methods for autonomous navigation have been receiving increasing attention in tandem with the development of micro unmanned aerial vehicles. This paper proposes a kernel cross-correlator (KCC) based algorithm, named correlation flow (CF), that determines optical flow using a monocular camera. Correlation flow provides reliable and accurate velocity estimation and is robust to motion blur. In addition, it can estimate the altitude velocity and yaw rate, which are not available from traditional methods. Autonomous flight tests on a quadcopter show that correlation flow can provide robust trajectory estimation with very low processing power. The source code is released under the ROS framework. Comment: 2018 International Conference on Robotics and Automation (ICRA 2018).
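    For intuition, the sketch below recovers the inter-frame translation with a plain linear-kernel correlation (phase correlation); the paper's kernel cross-correlator generalizes this with nonlinear kernels, so this is a simplified stand-in rather than the released implementation.

```python
# Simplified stand-in for correlation-based flow: estimate the pixel
# translation between two grayscale frames via phase correlation.
import numpy as np

def correlation_shift(prev, curr):
    """Return the (dy, dx) translation of curr relative to prev,
    taken from the peak of the normalized cross-power spectrum."""
    F1, F2 = np.fft.fft2(prev), np.fft.fft2(curr)
    spectrum = np.conj(F1) * F2
    spectrum /= np.abs(spectrum) + 1e-9           # phase-only correlation
    corr = np.real(np.fft.ifft2(spectrum))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev.shape
    if dy > h // 2: dy -= h                       # wrap to signed shifts
    if dx > w // 2: dx -= w
    return dy, dx
```

    Scaling such a pixel shift by altitude over focal length and dividing by the frame interval yields a metric velocity estimate, which is the kind of quantity correlation flow provides for navigation.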