
    Electronic Image Stabilization for Mobile Robotic Vision Systems

    When a camera is affixed to a dynamic mobile robot, image stabilization is the first step towards more complex analysis of the video feed. This thesis presents a novel electronic image stabilization (EIS) algorithm for small, inexpensive, highly dynamic mobile robotic platforms with onboard camera systems. The algorithm combines optical flow motion parameter estimation with angular rate data provided by a strapdown inertial measurement unit (IMU). A discrete Kalman filter in feedforward configuration fuses the two data sources optimally. Performance is evaluated using a simulated video truth model (capturing the effects of image translation, rotation, blurring, and moving objects) and live test data. Live data were collected from a camera and IMU affixed to the DAGSI Whegsℱ mobile robotic platform as it navigated through a hallway. Template matching, feature detection, optical flow, and inertial measurement techniques are compared and analyzed to determine the most suitable algorithm for this type of image stabilization. Pyramidal Lucas-Kanade optical flow on Shi-Tomasi good features, combined with inertial measurement, is found to be the superior EIS algorithm. In the presence of moving objects, fusing in the inertial measurements reduces the root-mean-squared (RMS) error of the optical flow motion parameter estimates by 40%. To date, no other image stabilization algorithm directly fuses optical flow estimation with inertial measurement by way of Kalman filtering.
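
    The fusion idea described here is standard enough to sketch: the gyro rate, integrated over one frame interval, predicts the inter-frame rotation, and the optical flow measurement corrects it through the Kalman gain. Below is a minimal single-axis Python sketch; the function name, scalar state, and noise values are illustrative assumptions, not the thesis's actual feedforward filter.

        # Hypothetical single-axis sketch: fuse a per-frame optical-flow
        # rotation estimate with a gyro-based prediction via a Kalman gain.
        import numpy as np

        def fuse_frame_motion(flow_angles, gyro_rates, dt, q=1e-4, r=1e-2):
            """flow_angles: rotation per frame from optical flow (rad);
            gyro_rates: IMU angular rates (rad/s); dt: frame interval (s);
            q, r: assumed gyro and optical-flow measurement variances."""
            fused = []
            for z_flow, omega in zip(flow_angles, gyro_rates):
                x_pred = omega * dt                 # predict from the gyro
                k = q / (q + r)                     # Kalman gain
                x = x_pred + k * (z_flow - x_pred)  # correct with the flow
                fused.append(x)
            return np.array(fused)

    With constant variances this reduces to a variance-weighted blend of the two sensors, which is why outlier-prone optical flow (e.g. from moving objects) gets pulled toward the gyro prediction.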

    GyroFlow: Gyroscope-Guided Unsupervised Optical Flow Learning

    Existing optical flow methods are error-prone in challenging scenes, such as fog, rain, and night, because basic optical flow assumptions such as brightness and gradient constancy are violated. To address this problem, we present an unsupervised learning approach that fuses gyroscope data into optical flow learning. Specifically, we first convert gyroscope readings into motion fields named gyro fields. Then, we design a self-guided fusion module to fuse the background motion extracted from the gyro field with the optical flow and to guide the network to focus on motion details. To the best of our knowledge, this is the first deep learning-based framework that fuses gyroscope data and image content for optical flow learning. To validate our method, we propose a new dataset that covers both regular and challenging scenes. Experiments show that our method outperforms state-of-the-art methods in both regular and challenging scenes.
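
    The gyro-field construction can be sketched directly: for a purely rotating camera, the gyro reading integrated over the frame interval gives a rotation whose induced pixel displacement is a homography built from the camera intrinsics. The sketch below assumes known intrinsics K and a pure-rotation model; names and sign conventions are illustrative, and the paper's actual preprocessing may differ.

        # Sketch: convert a gyroscope reading into a per-pixel motion field
        # ("gyro field") under a rotation-only camera model.
        import numpy as np
        import cv2

        def gyro_field(omega, dt, K, h, w):
            """omega: (3,) angular velocity (rad/s) in the camera frame;
            K: (3,3) intrinsic matrix; h, w: image size in pixels."""
            R, _ = cv2.Rodrigues(np.asarray(omega, float) * dt)
            H = K @ R @ np.linalg.inv(K)     # rotation-induced homography
            xs, ys = np.meshgrid(np.arange(w), np.arange(h))
            pts = np.stack([xs, ys, np.ones_like(xs)], -1).reshape(-1, 3).T
            warped = H @ pts
            warped = warped[:2] / warped[2]  # dehomogenize
            # Displacement of every pixel due to camera rotation alone.
            return (warped - pts[:2]).T.reshape(h, w, 2).astype(np.float32)

    Because this field depends only on rotation and intrinsics, it is valid in fog or darkness where brightness constancy fails, which is what makes it a useful guide for the flow network's background motion.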

    Survey of computer vision algorithms and applications for unmanned aerial vehicles

    This paper presents a complete review of computer vision algorithms and vision-based intelligent applications developed in the field of Unmanned Aerial Vehicles (UAVs) over the last decade. During this time, the evolution of relevant technologies for UAVs, such as component miniaturization, increased computational capabilities, and advances in computer vision techniques, has enabled important progress in UAV technologies and applications. In particular, computer vision technologies integrated into UAVs make it possible to develop cutting-edge solutions to aerial perception difficulties, such as visual navigation algorithms, obstacle detection and avoidance, and aerial decision-making. These expert technologies have opened a wide spectrum of UAV applications beyond classic military and defense purposes. Unmanned Aerial Vehicles and Computer Vision are common topics in expert systems, and thanks to recent advances in perception technologies, modern intelligent applications have been developed to enhance autonomous UAV positioning or to avoid aerial collisions automatically, among others. The presented survey therefore focuses on artificial perception applications that represent important recent advances in the expert system field related to Unmanned Aerial Vehicles. The most significant advances in this field are presented, addressing fundamental technical problems such as visual odometry, obstacle detection, mapping, and localization. The surveyed works are analyzed based on their capabilities and potential utility, and the applications and UAVs are divided and categorized according to different criteria. This research is supported by the Spanish Government through the CICYT projects (TRA2015-63708-R and TRA2013-48314-C3-1-R).

    Insect inspired visual motion sensing and flying robots

    Flying insects are masters of visual motion sensing, using dedicated motion-processing circuits at low energy and computational cost. Drawing on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed via a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes that drive the eye's angular position in the robot and the robot's body angular position with respect to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
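
    For context, one classic insect-inspired local motion sensing scheme is the Hassenstein-Reichardt elementary motion detector (EMD), which correlates one photoreceptor signal with a delayed copy of its neighbour's. The sketch below shows that generic EMD only; it is not necessarily the scheme (e.g. time-of-travel based) used in the sensors described in this chapter.

        # Generic Hassenstein-Reichardt EMD: direction-selective response
        # from two adjacent photoreceptor signals. Illustrative only.
        import numpy as np

        def emd_response(left, right, dt, tau=0.05):
            """left, right: 1-D luminance signals from two adjacent
            photoreceptors, sampled every dt seconds; tau: delay constant."""
            alpha = dt / (tau + dt)          # first-order low-pass gain
            d_left = np.zeros(len(left))
            d_right = np.zeros(len(right))
            for t in range(1, len(left)):    # delayed (low-passed) copies
                d_left[t] = d_left[t-1] + alpha * (left[t] - d_left[t-1])
                d_right[t] = d_right[t-1] + alpha * (right[t] - d_right[t-1])
            # Opponent correlation: positive for one motion direction,
            # negative for the other, near zero for flicker.
            return d_left * right - d_right * left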

    Multi-Sensor Methods for Mobile Radar Motion Capture and Compensation.

    Ph.D. Thesis, University of Hawaiʻi at Mānoa, 2017.

    Design, Development and Implementation of Intelligent Algorithms to Increase Autonomy of Quadrotor Unmanned Missions

    This thesis presents the development and implementation of intelligent algorithms to increase the autonomy of unmanned missions for quadrotor-type UAVs. A six-degree-of-freedom dynamic model of a quadrotor is developed in Matlab/Simulink to support the design of control algorithms prior to real-time implementation. A dynamic inversion-based control architecture is developed to minimize nonlinearities and improve robustness when the system is driven outside the bounds of its nominal design. The design and implementation of the control laws are described. An immunity-based architecture is introduced for monitoring quadrotor health, and its capability to detect abnormal conditions is successfully demonstrated through flight testing. A vision-based navigation scheme is developed to enhance quadrotor autonomy in GPS-denied environments. An optical flow sensor and a laser range finder are used within an Extended Kalman Filter for position estimation, and the estimation performance is analyzed by comparison against measurements from a GPS module. Flight testing results are presented and analyzed, showing a substantial increase in controllability and tracking when the developed algorithms are used in dynamically changing environments. Healthy flights, flights with failures, flights with GPS-denied navigation, and post-failure recovery are presented.
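
    The optical flow plus laser range finder pairing works because image-plane flow scaled by the measured height gives metric ground velocity, which a filter can integrate into position. Below is a hypothetical 1-D linear simplification of that idea; the thesis's actual Extended Kalman Filter, state vector, and noise parameters are not reproduced here.

        # Hypothetical 1-D sketch: velocity from flow * height, integrated
        # to position by a constant-velocity Kalman filter.
        import numpy as np

        def estimate_position(flow_rates, heights, dt, q=1e-3, r=5e-2):
            """flow_rates: optical flow (rad/s); heights: laser range (m)."""
            x = np.zeros(2)                        # state: [position, velocity]
            P = np.eye(2)
            F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity model
            H = np.array([[0.0, 1.0]])             # we observe velocity only
            Q, R = q * np.eye(2), np.array([[r]])
            positions = []
            for w_flow, h in zip(flow_rates, heights):
                z = np.array([w_flow * h])         # metric ground speed (m/s)
                x, P = F @ x, F @ P @ F.T + Q      # predict
                K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
                x = x + K @ (z - H @ x)            # update with flow*height
                P = (np.eye(2) - K @ H) @ P
                positions.append(x[0])
            return np.array(positions)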

    Steering by Gazing: An Efficient Biomimetic Control Strategy for Visually-guided Micro-Air Vehicles

    OSCAR 2 is a twin-engine aerial demonstrator equipped with a monocular visual system, which manages to keep its gaze and its heading steadily fixed on a target (a dark edge or a bar) in spite of severe random perturbations applied to its body via a ducted fan. The tethered robot stabilizes its gaze on the basis of two Oculomotor Reflexes (ORs) inspired by studies on animals:
    - a Visual Fixation Reflex (VFR)
    - a Vestibulo-Ocular Reflex (VOR)
    One of the key features of this robot is that the eye is mechanically decoupled from the body about the vertical (yaw) axis. To meet the conflicting requirements of high accuracy and fast ocular responses, a miniature (2.4 g) Voice Coil Motor (VCM) was used, which enables the eye to change orientation within an unusually short rise time (19 ms). The robot, equipped with a high-bandwidth (7 Hz) VOR based on an inertial micro rate gyro, is capable of accurate visual fixation as long as there is light. The robot is also able to pursue a moving target in the presence of erratic gusts of wind. Here we present the two interdependent control schemes driving the eye in the robot and the robot in space, without any knowledge of the robot's angular position. This "steering by gazing" control strategy, implemented on this lightweight (100 g) miniature aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
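
    The two interdependent loops can be caricatured in a few lines: a VOR term counter-rotates the eye against the gyro-measured body rotation, a VFR term nulls the residual retinal error, and the body is steered so as to re-center the eye, making the heading follow the gaze without any global-frame orientation estimate. The gains and structure below are illustrative assumptions, not the published OSCAR 2 controller.

        # Illustrative one-step "steering by gazing" update (yaw axis).
        def steering_by_gazing_step(eye_angle, retinal_error, body_rate, dt,
                                    k_vfr=4.0, k_vor=1.0, k_steer=2.0):
            """eye_angle: eye-in-body yaw (rad); retinal_error: target
            position on the retina (0 = fixated); body_rate: rate-gyro
            yaw rate (rad/s). Returns new eye angle and a steering command."""
            # VOR: counter-rotate the eye against measured body rotation;
            # VFR: visually null the residual retinal error.
            eye_rate = -k_vor * body_rate + k_vfr * retinal_error
            eye_angle += eye_rate * dt
            # Steer the body to re-center the eye in the head, so the
            # heading follows the gaze; no absolute orientation is needed.
            yaw_cmd = k_steer * eye_angle
            return eye_angle, yaw_cmd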

    Review of Anthropomorphic Head Stabilisation and Verticality Estimation in Robots

    In many walking, running, flying, and swimming animals, including mammals, reptiles, and birds, the vestibular system plays a central role in verticality estimation and is often associated with a head stabilisation (in rotation) behaviour. Head stabilisation, in turn, subserves gaze stabilisation, postural control, visual-vestibular information fusion, and spatial awareness via the active establishment of a quasi-inertial frame of reference. Head stabilisation helps animals cope with the computational consequences of angular movements that complicate reliable estimation of the vertical direction. We suggest that this strategy could also benefit free-moving robotic systems, such as locomoting humanoid robots, which are typically equipped with inertial measurement units. Free-moving robotic systems could gain the full benefits of inertial measurements if the measurement units were placed on independently orientable platforms, such as human-like heads. We illustrate these benefits by analysing recent humanoid robot designs and control approaches.
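
    A common way an IMU estimates verticality is a complementary filter: the gravity-direction estimate is rotated by the gyro and slowly pulled toward the accelerometer, which is most trustworthy precisely when the platform carrying it is stabilised. The sketch below is a generic filter of this kind, not a controller from any robot reviewed here; the gain k is an assumed tuning value.

        # Generic complementary filter for the gravity (vertical) direction.
        import numpy as np

        def update_gravity_dir(g_est, gyro, accel, dt, k=0.02):
            """g_est: unit gravity-direction estimate in the sensor frame;
            gyro: (3,) angular velocity (rad/s); accel: (3,) specific
            force (m/s^2); k: accelerometer correction gain."""
            # A world-fixed vector seen from a rotating frame evolves as
            # dg/dt = -omega x g: propagate the estimate with the gyro.
            g_est = g_est - np.cross(gyro, g_est) * dt
            # Pull toward the accelerometer direction, valid when the
            # sensor is quasi-static (what head stabilisation promotes).
            a = accel / np.linalg.norm(accel)
            g_est = (1.0 - k) * g_est + k * a
            return g_est / np.linalg.norm(g_est)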
