
    Free as a Bird: Event-Based Dynamic Sense-and-Avoid for Ornithopter Robot Flight

    Autonomous flight of flapping-wing robots is a major challenge for robot perception. Most previous sense-and-avoid work for flapping-wing robots has considered only static obstacles. This letter presents a fully onboard dynamic sense-and-avoid scheme for large-scale ornithopters using event cameras. These sensors asynchronously report per-pixel changes of illumination in the scene, such as those produced by dynamic objects. The method performs event-by-event processing on low-cost hardware such as that carried onboard small aerial vehicles. The proposed scheme detects obstacles and evaluates possible collisions with the robot body, and the onboard controller actuates the horizontal and vertical tail deflections to execute the avoidance maneuver. The scheme is validated in both indoor and outdoor scenarios using obstacles of different shapes and sizes. To the best of the authors' knowledge, this is the first event-based method for dynamic obstacle avoidance on a flapping-wing robot.
    Funding: Consejo Europeo de Investigación (ERC) 788247; Comisión Europea, Proyecto AERIAL-CORE H2020-2019-871479; Ministerio de Universidades FPU19/0469
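
    As a rough illustration of the event-by-event idea described above, the following Python sketch flags image-grid cells that receive a dense burst of events within a short time window, a common first step in event-based motion detection. Every name, threshold, and the grid scheme itself are assumptions made for illustration; the paper's actual method is not reproduced here and would additionally need ego-motion compensation on a flapping platform.

    # Illustrative sketch only; not the paper's algorithm or parameters.
    from collections import namedtuple

    # An event camera outputs asynchronous (x, y, timestamp, polarity) tuples.
    Event = namedtuple("Event", ["x", "y", "t", "p"])

    class DynamicObstacleDetector:
        def __init__(self, cell_px=16, window_s=0.01, activity_thresh=20):
            self.cell_px = cell_px                  # grid-cell side in pixels (assumed)
            self.window_s = window_s                # temporal window in seconds (assumed)
            self.activity_thresh = activity_thresh  # events per cell to flag motion (assumed)
            self.grid = {}                          # (cx, cy) -> recent event timestamps

        def process(self, ev):
            """Accumulate one event; return the grid cell if it looks dynamic."""
            key = (ev.x // self.cell_px, ev.y // self.cell_px)
            times = self.grid.setdefault(key, [])
            times.append(ev.t)
            # Discard events that fell out of the temporal window.
            while times and ev.t - times[0] > self.window_s:
                times.pop(0)
            # A dense burst of events in one cell suggests a moving object.
            return key if len(times) > self.activity_thresh else None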

    A Comparison between Frame-based and Event-based Cameras for Flapping-Wing Robot Perception

    Perception systems for ornithopters face severe challenges. The harsh vibrations and abrupt movements caused by flapping are prone to produce motion blur and strong changes in lighting conditions. Strict restrictions on weight, size, and energy consumption also limit the type and number of sensors that can be mounted onboard. Lightweight traditional cameras have become a standard off-the-shelf solution in many flapping-wing designs. However, bioinspired event cameras are a promising alternative for ornithopter perception due to their microsecond temporal resolution, high dynamic range, and low power consumption. This paper presents an experimental comparison between a frame-based and an event-based camera. Both technologies are analyzed in light of the particular specifications of flapping-wing robots, and the performance of well-known vision algorithms is evaluated experimentally on data recorded onboard a flapping-wing robot. Our results suggest that event cameras are the most suitable sensors for ornithopters. Nevertheless, they also highlight the open challenges for event-based vision onboard flapping-wing robots.
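
    As a back-of-the-envelope illustration of why exposure time matters here, the sketch below estimates image-plane motion blur for a rotating camera as angular rate times exposure times focal length in pixels. The numeric values are invented examples, not measurements from the paper.

    import math

    def blur_pixels(angular_rate_dps, exposure_s, focal_len_px):
        """Approximate image-plane blur (pixels) for a camera rotating at a given rate."""
        return math.radians(angular_rate_dps) * exposure_s * focal_len_px

    # Assume flapping induces ~200 deg/s body rotation (illustrative value).
    print(blur_pixels(200.0, exposure_s=10e-3, focal_len_px=400))  # frame camera: ~14 px
    print(blur_pixels(200.0, exposure_s=1e-6, focal_len_px=400))   # event-like latency: ~0.001 px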

    Why fly blind? Event-based visual guidance for ornithopter robot flight

    The development of perception and control methods that allow bird-scale flapping-wing robots (a.k.a. ornithopters) to perform autonomously is an under-researched area. This paper presents a fully onboard event-based method for ornithopter robot visual guidance. The method uses event cameras, exploiting their fast response and robustness against motion blur to feed the ornithopter control loop at high rates (100 Hz). The proposed scheme visually guides the robot using line features extracted in the event image plane and controls the flight by actuating the horizontal and vertical tail deflections. It has been validated onboard a real ornithopter robot with real-time computation on low-cost hardware. The experimental evaluation includes sets of experiments with different maneuvers indoors and outdoors.
    Funding: Consejo Europeo de Investigación (ERC) 78824
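
    A minimal sketch of how such a guidance loop might map line-feature errors to tail commands, assuming a simple proportional law; the gains, sign conventions, and interfaces below are illustrative assumptions, not the paper's controller.

    def tail_commands(lateral_offset_px, vertical_offset_px,
                      kp_yaw=0.002, kp_pitch=0.002):
        """Map line-feature offsets in the event image to tail deflections (rad)."""
        rudder = -kp_yaw * lateral_offset_px       # vertical tail corrects heading
        elevator = -kp_pitch * vertical_offset_px  # horizontal tail corrects height
        return rudder, elevator

    # Called at 100 Hz: each cycle accumulates recent events into an event image,
    # extracts the dominant line feature, and sends the resulting deflections
    # to the tail servos.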

    Power-Scavenging MEMS Robots

    This thesis covers the design, modeling, and testing of novel, power-scavenging, biologically inspired MEMS microrobots. Over one hundred 500-μm and 990-μm microrobots with two, four, and eight wings were designed, fabricated, and characterized. These microrobots constitute the smallest documented attempt at powered flight. Each microrobot wing comprises downward-deflecting, laser-powered thermal actuators made of gold and polysilicon; the microrobots were fabricated in PolyMUMPs® (Polysilicon Multi-User MEMS Processes). Characterization results illustrate how wing-tip deflection can be maximized by optimizing the gold-to-polysilicon ratio as well as the dimensions of the actuator-wings. From these results, an optimum actuator-wing configuration was identified. It was also determined that the actuator-wing configuration with maximum deflection and surface area yet minimum mass had the greatest lift-to-weight ratio. Powered testing showed that the microrobots successfully scavenged power from a remote 660-nm laser. The microrobots also demonstrated rapid downward flapping, but none achieved flight: they were too heavy and lacked sufficient wing surface area. It was determined that a successfully flying microrobot could be achieved by adding a robust, lightweight material to the optimum actuator-wing configuration, similar to insect wings. The ultimate objective of the flying-microrobot project is an autonomous, fully maneuverable flying microrobot capable of sensing and acting upon a target. Such a microrobot would be capable of precise lethality, accurate battle-damage assessment, and penetration of otherwise inaccessible targets.
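
    For intuition about the gold-to-polysilicon trade-off, the sketch below evaluates the textbook bimetal-strip curvature model (Timoshenko, 1925) for a heated two-layer cantilever; the material constants and dimensions are nominal assumptions, not values from the thesis.

    def tip_deflection(L, t_au, t_poly, dT,
                       E_au=79e9, E_poly=160e9,       # Young's moduli in Pa (nominal)
                       a_au=14.2e-6, a_poly=2.6e-6):  # thermal expansion in 1/K (nominal)
        """Tip deflection (m) of a gold-on-polysilicon cantilever heated by dT kelvin."""
        h = t_au + t_poly            # total thickness
        m = t_au / t_poly            # thickness ratio
        n = E_au / E_poly            # modulus ratio
        # Uniform curvature of a two-layer beam under a temperature change dT.
        kappa = (6 * (a_au - a_poly) * dT * (1 + m) ** 2) / (
            h * (3 * (1 + m) ** 2 + (1 + m * n) * (m ** 2 + 1 / (m * n))))
        return kappa * L ** 2 / 2    # cantilever deflection at length L

    # Example: 250-um-long wing, 0.5 um gold on 1.5 um polysilicon, 100 K rise.
    print(tip_deflection(250e-6, 0.5e-6, 1.5e-6, 100))  # ~1.5e-5 m (about 15 um)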

    Exploring Motion Signatures for Vision-Based Tracking, Recognition and Navigation

    As cameras become more and more popular in intelligent systems, algorithms and systems for understanding video data become increasingly important. Applications range from object detection, tracking, and scene understanding to robot navigation. Beyond stationary information, video data contains rich motion information about the environment. Biological visual systems, like human and animal eyes, are very sensitive to motion information, which has inspired active research on vision-based motion analysis in recent years. The main focus of motion analysis has been on low-level motion representations of pixels and image regions. However, motion signatures can benefit a broader range of applications if further in-depth analysis techniques are developed. In this dissertation, we discuss how to exploit motion signatures to solve problems in two applications: object recognition and robot navigation.

    First, we use bird species recognition as the application to explore motion signatures for object recognition. We begin with a study of the periodic wingbeat motion of flying birds. To analyze the wing motion of a flying bird, we establish kinematic models for bird wings and derive the wingbeat periodicity observed in image frames after perspective projection. Time series of salient extremities on bird images are extracted, and the wingbeat frequency is acquired for species classification; a simplified sketch of this step appears after this abstract. Physical experiments show that the frequency-based recognition method is robust to segmentation errors and to measurement loss of up to 30%. In addition to the wing motion, the body motion of the bird is analyzed to extract the flying velocity in 3D space. An interacting multiple-model approach is then designed to capture the combined object motion patterns under different environment conditions. The proposed systems and algorithms are tested in physical experiments, and the results show a false positive rate of around 20% with a false negative rate close to zero.

    Second, we explore motion signatures for vision-based vehicle navigation. We observe that motion vectors (MVs) encoded in Moving Picture Experts Group (MPEG) videos provide rich information about motion in the environment, which can be used to reconstruct the vehicle ego-motion and the structure of the scene. However, MVs suffer from high noise levels. To handle this challenge, an error propagation model for MVs is first proposed. Several steps, including MV merging, plane-at-infinity elimination, and planar region extraction, are designed to further reduce noise. The extracted planes are used as landmarks in an extended Kalman filter (EKF) for simultaneous localization and mapping. Results show that the algorithm performs localization and plane mapping with a relative trajectory error below 5.1%. Exploiting the fact that MVs encode both environment information and moving obstacles, we further propose to track moving objects simultaneously with localization and mapping. This enables two critical navigation functionalities, localization and obstacle avoidance, to be performed in a single framework. MVs are labeled as stationary or moving according to their consistency with geometric constraints, and the extracted planes are thus separated into moving objects and the stationary scene. Multiple EKFs are used to track the static scene and the moving objects simultaneously. In physical experiments, we show a detection rate of moving objects of 96.6% and a mean absolute localization error below 3.5 meters.
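
    The sketch below shows the frequency-extraction step in its simplest form: interpolate over missing samples of an extremity trajectory and take the dominant FFT bin. It is a simplified stand-in for the dissertation's pipeline; the sampling rate, signal, and dropout pattern are invented for the example.

    import numpy as np

    def wingbeat_frequency(y, fps):
        """Dominant frequency (Hz) of a 1-D extremity trajectory sampled at fps."""
        y = np.asarray(y, dtype=float)
        idx = np.arange(len(y))
        mask = np.isfinite(y)
        y = np.interp(idx, idx[mask], y[mask])  # fill missing measurements (NaNs)
        y -= y.mean()                           # remove the DC component
        spectrum = np.abs(np.fft.rfft(y))
        freqs = np.fft.rfftfreq(len(y), d=1.0 / fps)
        return freqs[1:][np.argmax(spectrum[1:])]  # skip the zero-frequency bin

    # Example: a 4 Hz wingbeat observed at 30 fps with simulated dropouts.
    t = np.arange(0, 3, 1 / 30)
    signal = np.sin(2 * np.pi * 4 * t)
    signal[::7] = np.nan
    print(wingbeat_frequency(signal, 30))  # ~4.0 Hz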

    A bio-inspired flapping wing rotor of variant frequency driven by ultrasonic motor

    By combining flapping and rotary motion, the bio-inspired flapping wing rotor (FWR) achieves a unique kinematics. It can produce significantly greater aerodynamic lift and efficiency than insect-mimicking flapping wings in vertical take-off and landing (VTOL). To produce the same lift, the FWR's flapping frequency, twist angle, and self-propelling rotational speed are significantly smaller than those of insect-like flapping wings and rotors. As with those counterparts, however, the effect of a variant flapping frequency (VFF) within a flapping cycle on the FWR's aerodynamic characteristics and efficiency remains to be evaluated. An FWR model was built to carry out the experimental work. To vary the flapping frequency rapidly during a stroke, an ultrasonic motor (USM) is used to drive the FWR. Experiments and numerical simulation using computational fluid dynamics (CFD) are performed over a VFF range and compared against the usual constant flapping frequency (CFF) cases. The measured lift forces agree very well with the CFD results. When the flapping frequency in the up-stroke is smaller than in the down-stroke, the negative lift and inertia forces are reduced significantly. The average lift of the FWR in VFF motion is greater than in CFF motion at the same input motor power or equivalent flapping frequency; in other words, the power required for a VFF case to produce a specified lift is less than for a CFF case. For this FWR model, the optimal installation angle of the wings for high lift and efficiency is found to be 30°, and the Strouhal number of the VFF cases lies between 0.3 and 0.36.
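
    For reference, the Strouhal number quoted above is the standard flapping-flight ratio St = f·A/U (flapping frequency times stroke amplitude over a reference flow speed). The sketch below computes it; the numeric values are invented examples, and the choice of reference speed for a hovering FWR is an assumption.

    def strouhal(freq_hz, amplitude_m, speed_ms):
        """Strouhal number of a flapping wing: frequency * stroke amplitude / speed."""
        return freq_hz * amplitude_m / speed_ms

    print(strouhal(freq_hz=8.0, amplitude_m=0.12, speed_ms=3.0))  # -> 0.32, inside 0.3-0.36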