4 research outputs found

    Powerline Tracking with Event Cameras

    Autonomous inspection of powerlines with quadrotors is challenging. Flights require persistent perception to keep a close watch on the lines. We propose a method that uses event cameras to robustly track powerlines. Event cameras are inherently robust to motion blur and have low latency and high dynamic range. Such properties are advantageous for autonomous inspection of powerlines with drones, where fast motions and challenging illumination conditions are ordinary. Our method identifies lines in the stream of events by detecting planes in the spatio-temporal signal and tracks them through time. The implementation runs onboard and is capable of detecting multiple distinct lines in real time at rates of up to 320 thousand events per second. The performance is evaluated in real-world flights along a powerline. The tracker is able to persistently track the powerlines, with a mean line lifetime 10× longer than that of existing approaches.
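    The geometric idea behind the plane detection can be illustrated with a small sketch: a line segment moving through the image sweeps out a plane in (x, y, t) space, so fitting a plane to a cluster of events recovers both the line's orientation and its apparent motion. The snippet below is a minimal illustration of that idea using a total-least-squares fit; the function names and the synthetic data are illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def fit_event_plane(events):
        """Fit a plane to events in (x, y, t) space via total least squares.

        events: (N, 3) array of [x, y, t] tuples. A line moving through the
        image sweeps out a plane in this space, so the plane normal encodes
        both the line's orientation and its apparent velocity.
        """
        centroid = events.mean(axis=0)
        # SVD of the centered points: the right singular vector with the
        # smallest singular value is the plane normal.
        _, _, vt = np.linalg.svd(events - centroid)
        return vt[-1], centroid

    def point_to_plane_distance(events, normal, centroid):
        """Orthogonal distance of each event to the fitted plane,
        useful for rejecting noise events before refitting."""
        return np.abs((events - centroid) @ normal)

    # Toy usage: synthetic events from a vertical line drifting in x at 0.2 px/ms.
    t = np.random.uniform(0, 50, 500)                  # timestamps in ms
    x = 10 + 0.2 * t + np.random.normal(0, 0.3, 500)   # line position + noise
    y = np.random.uniform(0, 128, 500)                 # events spread along the line
    pts = np.column_stack([x, y, t])
    normal, centroid = fit_event_plane(pts)
    residuals = point_to_plane_distance(pts, normal, centroid)
    ```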

    Fast, flexible closed-loop feedback: Tracking movement in “real-millisecond-time”

    © 2019 Sehara et al. One of the principal functions of the brain is to control movement and rapidly adapt behavior to a changing external environment. Over the last decades, our ability to monitor activity in the brain, and to manipulate it while also manipulating the environment the animal moves through, has grown increasingly sophisticated. However, our ability to track the movement of the animal in real time has not kept pace. Here, we use a dynamic vision sensor (DVS) based event-driven neuromorphic camera system to implement real-time, low-latency tracking of a single whisker that mice can move at 25 Hz. The customized DVS system described here converts whisker motion into a series of events that can be used to estimate the position of the whisker and to trigger a position-based output interactively within 2 ms. This neuromorphic chip-based closed-loop system provides feedback rapidly and flexibly. With this system, it becomes possible to use the movement of whiskers, or in principle the movement of any part of the body, to reward or punish in a rapidly reconfigurable way. These methods can be used to manipulate behavior and the neural circuits that help animals adapt to changing values of a sequence of motor actions.
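    A rough sketch of how such an event-driven trigger loop can work (the window size, trigger position, and function names below are our assumptions, not the published system's firmware): each incoming DVS event updates a sliding-window position estimate, and the output fires as soon as the estimate crosses a preset position, so latency is bounded by the event rate rather than by a frame period.

    ```python
    from collections import deque

    # Illustrative parameters; the published system's actual thresholds and
    # window sizes are not reproduced here.
    WINDOW_US = 2_000          # 2 ms sliding window of events
    TRIGGER_POSITION = 64.0    # pixel column that triggers the output

    events = deque()           # (timestamp_us, x, y) tuples from the DVS driver

    def on_event(ts_us, x, y, trigger_fn):
        """Update the whisker position estimate on every DVS event and fire
        the output as soon as the estimate crosses the trigger position."""
        events.append((ts_us, x, y))
        # Drop events that have aged out of the sliding window.
        while events and ts_us - events[0][0] > WINDOW_US:
            events.popleft()
        # Position estimate: mean x of recent events (the moving whisker edge
        # generates most of the events in view).
        position = sum(e[1] for e in events) / len(events)
        if position >= TRIGGER_POSITION:
            trigger_fn(position)
    ```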

    Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors

    In order to safely navigate and orient in their local surroundings, autonomous systems need to rapidly extract and persistently track visual features from the environment. While there are many algorithms tackling these tasks for traditional frame-based cameras, those algorithms must deal with the fact that conventional cameras sample their environment at a fixed frequency. Most prominently, the same features have to be found in consecutive frames, and corresponding features then need to be matched using elaborate techniques, since any information between the two frames is lost. We introduce a novel method to detect and track line structures in data streams of event-based silicon retinae [also known as dynamic vision sensors (DVS)]. In contrast to conventional cameras, these biologically inspired sensors generate a quasi-continuous stream of vision information analogous to the information stream created by the ganglion cells in mammalian retinae. All pixels of a DVS operate asynchronously, without a periodic sampling rate, and emit a so-called DVS address event as soon as they perceive a luminance change exceeding an adjustable threshold. We use the high temporal resolution achieved by the DVS to track features continuously through time instead of only at fixed points in time. The focus of this work lies on tracking lines in a mostly static environment observed by a moving camera, a typical setting in mobile robotics. Since DVS events are mostly generated at object boundaries and edges, which in man-made environments often form lines, lines were chosen as the feature to track. Our method is based on detecting planes of DVS address events in x-y-t space and tracing these planes through time. It is robust against noise and runs in real time on a standard computer; hence it is suitable for low-latency robotics. The efficacy and performance are evaluated on real-world data sets showing artificial structures in an office building, using event data for tracking and frame data for ground-truth estimation from a DAVIS240C sensor.
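    The tracing-through-time step can be sketched as follows: each tracked line keeps a plane model n·p + d = 0 in x-y-t space, gates incoming address events by their orthogonal distance to that plane, and periodically refits the model from recent inliers. This is a minimal sketch under our own assumptions (gating threshold, refit schedule), not the authors' implementation.

    ```python
    import numpy as np

    class PlaneTracker:
        """Track one line as a plane n·p + d = 0 in (x, y, t) space,
        updating the model as new DVS address events arrive.
        Thresholds here are illustrative, not from the paper."""

        def __init__(self, normal, offset, gate=1.5):
            self.n = normal / np.linalg.norm(normal)
            self.d = offset
            self.gate = gate        # max event-to-plane distance (px)
            self.inliers = []

        def try_add(self, event):
            """Associate an event (x, y, t) with this plane if it lies
            within the gating distance; refit periodically."""
            p = np.asarray(event, dtype=float)
            if abs(self.n @ p + self.d) > self.gate:
                return False        # event belongs to noise or another line
            self.inliers.append(p)
            if len(self.inliers) % 100 == 0:
                self._refit()
            return True

        def _refit(self):
            # Refit from recent events only, so the plane follows the
            # line as the camera moves.
            pts = np.array(self.inliers[-500:])
            c = pts.mean(axis=0)
            _, _, vt = np.linalg.svd(pts - c)
            self.n = vt[-1]
            self.d = -self.n @ c
    ```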

    Experimental Investigation of a MAV-Scale Cyclocopter

    The development of an efficient, maneuverable, and gust-tolerant hovering concept with a multi-modal locomotion capability is key to the success of micro air vehicles (MAVs) operating in multiple mission scenarios. The current research investigated the performance of two unconventional cycloidal-rotor-based (cyclocopter) configurations: (1) a twin-cyclocopter and (2) an all-terrain cyclocopter. The twin-cyclocopter configuration used two cycloidal rotors (cyclorotors) and a smaller horizontal edgewise nose rotor to counteract the torque produced by the cyclorotors. The all-terrain cyclocopter relied on four cyclorotors oriented in an H-configuration. The objectives of this research were to (1) develop control strategies to enable level forward flight of a cyclocopter relying purely on thrust vectoring, (2) identify a flight dynamics model in forward flight, (3) experimentally evaluate gust tolerance strategies, and (4) determine the feasibility and performance of multi-modal locomotion of the cyclocopter configuration.

    The forward flight control strategy for the twin-cyclocopter used a unique combination of independent thrust vectoring and rotational speed control of the cyclorotors. Unlike conventional rotary-winged vehicles, the cyclocopter propelled itself in forward flight by thrust vectoring instead of pitching the entire fuselage. While this strategy enabled the vehicle to maintain a level attitude in forward flight, it was accompanied by significant yaw-roll controls coupling and gyroscopic coupling. To understand these couplings and characterize the bare-airframe dynamics, a 6-DOF flight dynamics model of the cyclocopter was extracted using a time-domain system identification technique. Decoupling involved simultaneously mixing roll and yaw inputs in the controller. After implementing the controls-mixing strategy in the closed-loop feedback system, the cyclocopter successfully achieved level forward flight up to 5 m/s.

    Thrust vectoring capability also proved critical for gust mitigation. Thrust vectoring input combined with flow feedback and position feedback improved gust tolerance up to 4 m/s for a twin-cyclocopter mounted on a 6-DOF test stand. Flow feedback relied on a dual-axis flow probe attached to differential pressure sensors, and position feedback was based on data recorded by a VICON motion capture system. The vehicle was also able to recover its initial position in crosswind scenarios tested at side-slip angles up to 30 degrees.

    Unlike existing multi-modal platforms, the all-terrain cyclocopter relied solely on its four cyclorotors as the main source of propulsion in air and water, adding only wheels for ground locomotion. Aerial and aquatic modes used aerodynamic forces generated by modulating cyclorotor rotational speeds and thrust vectors, while terrestrial mode used motor torque. In aerial mode, the cyclorotors operated at 1550 rpm and consumed 232 W to sustain hover. In terrestrial mode, forward translation at 2 m/s required 28 W, an 88% reduction relative to the power required to hover. In aquatic mode, the cyclorotors operated at 348 rpm to achieve 1.3 m/s translation and consumed 19 W, a 92% reduction. With only a modest weight addition of 200 grams for wheels and retractable landing gear, the versatile cyclocopter platform achieved sustained hover, efficient translation and rotational maneuvers on the ground, and aquatic locomotion.
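    The reported power savings follow directly from the hover figure; a quick arithmetic check of the percentages, using only the numbers given in the abstract:

    ```python
    # Power figures reported in the abstract.
    hover_power_W = 232.0    # aerial mode, cyclorotors at 1550 rpm
    ground_power_W = 28.0    # terrestrial mode, 2 m/s translation
    water_power_W = 19.0     # aquatic mode, 348 rpm, 1.3 m/s translation

    def reduction_vs_hover(p):
        """Percent reduction in power consumption relative to hover."""
        return 100.0 * (1.0 - p / hover_power_W)

    print(f"terrestrial: {reduction_vs_hover(ground_power_W):.0f}% reduction")  # ~88%
    print(f"aquatic:     {reduction_vs_hover(water_power_W):.0f}% reduction")   # ~92%
    ```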