
    Event-based Vision: A Survey

    Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes and output a stream of events that encode the time, location, and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz), resulting in reduced motion blur. Hence, event cameras have large potential for robotics and computer vision in scenarios that are challenging for traditional cameras, such as those requiring low latency, high speed, and high dynamic range. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras from their working principle, the actual sensors that are available, and the tasks that they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
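
    The event stream described above is usually handled as a sequence of (timestamp, x, y, polarity) tuples. As a minimal illustrative sketch (not taken from the survey; the names Event and accumulate_events are assumptions for this example), the snippet below shows one common intermediate representation: summing event polarities over a short time window into a 2D "event frame" that frame-based algorithms can then consume.

```python
import numpy as np
from collections import namedtuple

# One DVS-style event: timestamp (seconds), pixel coordinates, and polarity
# (+1 for a brightness increase, -1 for a decrease).
Event = namedtuple("Event", ["t", "x", "y", "polarity"])

def accumulate_events(events, width, height, t_start, t_end):
    """Sum event polarities per pixel over a time window, giving a 2D
    'event frame' often used as an intermediate representation."""
    frame = np.zeros((height, width), dtype=np.int32)
    for e in events:
        if t_start <= e.t < t_end:
            frame[e.y, e.x] += e.polarity
    return frame

# Example: three synthetic events within a 10 ms window.
events = [Event(0.001, 10, 20, +1), Event(0.004, 10, 21, -1), Event(0.009, 11, 20, +1)]
frame = accumulate_events(events, width=128, height=128, t_start=0.0, t_end=0.010)
```

    Note that such accumulation trades away the microsecond timing that makes event cameras attractive; many of the methods the survey covers instead operate on the raw asynchronous stream.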

    Aircraft Attitude Estimation Using Panoramic Images

    This thesis investigates the problem of reliably estimating attitude from panoramic imagery in cluttered environments. Accurate attitude is an essential input to the stabilisation systems of autonomous aerial vehicles. A new camera system which combines a CCD camera, ultraviolet (UV) filters and a panoramic mirror-lens is designed. Drawing on biological inspiration from the ocelli possessed by certain insects, UV-filtered images are used to enhance the contrast between the sky and ground and to mitigate the effect of the sun. A novel method for real-time horizon-based attitude estimation using panoramic images, capable of estimating an aircraft's pitch and roll at low altitude in the presence of the sun, clouds and occluding features such as trees and buildings, is developed. Also, a new method for panoramic sky/ground thresholding is proposed, consisting of a horizon- and a sun-tracking system which works effectively even when the horizon line is difficult to detect by normal thresholding methods due to flares and other effects from the presence of the sun in the image. An algorithm for estimating the attitude from a three-dimensional mapping of the horizon projected onto a 3D plane is developed. The use of optic flow to determine pitch and roll rates is investigated using the panoramic image and the image interpolation algorithm (I2A). Two methods which employ sensor fusion techniques, an Extended Kalman Filter (EKF) and Artificial Neural Networks (ANNs), are used to fuse unfiltered measurements from inertial sensors and the vision system. The EKF estimates gyroscope biases as well as the attitude. The ANN fuses the optic flow and horizon-based attitude to provide smooth attitude estimates. The results obtained from the different parts of the research are tested and validated through simulations and real flight tests.
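
    Horizon-based pitch and roll estimation can be illustrated with a small geometric sketch (an assumption-laden example, not the thesis's implementation; attitude_from_horizon and the body-frame convention below are mine): bearing vectors toward detected horizon pixels lie approximately in the plane perpendicular to the world vertical, so fitting that plane and reading roll and pitch off its normal recovers the two angles.

```python
import numpy as np

def attitude_from_horizon(horizon_dirs):
    """Estimate roll and pitch from horizon points seen in a panoramic image.

    horizon_dirs: (N, 3) bearing vectors (body frame assumed: x forward,
    y left, z up) pointing at detected horizon pixels. These directions lie
    close to the plane perpendicular to the world vertical, so the plane
    normal is the eigenvector of the scatter matrix with the smallest
    eigenvalue.
    """
    d = np.asarray(horizon_dirs, dtype=float)
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    eigvals, eigvecs = np.linalg.eigh(d.T @ d)
    up = eigvecs[:, 0]              # smallest eigenvalue -> plane normal
    if up[2] < 0:                   # resolve the sign so 'up' points upward
        up = -up
    nx, ny, nz = up
    roll = np.arctan2(ny, nz)
    pitch = np.arctan2(-nx, np.hypot(ny, nz))
    return roll, pitch
```

    The sketch only covers the purely geometric step; the thesis additionally fuses such vision-based estimates with inertial measurements (EKF/ANN fusion) and handles sun glare and clutter in the thresholding stage.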

    Insect-Inspired Visual Perception for Flight Control and Collision Avoidance

    Flying robots are increasingly used for tasks such as aerial mapping, fast exploration, video footage and monitoring of buildings. Autonomous flight at low altitude in cluttered and unknown environments is an active research topic because it poses challenging perception and control problems. Traditional methods for collision-free navigation at low altitude require heavy resources to deal with the complexity of natural environments, something that limits the autonomy and the payload of flying robots. Flying insects, however, are able to navigate safely and efficiently using vision as the main sensory modality. Flying insects rely on low-resolution, high-refresh-rate, wide-angle compound eyes to extract angular image motion and move in unstructured environments. These strategies result in systems that are physically and computationally lighter than those often found in high-definition stereovision. Taking inspiration from insects therefore offers great potential for building small flying robots capable of navigating in cluttered environments using lightweight vision sensors. In this thesis, we investigate insect perception of visual motion and insect-vision-based flight control in cluttered environments. We use the knowledge gained through the modelling of neural circuits and behavioural experiments to develop flying robots with insect-inspired control strategies for goal-oriented navigation in complex environments. We start by exploring insect perception of visual motion. We present a study that reconciles an apparent contradiction in the literature on insect visual control: current models developed to explain insect flight behaviour rely on the measurement of optic flow, yet the most prominent neural model for visual motion extraction (the Elementary Motion Detector, or EMD) does not measure optic flow. We propose a model for unbiased optic flow estimation that relies on comparing the output of multiple EMDs pointed in varying viewing directions. Our model is of interest to both engineers and biologists because it is computationally more efficient than other optic flow estimation algorithms, and because it represents a biologically plausible model for optic flow extraction in insect neural systems. We then focus on insect flight control strategies in the presence of obstacles. By recording the trajectories of bumblebees (Bombus terrestris), and by comparing them to simulated flights, we show that bumblebees rely primarily on the frontal part of their field of view, and that they pool optic flow in two different manners for the control of flight speed and of lateral position. For the control of lateral position, our results suggest that bumblebees selectively react to the portions of the visual field where optic flow is highest, which correspond to the closest obstacles. Finally, we tackle goal-oriented navigation with a novel algorithm that combines aspects of insect perception and flight control presented in this thesis, such as the detection of the fastest-moving objects in the frontal visual field, with other aspects of insect flight known from the literature, such as the saccadic flight pattern. Through simulations, we demonstrate autonomous navigation in forest-like environments using only local optic flow information and assuming knowledge of the direction to the navigation goal.
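
    The EMD referred to above, the correlation-type (Hassenstein-Reichardt) detector, is simple enough to sketch. The minimal example below (illustrative only; lowpass, emd_response and the time constant are assumptions, and the thesis's multi-EMD flow estimator is not reproduced here) shows why its output is not a pure velocity signal: the response is a product of neighbouring luminance signals, so it depends on contrast and texture as well as on image speed.

```python
import numpy as np

def lowpass(signal, dt, tau):
    """First-order low-pass filter used as the EMD delay stage."""
    out = np.zeros(len(signal), dtype=float)
    alpha = dt / (tau + dt)
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

def emd_response(a, b, dt=1e-3, tau=35e-3):
    """Correlation-type (Hassenstein-Reichardt) elementary motion detector.

    a, b: luminance time series of two neighbouring photoreceptors.
    Each receptor's delayed signal is multiplied with the neighbour's
    undelayed signal; subtracting the two mirror-symmetric arms yields a
    direction-selective, but contrast- and texture-dependent, output.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return lowpass(a, dt, tau) * b - a * lowpass(b, dt, tau)
```

    The thesis's model goes further by comparing the outputs of many such detectors pointed in different viewing directions to obtain an unbiased optic flow estimate; that step is not shown here.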

    Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis

    Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion, with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information, we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was done on the basis of an experimentally validated model of the visual motion pathway of insects, with its core elements being correlation-type elementary motion detectors (EMDs). The key result of our analysis is that the absolute EMD responses, i.e., the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights the contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both the velocity and the textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
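
    The geometric relation behind this result is that, for pure translation at speed v, the optic flow magnitude in a viewing direction at angle theta from the direction of motion is |flow| = (v/d) sin(theta), where d is the distance to the viewed structure. The sketch below (an illustrative assumption, not the paper's model; nearness_from_flow is a made-up name) reads relative nearness 1/d off a flow field under exactly that assumption; the paper's point is that raw EMD motion energy approximates a contrast-weighted version of this quantity without any explicit division.

```python
import numpy as np

def nearness_from_flow(flow_mag, view_angles, speed):
    """Relative nearness (1/d) from translational optic flow.

    flow_mag:    optic flow magnitudes per viewing direction.
    view_angles: angle (rad) between each viewing direction and the
                 direction of translation.
    speed:       translation speed v.

    For pure translation, |flow| = (v / d) * sin(theta), so
    nearness = 1/d = |flow| / (v * sin(theta)).
    """
    flow_mag = np.asarray(flow_mag, dtype=float)
    s = speed * np.sin(np.asarray(view_angles, dtype=float))
    safe = np.maximum(s, 1e-9)                 # avoid division by ~0 near the poles
    return np.where(s > 1e-9, flow_mag / safe, 0.0)
```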

    Texture dependence of motion sensing and free flight behavior in blowflies

    Lindemann JP, Egelhaaf M. Texture dependence of motion sensing and free flight behavior in blowflies. Frontiers in Behavioral Neuroscience. 2013;6:92.
    Many flying insects exhibit an active flight and gaze strategy: purely translational flight segments alternate with quick turns called saccades. To generate such a saccadic flight pattern, the animals decide the timing, direction, and amplitude of the next saccade during the previous translatory intersaccadic interval. The information underlying these decisions is assumed to be extracted from the retinal image displacements (optic flow), which scale with the distance to objects during the intersaccadic flight phases. In an earlier study we proposed a saccade-generation mechanism based on the responses of large-field motion-sensitive neurons. In closed-loop simulations we achieved collision avoidance behavior in a limited set of environments but observed collisions in others. Here we show by open-loop simulations that the cause of this observation is the known texture dependence of elementary motion detection in flies, which is also reflected in the responses of the large-field neurons used in our model. We verified by electrophysiological experiments that this result is not an artifact of the sensory model. Already subtle changes in the texture may lead to qualitative differences in the responses of both our model cells and their biological counterparts in the fly's brain. Nonetheless, the free flight behavior of blowflies is only moderately affected by such texture changes. This divergent texture dependence of motion-sensitive neurons and behavioral performance suggests either mechanisms that compensate for the texture dependence of the visual motion pathway at the level of the circuits generating the saccadic turn decisions, or the involvement of a hypothetical parallel pathway in saccadic control that provides the information for collision avoidance independently of the textural properties of the environment.

    Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action

    Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Frontiers in Neural Circuits. 2012;6:108.
    Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight manoeuvres and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish their extraordinary performance, flies and bees have been shown by their characteristic behavioral actions to actively shape the dynamics of the image flow on their eyes ("optic flow"). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually guided flight control might be helpful for finding technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
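
    The rotation/translation segregation emphasised in the review can also be done computationally when an angular-rate measurement is available. The following sketch (an illustrative assumption, not taken from the review; remove_rotational_flow is a made-up name) subtracts the rotation-induced flow predicted from the measured angular velocity, leaving the translational, distance-dependent component; flies and bees achieve the same effect behaviourally by keeping rotation close to zero between saccades.

```python
import numpy as np

def remove_rotational_flow(flow, view_dirs, omega):
    """Subtract the rotation-induced component from measured optic flow.

    flow:      (N, 3) measured flow vectors, tangent to the viewing sphere.
    view_dirs: (N, 3) unit viewing directions of the sampled ommatidia.
    omega:     (3,) angular velocity of the eye/head (e.g. from gyroscopes).

    A static point at direction d moves as d_dot = -omega x d under pure
    rotation, so that term is predicted and removed; the remainder scales
    with nearness and carries the spatial information.
    """
    view_dirs = np.asarray(view_dirs, dtype=float)
    rot_flow = -np.cross(omega, view_dirs)     # flow caused by pure rotation
    return np.asarray(flow, dtype=float) - rot_flow
```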