An On-Board Visual-Based Attitude Estimation System For Unmanned Aerial Vehicle Mapping
This paper evaluates the performance of several salient feature detectors, namely the Harris detector, Minimum Eigenvalue (MinEig), Scale Invariant Feature Transform (SIFT), Maximally Stable Extremal Region (MSER), Speeded Up Robust Feature (SURF), Features from Accelerated Segment Test (FAST), and Binary Robust Invariant Scalable Keypoints (BRISK), in order to assess their suitability for the proposed visual-based attitude estimation system. Three main requirements were investigated: Time-to-Complete (TTC), detection rate, and matching rate. It was found that SURF fulfills each of the system's requirements. Moreover, it was also found that keypoint detection capability affects the processing time, and that the clustering patterns in the results may assist in automated inspection of correct and false matches.
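As an illustrative sketch of how one of the compared detectors works (not the paper's implementation; the window size and the constant k = 0.04 are common textbook defaults, assumed here), the Harris detector scores each pixel from the structure tensor of local image gradients:

```python
import numpy as np

def box3(a):
    # 3x3 box filter via shifted sums (zero padding at the border)
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k * trace(M)**2 per pixel,
    where M is the box-smoothed structure tensor of the gradients."""
    img = np.asarray(img, dtype=float)
    Iy, Ix = np.gradient(img)                      # central differences
    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    det = Sxx * Syy - Sxy ** 2
    trace = Sxx + Syy
    return det - k * trace ** 2

# A bright square on a dark background: corners score positive,
# straight edges score negative, flat regions score zero.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
```

Keypoint candidates are then the local maxima of R above a threshold; detectors such as FAST and BRISK replace this gradient test with faster intensity comparisons on a circle of pixels, which is why detector choice drives the TTC figures discussed above.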
Toward an Autonomous Lunar Landing Based on Low-Speed Optic Flow Sensors
For the last few decades, growing interest has returned to the challenging task of autonomous lunar landing. Soft landing of payloads on the lunar surface requires the development of new means of ensuring safe descent with strict final conditions and aerospace-related constraints in terms of mass, cost and computational resources. In this paper, a two-phase approach is presented: first, a biomimetic method inspired by the neuronal and sensory system of flying insects is presented as a solution to perform safe lunar landing. In order to design an autopilot relying only on optic flow (OF) and inertial measurements, an estimation method based on a two-sensor setup is introduced: these sensors allow us to accurately estimate the orientation of the velocity vector, which is mandatory to control the lander's pitch in a quasi-optimal way with respect to fuel consumption. Secondly, a new low-speed Visual Motion Sensor (VMS) inspired by insects' visual systems, performing local angular 1-D speed measurements ranging from 1.5°/s to 25°/s and weighing only 2.8 g, is presented. It was tested under free-flying outdoor conditions over various fields onboard an 80 kg unmanned helicopter. These preliminary results show that the optic flow measured, despite the complex disturbances encountered, closely matched the ground-truth optic flow.
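The two-sensor estimation idea can be illustrated with a simple flat-ground model (the geometry, symbols and numerical values below are assumptions for illustration, not the paper's parameters). A sensor gazing at angle θ from the local vertical, over flat ground at height h, sees flow ω = (V/D)·sin(θ − α), with D = h/cos θ and α the angle of the velocity vector from the vertical; the ratio of two such readings eliminates the unknown V/h and yields α:

```python
import numpy as np

def flow_measurement(theta, alpha, V, h):
    """Optic flow (rad/s) for a sensor gazing at angle `theta` from the
    local vertical over flat ground; the lander moves at speed V with its
    velocity vector at angle `alpha` from the vertical."""
    D = h / np.cos(theta)                 # distance to ground along the gaze
    return (V / D) * np.sin(theta - alpha)

def estimate_alpha(theta1, omega1, theta2, omega2):
    """Recover the velocity-vector orientation from two flow readings.
    Derived from omega_i = (V cos(theta_i) / h) sin(theta_i - alpha)."""
    r = omega1 / omega2
    num = np.cos(theta1) * np.sin(theta1) - r * np.cos(theta2) * np.sin(theta2)
    den = np.cos(theta1) ** 2 - r * np.cos(theta2) ** 2
    return np.arctan(num / den)

# Ground truth: descent at 30 m/s, velocity tilted 25 deg from vertical
alpha_true = np.deg2rad(25.0)
th1, th2 = np.deg2rad(10.0), np.deg2rad(40.0)
w1 = flow_measurement(th1, alpha_true, V=30.0, h=500.0)
w2 = flow_measurement(th2, alpha_true, V=30.0, h=500.0)
alpha_hat = estimate_alpha(th1, w1, th2, w2)
```

Note that neither V nor h appears in the estimator: only the two flow readings and the known gaze angles are needed, which is what makes the scheme viable under the stated mass and computational constraints.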
Taking Inspiration from Flying Insects to Navigate inside Buildings
These days, flying insects are seen as genuinely agile micro air vehicles fitted with smart sensors that are also parsimonious in their use of brain resources. They are able to visually navigate in unpredictable and GPS-denied environments. Understanding how such tiny animals work would help engineers to address several issues relating to drone miniaturization and navigation inside buildings. To turn a drone of ~1 kg into a robot, miniaturized conventional avionics can be employed; however, this results in a loss of flight autonomy. On the other hand, turning a drone with a mass between ~1 g (or less) and ~500 g into a robot requires an innovative approach taking inspiration from flying insects, both with regard to their flapping-wing propulsion system and their sensory system, based mainly on motion vision, in order to avoid obstacles in three dimensions or to navigate on the basis of visual cues. This chapter provides a snapshot of the current state of the art in the field of bioinspired optic flow sensors and optic flow-based direct feedback loops applied to micro air vehicles flying inside buildings.
Optic Flow Based Visual Guidance: From Flying Insects to Miniature Aerial Vehicles
Optic Flow Based Autopilots: Speed Control and Obstacle Avoidance
The explicit control schemes presented here explain how insects may navigate on the sole basis of optic flow (OF) cues, without requiring any distance or speed measurements: how they take off and land, follow the terrain, avoid the lateral walls in a corridor and control their forward speed automatically. The optic flow regulator, a feedback system controlling either the lift, the forward thrust or the lateral thrust, is described. Three OF regulators account for various insect flight patterns observed over the ground and over still water, under calm and windy conditions, and in straight and tapered corridors. These control schemes were simulated experimentally and/or implemented onboard two types of aerial robots, a micro helicopter (MH) and a hovercraft (HO), which behaved much like insects when placed in similar environments. These robots were equipped with opto-electronic OF sensors inspired by our electrophysiological findings on houseflies' motion-sensitive visual neurons. The simple, parsimonious control schemes described here require no conventional avionic devices such as range finders, groundspeed sensors or GPS receivers. They are consistent with the neural repertoire of flying insects and meet the low avionic payload requirements of autonomous micro aerial and space vehicles.
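The lift-controlling OF regulator can be caricatured in a few lines of simulation (gains, time step and the imposed forward-speed decay below are illustrative assumptions, not values from the work): hold the ventral optic flow ω = vx/h at a setpoint while forward speed decays, and height automatically tracks speed down toward touchdown, with descent speed proportional to height.

```python
# Minimal sketch of an optic-flow regulator for landing:
# forward speed decays (an imposed pitch-back maneuver), and a
# proportional feedback on climb rate holds omega = vx / h at its
# setpoint, so height shadows the decaying forward speed.
dt = 0.01            # s, integration step (illustrative)
omega_set = 1.0      # rad/s, ventral optic-flow setpoint (illustrative)
Kp = 2.0             # feedback gain (illustrative)
vx, h = 10.0, 12.0   # initial forward speed (m/s) and height (m)

for _ in range(1000):                  # 10 s of simulated flight
    vx += -0.5 * vx * dt               # imposed forward-speed decay
    omega = vx / h                     # ventral optic flow seen by the sensor
    h += Kp * (omega - omega_set) * dt # flow too high -> climb; too low -> descend
```

At equilibrium h = vx/ω_set, so as vx shrinks the craft descends in proportion, which is how a single flow feedback can produce a soft landing without any range finder or groundspeed sensor.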
Science, technology and the future of small autonomous drones
We are witnessing the advent of a new era of robots, drones, that can autonomously fly in natural and man-made environments. These robots, often associated with defence applications, could have a major impact on civilian tasks, including transportation, communication, agriculture, disaster mitigation and environment preservation. Autonomous flight in confined spaces presents great scientific and technical challenges, owing to the energetic cost of staying airborne and to the perceptual intelligence required to negotiate complex environments. We identify scientific and technological advances that are expected to translate, within appropriate regulatory frameworks, into pervasive use of autonomous drones for civilian applications.
Insect inspired visual motion sensing and flying robots
Flying insects are masters of visual motion sensing. They use dedicated motion-processing circuits at low energy and computational cost. Building on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed via a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes, one driving the eye's angular position in the robot and one driving the robot's body angular position with respect to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
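The steering-by-gazing idea can be sketched as two coupled first-order loops (the gains, time step and target bearing below are invented for illustration): a fast loop keeps the gaze locked on the target using only the retinal error, and a slower loop steers the body toward wherever the eye is pointing, so the heading converges on the target without the controller ever knowing its orientation in the world frame.

```python
import numpy as np

dt = 0.01
K_eye, K_body = 8.0, 2.0       # fast gaze loop, slower heading loop (illustrative)
target = np.deg2rad(40.0)      # target bearing in the world frame (unknown to the robot)
body, eye = 0.0, 0.0           # body heading and eye-in-body angle (rad)

for _ in range(1000):          # 10 s
    # Retinal error: where the target falls on the retina. This is the
    # only measurement used; `target` and `body` never appear separately.
    retinal_err = target - (body + eye)
    eye += K_eye * retinal_err * dt    # gaze loop: null the retinal error
    body += K_body * eye * dt          # heading loop: steer body toward the gaze
```

The closed-loop dynamics here are critically damped (s² + K_eye·s + K_eye·K_body = 0 has a double root at s = −4 with these gains), so the heading settles on the target bearing while the eye returns to center.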
The role of direction-selective visual interneurons T4 and T5 in Drosophila orientation behavior
In order to safely move through the environment, visually-guided animals
use several types of visual cues for orientation. Optic flow provides faithful
information about ego-motion and can thus be used to maintain a straight
course. Additionally, local motion cues or landmarks indicate potentially
interesting targets or signal danger, triggering approach or avoidance, respectively.
The visual system must reliably and quickly evaluate these cues
and integrate this information in order to orchestrate behavior. The underlying
neuronal computations for this remain largely inaccessible in higher
organisms, such as in humans, but can be studied experimentally in more
simple model species. The fly Drosophila, for example, heavily relies on
such visual cues during its impressive flight maneuvers. Additionally, it is
genetically and physiologically accessible. Hence, it can be regarded as an
ideal model organism for exploring neuronal computations during visual
processing.
In my PhD studies, I designed and built several autonomous virtual reality setups to precisely measure the visual behavior of walking flies. The setups run in open-loop and closed-loop configurations. In an open-loop experiment, the visual stimulus is clearly defined and does not depend on the behavioral response. Hence, it allows mapping of how specific features of simple visual stimuli are translated into behavioral output, which can guide the creation of computational models of visual processing. In closed-loop experiments, the behavioral response is fed back onto the visual stimulus, which permits characterization of the behavior under more realistic conditions and thus allows testing of the predictive power of the computational models.
In addition, Drosophila's genetic toolbox provides various strategies for targeting and silencing specific neuron types, which helps identify which cells are needed for a specific behavior. We focused on the visual interneuron types T4 and T5 and assessed their role in visual orientation behavior. These neurons build up a retinotopic array and cover the whole visual field of the fly. They constitute major output elements of the medulla and have long been speculated to be involved in motion processing.
This cumulative thesis consists of three published studies. In the first study, we silenced T4 and T5 neurons together and found that such flies were completely blind to any kind of motion. In particular, these flies could no longer perform the optomotor response; that is, they lost their normally innate following response to large-field moving patterns. This was an important finding, as it ruled out the contribution of any other system to motion vision-based behaviors. However, these flies were still able to fixate a black bar. We could show that this behavior is mediated by a T4/T5-independent flicker detection circuitry that exists in parallel to the motion system.
In the second study, T4 and T5 neurons were characterized via two-photon imaging, revealing that these cells are directionally selective and have temporal and orientation tuning properties very similar to those of the direction-selective neurons in the lobula plate. T4 and T5 cells responded in a contrast polarity-specific manner: T4 neurons responded selectively to ON edge motion, while T5 neurons responded only to OFF edge motion. When we blocked T4 neurons, behavioral responses to moving ON edges were more impaired than those to moving OFF edges, and the opposite was true for the T5 block. Hence, these findings confirmed that the contrast polarity-specific visual motion pathways, which start at the level of L1 (ON) and L2 (OFF), are maintained within the medulla, and that motion information is computed independently within each of these pathways.
Finally, in the third study, we used the virtual reality setups to probe the performance of an artificial microcircuit. The system was equipped with a camera and a spherical fisheye lens. Images were processed by an array of Reichardt detectors whose outputs were integrated in a manner similar to what is found in the lobula plate of flies. We presented the system with several rotating natural environments and found that the fly-inspired artificial system could accurately predict the axes of rotation.
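The Reichardt (Hassenstein-Reichardt) detector at the heart of that third study can be sketched schematically (this is a textbook re-implementation, not the published code; the low-pass time constant, receptor count and drifting grating are assumptions): each detector delays one receptor's signal, multiplies it with a neighbor's undelayed signal, and subtracts the mirror-symmetric product, so that spatially summed output is signed by motion direction.

```python
import numpy as np

def hr_correlator_array(signals, dt=0.01, tau=0.05):
    """1-D array of Hassenstein-Reichardt correlators.
    signals: (T, N) luminance samples from N neighboring photoreceptors.
    Returns the spatially summed (lobula-plate-like) output over time."""
    T, N = signals.shape
    lp = np.zeros(N)                 # first-order low-pass (the "delay") per receptor
    out = np.zeros(T)
    a = dt / (tau + dt)
    for k in range(T):
        s = signals[k]
        lp += a * (s - lp)           # delayed copy of each receptor signal
        # Mirror-opponent subunits between neighbors i and i+1:
        ew = lp[:-1] * s[1:]         # delayed(i) * instantaneous(i+1)
        we = s[:-1] * lp[1:]         # instantaneous(i) * delayed(i+1)
        out[k] = np.sum(ew - we)     # wide-field integration across the array
    return out

# A sinusoidal grating drifting across 32 receptors, in both directions
t = np.arange(0.0, 2.0, 0.01)[:, None]
phi = np.linspace(0.0, 2 * np.pi, 32)[None, :]
cw = np.sin(phi - 2 * np.pi * t)     # pattern drifting toward increasing phi
ccw = np.sin(phi + 2 * np.pi * t)    # pattern drifting the other way
r_cw = hr_correlator_array(cw).mean()
r_ccw = hr_correlator_array(ccw).mean()
```

Extending this 1-D ring to a camera image behind a fisheye lens, with outputs pooled along the preferred directions of lobula-plate-like integrators, is what lets such a system vote for an axis of rotation.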