
    A mouse sensor and a 2-pixel motion sensor exposed to continuous illuminance changes

    Considerable attention has been paid over the last decade to navigation systems based on visual optic flow cues, especially for guiding autonomous robots designed to travel under specific lighting conditions. In the present study, the performance of two visual motion sensors used to measure a local 1-D angular speed, namely (i) a bio-inspired 2-pixel motion sensor and (ii) an off-the-shelf mouse sensor, was tested for the first time over a wide range of illuminance levels. The sensors' characteristics were determined by recording their responses to a purely rotational optic flow, generated by rotating the sensors mechanically, and comparing these responses with the output of an accurate rate gyro. The refresh rate, a key parameter for future optic flow-based robotic applications, was also defined and tested for both sensors. The bio-inspired 2-pixel motion sensor was found to be more accurate indoors, whereas the mouse sensor was found to be more efficient outdoors.
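
    As a rough illustration of the "time of travel" principle used by such 2-pixel sensors, the sketch below estimates a local 1-D angular speed from the delay between contrast transitions detected by two neighbouring photoreceptors. It is a simplified, assumption-laden sketch: the inter-receptor angle, threshold and filtering are placeholder values, not those of the sensor described above.

```python
import numpy as np

def time_of_travel_speed(p1, p2, fs, delta_phi_deg=4.0, threshold=0.1):
    """Estimate a local 1-D angular speed (deg/s) from two neighbouring
    photoreceptor signals with a simplified "time of travel" scheme:
    detect when each (mean-removed) signal first crosses a threshold,
    then divide the inter-receptor angle by the measured delay.
    p1, p2 : 1-D arrays of photoreceptor samples; fs : sampling rate (Hz).
    """
    def first_crossing(x):
        x = x - np.mean(x)  # crude DC removal (placeholder for band-pass filtering)
        idx = np.where((x[:-1] < threshold) & (x[1:] >= threshold))[0]
        return int(idx[0]) if idx.size else None

    t1, t2 = first_crossing(p1), first_crossing(p2)
    if t1 is None or t2 is None or t1 == t2:
        return None                   # no usable contrast transition: no refresh
    delay = (t2 - t1) / fs            # signed delay between the two crossings (s)
    return delta_phi_deg / delay      # deg/s; the sign encodes the motion direction
```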

    A two-directional 1-gram visual motion sensor inspired by the fly's eye

    Optic flow based autopilots for Micro-Aerial Vehicles (MAVs) need lightweight, low-power sensors to be able to fly safely through unknown environments. The new tiny 6-pixel visual motion sensor presented here meets these demanding requirements in terms of mass, size and power consumption. This 1-gram, low-power, fly-inspired sensor accurately gauges the visual motion using only its 6-pixel array, as tested with two different panoramas and under various illuminance conditions. The new visual motion sensor's output results from a smart combination of the information collected by several 2-pixel Local Motion Sensors (LMSs), based on the "time of travel" scheme originally inspired by the common housefly's Elementary Motion Detector (EMD) neurons. The proposed sensory fusion method enables the new visual sensor to measure the visual angular speed and determine the main direction of the visual motion without any prior knowledge. By computing the median value of the outputs of several LMSs, we also obtained a more robust, more accurate and more frequently refreshed measurement of the 1-D angular speed.
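
    The median-based fusion step mentioned above can be sketched as follows (a minimal illustration; the number of LMSs and the refresh behaviour are assumptions, not the actual sensor firmware):

```python
import numpy as np

def fuse_lms_outputs(lms_measurements):
    """Fuse the angular-speed estimates of several 2-pixel Local Motion
    Sensors (LMSs) by taking their median: outliers from individual LMSs
    are rejected, and a fused value is available whenever at least one
    LMS has refreshed, giving a more robust and more frequently refreshed
    1-D angular speed estimate.
    lms_measurements : angular speeds in deg/s, with None where an LMS
    produced no new measurement during the current cycle.
    """
    valid = [m for m in lms_measurements if m is not None]
    if not valid:
        return None                    # no LMS refreshed its output this cycle
    return float(np.median(valid))

# Example: five LMS outputs, one outlier and one missing measurement
print(fuse_lms_outputs([62.0, 58.5, 140.0, None, 60.2]))   # -> 61.1
```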

    Insect inspired visual motion sensing and flying robots

    Flying insects are masters of visual motion sensing: they use dedicated motion processing circuits at low energy and computational cost. Building on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed by a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes driving the eye's angular position in the robot frame and the robot's body angular position with respect to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on a lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control approach.
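
    A highly simplified sketch of the "steering-by-gazing" idea follows: a fast loop rotates the eye to cancel the retinal position error of the target, and a slower loop steers the body so as to re-centre the eye in the body frame, with no reference to the robot's orientation in the global frame. The gains and signals below are illustrative assumptions, not the OSCAR II controller.

```python
def steering_by_gazing_step(retinal_error, gaze_angle, dt,
                            k_gaze=8.0, k_body=2.0):
    """One step of a simplified "steering-by-gazing" controller.
    retinal_error : target position error on the retina, relative to the
                    current gaze direction (rad)
    gaze_angle    : current eye-in-body orientation (rad)
    Returns the updated gaze angle and a body yaw-rate command (rad/s).
    """
    # Fast loop: rotate the eye to keep the target centred on the retina.
    gaze_angle += k_gaze * retinal_error * dt
    # Slow loop: yaw the body toward the gaze direction, which gradually
    # brings the eye back to its neutral position in the body frame while
    # the gaze stays locked on the target.
    body_yaw_rate_cmd = k_body * gaze_angle
    return gaze_angle, body_yaw_rate_cmd
```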

    Automatically calibrating the viewing direction of optic-flow sensors

    Because of their low weight, cost and energy consumption, optic-flow sensors attract growing interest in robotics for tasks such as self-motion estimation or depth measurement. Most applications require a large number of these sensors, which involves a fair amount of calibration work for each setup. In particular, the viewing direction of each sensor has to be measured for proper operation. This task is often cumbersome and prone to errors, and has to be carried out every time the setup is slightly modified. This paper proposes an algorithm for viewing-direction calibration relying on rate gyroscope readings and a recursive weighted linear least-squares estimation of the rotation matrix elements. The method only requires the user to perform random rotational motions of the setup by hand. The algorithm provides hints about the current precision of the estimation and about which motions should be performed to improve it. To assess the validity of the method, tests were performed on an experimental setup and the results compared to a precise manual calibration. The repeatability of the gyroscope-based calibration process reached ±1.7° per axis.
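
    The core of the calibration can be illustrated with a batch least-squares version of the idea (the paper uses a recursive weighted formulation with precision hints; the model and sign conventions below are simplifying assumptions): during pure rotations, each sensor's 2-D flow is a linear function of the gyro rates, and the fitted coefficients encode the sensor's orientation.

```python
import numpy as np

def estimate_viewing_direction(gyro, flow):
    """Estimate an optic-flow sensor's viewing direction from hand-held
    rotations only (batch least-squares sketch of the calibration idea).
    gyro : (N, 3) body angular rates (rad/s) recorded during random rotations
    flow : (N, 2) simultaneous 2-D optic-flow readings of the sensor (rad/s)
    Under pure rotation, flow_k ~= A @ gyro_k, where the two rows of A are
    (up to sign) rows of the body-to-sensor rotation matrix; their cross
    product therefore points along the sensor's viewing direction.
    """
    A_T, *_ = np.linalg.lstsq(gyro, flow, rcond=None)  # solves gyro @ A.T ~= flow
    A = A_T.T                                          # (2, 3) estimated matrix
    d = np.cross(A[0], A[1])                           # viewing direction (unnormalised)
    return d / np.linalg.norm(d)
```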

    optiPilot: control of take-off and landing using optic flow

    Take-off and landing manoeuvres are critical for MAVs because GPS-based autopilots usually do not perceive the distance to the ground or to other potential obstacles. In addition, attitude estimates based on inertial sensors are often perturbed by the strong accelerations occurring during launch. This paper shows how our previously developed control strategy, called optiPilot, can cope with take-off and landing using a small set of inexpensive optic flow sensors.

    Method for fabricating an artificial compound eye

    A method for fabricating an imaging system, the method comprising providing a flexible substrate (200), a first layer (220) comprising a plurality of microlenses (232), and a second layer (240) comprising a plurality of image sensors (242). The method further comprises stacking the first and the second layer (220; 240) onto the flexible substrate (200) by attaching the plurality of image sensors (242) to the flexible substrate, such that each of the plurality of microlenses (232) and image sensors (242) is aligned to form a plurality of optical channels (300), each optical channel comprising at least one microlens and at least one associated image sensor, and mechanically separating the optical channels (300) such that the separated optical channels remain attached to the flexible substrate (200) to form a mechanically flexible imaging system.

    The AirBurr: A Flying Robot That Can Exploit Collisions

    Research over the past decade has relied on increasingly complex methods and heavy platforms to achieve autonomous flight in cluttered environments. However, efficient behaviors can be found in nature where limited sensing is used, such as in insects progressing toward a light at night. Interestingly, their success is based on their ability to recover from the numerous collisions happening along their imperfect flight path. The goal of the AirBurr project is to take inspiration from these insects and develop a new class of flying robots that can recover from collisions and even exploit them. Such robots are designed to be robust to crashes and can take off again without human intervention. They navigate in a reactive way and, unlike conventional approaches, they do not need heavy modelling in order to fly autonomously. We believe that this new paradigm will bring flying robots out of the laboratory and allow them to tackle unstructured, cluttered environments. This paper presents the vision of the AirBurr project, as well as the latest results on the design of a platform capable of sustaining collisions and recovering on its own after crashes.

    Vision-based control of near-obstacle flight

    Lightweight micro unmanned aerial vehicles (micro-UAVs) capable of autonomous flight in natural and urban environments have a large potential for civil and commercial applications, including environmental monitoring, forest fire monitoring, homeland security, traffic monitoring, aerial imagery, mapping, and search and rescue. Smaller micro-UAVs capable of flying inside houses or small indoor environments have further applications in surveillance, search and rescue, and entertainment. These applications require the capability to fly close to the ground and amongst obstacles. Existing UAVs rely on GPS and an AHRS (attitude and heading reference system) to control their flight and are unable to detect and avoid obstacles. Active distance sensors such as radars or laser range finders could be used to measure distances to obstacles, but are typically too heavy and power-consuming to be embedded on lightweight systems. In this thesis, we draw inspiration from biology and explore alternative approaches to flight control that allow aircraft to fly near obstacles. We show that optic flow can be used on flying platforms to estimate the proximity of obstacles, and we propose a novel control strategy, called optiPilot, for vision-based near-obstacle flight. Thanks to optiPilot, we demonstrate for the first time autonomous near-obstacle flight of micro-UAVs, both indoors and outdoors, without relying on an AHRS or external beacons such as GPS. The control strategy only requires a small set of optic flow sensors, two rate gyroscopes and an airspeed sensor, and it runs in real time on a tiny embedded microcontroller. Despite its simplicity, optiPilot is able to fully control the aircraft, including altitude regulation, attitude stabilisation, obstacle avoidance, landing and take-off. This parsimony, inherited from the biology of flying insects, contrasts with the complexity of the systems used so far for flight control, while offering more capabilities. The results presented in this thesis contribute to a better understanding of the minimal requirements, in terms of sensing and control architecture, that enable animals and artificial systems to fly, and they bring closer to reality the prospect of using lightweight and inexpensive micro-UAVs for civilian purposes.
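
    The optic-flow geometry underlying this kind of control can be sketched as follows (a generic textbook-style derivation under simplifying assumptions, not code from the thesis): the rotational flow predicted from the rate gyroscopes is subtracted from the raw measurement, and the remaining translational flow, combined with the airspeed, gives the proximity of the surface seen by each sensor.

```python
import numpy as np

def obstacle_proximity(raw_flow, omega, airspeed, view_dir, flow_axis):
    """Estimate the distance to the surface seen by one optic-flow sensor.
    raw_flow  : measured flow along the sensor's measurement axis (rad/s)
    omega     : body angular rates from the rate gyroscopes, shape (3,) (rad/s)
    airspeed  : forward speed from the airspeed sensor (m/s)
    view_dir  : unit viewing direction of the sensor in the body frame
    flow_axis : unit vector along which the sensor measures flow
    """
    # 1) Derotation: remove the rotation-induced flow predicted by the gyros.
    rot_flow = np.dot(np.cross(-np.asarray(omega), view_dir), flow_axis)
    trans_flow = raw_flow - rot_flow
    if abs(trans_flow) < 1e-6:
        return np.inf      # surface too far away, or aligned with the motion
    # 2) Translational flow magnitude is v * sin(theta) / D, where theta is
    #    the angle between the flight direction (body x-axis) and the viewing
    #    direction; assuming the flow axis is aligned with the translational
    #    flow, the distance D follows directly.
    sin_theta = np.linalg.norm(np.cross(np.array([1.0, 0.0, 0.0]), view_dir))
    return airspeed * sin_theta / abs(trans_flow)
```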

    The role of direction-selective visual interneurons T4 and T5 in Drosophila orientation behavior

    In order to move safely through the environment, visually guided animals use several types of visual cues for orientation. Optic flow provides faithful information about ego-motion and can thus be used to maintain a straight course. Additionally, local motion cues or landmarks indicate potentially interesting targets or signal danger, triggering approach or avoidance, respectively. The visual system must reliably and quickly evaluate these cues and integrate this information in order to orchestrate behavior. The underlying neuronal computations remain largely inaccessible in higher organisms, such as humans, but can be studied experimentally in simpler model species. The fly Drosophila, for example, relies heavily on such visual cues during its impressive flight maneuvers and is, in addition, genetically and physiologically accessible. Hence, it can be regarded as an ideal model organism for exploring neuronal computations during visual processing. In my PhD studies, I designed and built several autonomous virtual reality setups to precisely measure the visual behavior of walking flies. The setups run in open-loop and closed-loop configurations. In an open-loop experiment, the visual stimulus is clearly defined and does not depend on the behavioral response; this allows mapping of how specific features of simple visual stimuli are translated into behavioral output, which can guide the creation of computational models of visual processing. In closed-loop experiments, the behavioral response is fed back onto the visual stimulus, which permits characterization of the behavior under more realistic conditions and thus allows testing of the predictive power of the computational models. In addition, Drosophila's genetic toolbox provides various strategies for targeting and silencing specific neuron types, which helps identify which cells are needed for a specific behavior. We focused on the visual interneuron types T4 and T5 and assessed their role in visual orientation behavior. These neurons build up a retinotopic array and cover the whole visual field of the fly. They constitute major output elements of the medulla and have long been speculated to be involved in motion processing. This cumulative thesis consists of three published studies. In the first study, we silenced both T4 and T5 neurons together and found that such flies were completely blind to any kind of motion. In particular, these flies could no longer perform an optomotor response, meaning that they lost their normally innate following response to large-field moving patterns. This was an important finding, as it ruled out the contribution of any other system to motion vision-based behaviors. However, these flies were still able to fixate a black bar. We could show that this behavior is mediated by a T4/T5-independent flicker detection circuitry that exists in parallel to the motion system. In the second study, T4 and T5 neurons were characterized via two-photon imaging, revealing that these cells are directionally selective and have temporal and orientation tuning properties very similar to those of direction-selective neurons in the lobula plate. T4 and T5 cells responded in a contrast polarity-specific manner: T4 neurons responded selectively to moving ON edges, while T5 neurons responded only to moving OFF edges. When we blocked T4 neurons, behavioral responses to moving ON edges were more impaired than those to moving OFF edges, and the opposite was true for the T5 block. Hence, these findings confirmed that the contrast polarity-specific visual motion pathways, which start at the level of L1 (ON) and L2 (OFF), are maintained within the medulla and that motion information is computed twice, independently within each of these pathways. Finally, in the third study, we used the virtual reality setups to probe the performance of an artificial microcircuit. The system was equipped with a camera and a spherical fisheye lens. Images were processed by an array of Reichardt detectors whose outputs were integrated in a way similar to what is found in the lobula plate of flies. We provided the system with several rotating natural environments and found that the fly-inspired artificial system could accurately predict the axes of rotation.
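
    For the third study, the elementary building block of such an artificial microcircuit can be sketched as a Hassenstein-Reichardt correlator (the time constant, sampling and filter order below are illustrative assumptions, not the parameters of the published model):

```python
import numpy as np

def reichardt_detector(left, right, fs, tau=0.05):
    """Simplified Hassenstein-Reichardt correlator for two neighbouring
    photoreceptor signals sampled at fs Hz.
    Each input is delayed by a first-order low-pass filter (time constant
    tau) and multiplied with the undelayed signal of the neighbouring
    input; the difference of the two mirror-symmetric half-detectors
    yields a direction-selective output whose sign encodes the motion
    direction.
    """
    alpha = (1.0 / fs) / (tau + 1.0 / fs)     # discrete low-pass coefficient

    def lowpass(x):
        y = np.zeros(len(x))
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y

    return lowpass(left) * np.asarray(right) - lowpass(right) * np.asarray(left)
```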