
    A Robust Analog VLSI Motion Sensor Based on the Visual System of the Fly

    Sensing visual motion gives a creature valuable information about its interactions with the environment. Flies in particular use visual motion information to navigate through turbulent air, avoid obstacles, and land safely. Mobile robots are ideal candidates for using this sensory modality to enhance their performance, but so far have been limited by the computational expense of processing video. Also, the complex structure of natural visual scenes poses an algorithmic challenge for extracting useful information in a robust manner. We address both issues by creating a small, low-power visual sensor with integrated analog parallel processing to extract motion in real time. Because our architecture is based on biological motion detectors, we gain the advantages of this highly evolved system: a design that robustly and continuously extracts relevant information from its visual environment. We show that this sensor is suitable for use in the real world, and demonstrate its ability to compensate for an imperfect motor system in the control of an autonomous robot. The sensor attenuates open-loop rotation by a factor of 31 with less than 1 mW power dissipation.
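
    As background for the correlation-based architecture described above, here is a minimal discrete-time sketch of a Hassenstein-Reichardt elementary motion detector in Python. It illustrates the generic delay-and-correlate principle only; the filter constants and signals are arbitrary assumptions, not the analog circuit reported in the paper.

```python
import numpy as np

def lowpass(x, tau, dt=1.0):
    """First-order low-pass filter, used here as the EMD delay element."""
    y = np.zeros_like(x, dtype=float)
    a = dt / (tau + dt)
    for t in range(1, len(x)):
        y[t] = y[t - 1] + a * (x[t] - y[t - 1])
    return y

def reichardt_emd(p_left, p_right, tau=5.0):
    """Opponent correlator: delayed-left * right minus delayed-right * left.
    A positive mean output indicates motion from left to right."""
    return lowpass(p_left, tau) * p_right - lowpass(p_right, tau) * p_left

# Example: a signal that reaches the right photoreceptor 3 samples after the left
# one (rightward motion) yields a positive mean EMD response.
t = np.arange(200)
p_left = np.sin(2 * np.pi * 0.05 * t)
p_right = np.sin(2 * np.pi * 0.05 * (t - 3))
print(reichardt_emd(p_left, p_right).mean())   # > 0
```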

    A Robust Analog VLSI Reichardt Motion Sensor

    Silicon imagers with integrated motion-detection circuitry have been developed and tested for the past 15 years. Many previous circuits estimate motion by identifying and tracking spatial or temporal features. These approaches are prone to failure under low-SNR conditions, where feature detection becomes unreliable. An alternative approach to motion detection is an intensity-based spatiotemporal correlation algorithm, such as the one proposed by Hassenstein and Reichardt in 1956 to explain aspects of insect vision. We implemented a Reichardt motion sensor with integrated photodetectors in a standard CMOS process. Our circuit operates at sub-microwatt power levels, the lowest reported for any motion sensor. We measure the effects of device mismatch on these parallel, analog circuits to show they are suitable for constructing 2-D VLSI arrays. Traditional correlation-based sensors suffer from strong contrast dependence. We introduce a circuit architecture that lessens this dependence. We also demonstrate robust performance of our sensor to complex stimuli in the presence of spatial and temporal noise.
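
    The contrast dependence mentioned in the abstract can be made concrete with a small numerical sketch: a plain correlator's mean output grows roughly with the square of stimulus contrast, and dividing by local signal power (a generic normalisation used here purely for illustration, not the circuit architecture the paper introduces) largely removes that dependence.

```python
import numpy as np

def correlator_response(contrast, omega=0.3, delay=3, n=2000):
    """Mean output of an ideal delay-and-correlate detector for a drifting sinusoid."""
    t = np.arange(n)
    left = contrast * np.sin(omega * t)
    right = contrast * np.sin(omega * (t - delay))
    raw = np.mean(np.roll(left, delay) * right - np.roll(right, delay) * left)
    power = np.mean(left ** 2) + np.mean(right ** 2)
    return raw, raw / power          # un-normalised vs. power-normalised output

for c in (0.1, 0.5, 1.0):
    raw, norm = correlator_response(c)
    print(f"contrast={c:.1f}  raw={raw:.4f}  normalised={norm:.4f}")
# raw scales roughly with contrast**2; the normalised value stays nearly constant.
```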

    A two-directional 1-gram visual motion sensor inspired by the fly's eye

    Optic flow based autopilots for Micro-Aerial Vehicles (MAVs) need lightweight, low-power sensors to be able to fly safely through unknown environments. The new tiny 6-pixel visual motion sensor presented here meets these demanding requirements in terms of its mass, size and power consumption. This 1-gram, low-power, fly-inspired sensor accurately gauges visual motion using only its 6-pixel array, as tested with two different panoramas and illuminance conditions. The new visual motion sensor's output results from a smart combination of the information collected by several 2-pixel Local Motion Sensors (LMSs), based on the "time of travel" scheme originally inspired by the common housefly's Elementary Motion Detector (EMD) neurons. The proposed sensory fusion method enables the new visual sensor to measure the visual angular speed and determine the main direction of the visual motion without any prior knowledge. By computing the median value of the output from several LMSs, we also obtained a more robust, more accurate and more frequently refreshed measurement of the 1-D angular speed.
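
    As a rough illustration of the "time of travel" scheme and the median-based fusion described above, the sketch below estimates angular speed from the delay between threshold crossings on two neighbouring pixels and then takes the median over several local motion sensors. The threshold, sampling step and inter-receptor angle are illustrative assumptions, not values from the actual sensor.

```python
import numpy as np

def time_of_travel_speed(sig_a, sig_b, dt, delta_phi_deg, threshold=0.5):
    """Angular speed (deg/s) from the time a contrast edge takes to travel from
    pixel A to pixel B (assumes both signals actually cross the threshold)."""
    t_a = np.argmax(sig_a > threshold)   # first threshold crossing on pixel A
    t_b = np.argmax(sig_b > threshold)   # first threshold crossing on pixel B
    travel_time = (t_b - t_a) * dt
    if travel_time <= 0:
        return None                      # edge not seen in the expected order
    return delta_phi_deg / travel_time

def fuse_lms(speeds):
    """Median fusion over several 2-pixel LMS outputs, ignoring invalid ones."""
    valid = [s for s in speeds if s is not None]
    return float(np.median(valid)) if valid else None
```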

    Biologically inspired analog IC for visual collision detection

    We have designed and tested a single-chip analog VLSI sensor that detects imminent collisions by measuring radially expanding optic flow. The design of the chip is based on a model proposed to explain leg-extension behavior in flies during landing approaches. We evaluated a detailed version of this model in simulation using a library of 50 test movies taken through a fisheye lens. The algorithm was evaluated on its ability to distinguish movies ending in collisions from movies in which no collision occurred. This biologically inspired algorithm is capable of 94% correct performance in this task using an ultra-low-resolution (132-pixel) image as input. A new elementary motion detector (EMD) circuit was developed to measure optic flow on a CMOS focal-plane sensor. This EMD circuit models the bandpass nature of large monopolar cells (LMCs) immediately postsynaptic to photoreceptors in the fly visual system, as well as a saturating multiplication operation proposed for Reichardt-type motion detectors. A 16 x 16 array of two-dimensional motion detectors was fabricated in a standard 0.5 µm CMOS process. The chip consumes 140 µW of power from a 5 V supply. With the addition of wide-angle optics, the sensor is able to detect collisions 100-400 ms before impact in complex, real-world scenes.
    Index Terms: CMOS imager, collision detection, Gilbert multiplier, insect vision, neuromorphic systems, optic flow, smart sensor.
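
    The processing chain sketched in the abstract (an LMC-like band-pass front end, a saturating multiplication, and a pooled expansion cue) can be approximated in software as follows. This is a loose functional analogue with made-up filter constants and a tanh saturation standing in for the circuit's behaviour; it is not a model of the fabricated chip.

```python
import numpy as np

def lowpass(x, tau, dt=1.0):
    """First-order low-pass filter."""
    y = np.zeros_like(x, dtype=float)
    a = dt / (tau + dt)
    for t in range(1, len(x)):
        y[t] = y[t - 1] + a * (x[t] - y[t - 1])
    return y

def lmc_bandpass(x, tau_fast=2.0, tau_slow=20.0):
    """Band-pass temporal prefilter modelling large monopolar cells (LMCs)."""
    return lowpass(x, tau_fast) - lowpass(x, tau_slow)

def saturating_emd(p1, p2, tau_delay=5.0):
    """Reichardt-style correlator with a saturating (tanh) multiplication stage."""
    f1, f2 = lmc_bandpass(p1), lmc_bandpass(p2)
    return np.tanh(lowpass(f1, tau_delay) * f2) - np.tanh(lowpass(f2, tau_delay) * f1)

def collision_cue(radial_emd_outputs, threshold=4.0):
    """Flag an imminent collision when the summed outward (expanding) flow
    across all radially oriented EMDs exceeds a threshold."""
    expansion = sum(np.clip(out, 0.0, None).mean() for out in radial_emd_outputs)
    return expansion > threshold
```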

    Insect inspired visual motion sensing and flying robots

    Flying insects excel at visual motion sensing, using dedicated motion processing circuits at low energy and computational cost. Building on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed by a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes, one driving the eye's angular position relative to the robot and the other driving the robot's body angular position with respect to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
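
    The "steering-by-gazing" idea lends itself to a compact schematic: one loop servoes the eye to cancel retinal error, while a second loop steers the body toward the current eye-in-body angle, so the robot ends up heading for whatever the gaze has locked onto. The sketch below is a generic two-loop proportional controller with arbitrary gains, not the actual OSCAR II control law.

```python
def steering_by_gazing_step(retinal_error, eye_in_body, k_eye=2.0, k_body=0.5, dt=0.01):
    """One control step: returns the updated eye-in-body angle and the commanded
    body heading rate (angles in radians, rates in rad/s)."""
    eye_rate = k_eye * retinal_error        # rotate the eye to cancel retinal slip
    heading_rate = k_body * eye_in_body     # steer the body toward the gaze direction
    eye_in_body = eye_in_body + (eye_rate - heading_rate) * dt
    return eye_in_body, heading_rate
```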

    Analog VLSI implementation of a visual interneuron: enhanced sensory processing through biophysical modeling

    Flies are capable of rapid, coordinated flight through unstructured environments. This flight is guided by visual motion information that is extracted from photoreceptors in a robust manner. One feature of the fly's visual processing that adds to this robustness is the saturation of wide-field motion-sensitive neuron responses with increasing pattern size. This makes the cell's responses less dependent on the sparseness of the optic flow field while retaining motion information. By implementing a compartmental neuronal model in silicon, we add this "gain control" to an existing analog VLSI model of fly vision. This results in enhanced performance in a compact, low-power CMOS motion sensor. Our silicon system also demonstrates that modern, biophysically detailed models of neural sensory processing systems can be instantiated in VLSI hardware.
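
    The "gain control" described above, saturation of the pooled response as more of the visual field is stimulated, emerges naturally when local motion signals drive a single passive compartment through conductance-based synapses. The sketch below illustrates that mechanism with arbitrary parameters; it is not the silicon implementation.

```python
import numpy as np

def widefield_response(emd_outputs, g_syn=1.0, g_leak=5.0, e_exc=1.0):
    """Steady-state membrane potential of a one-compartment neuron whose excitatory
    conductance is proportional to the summed (rectified) local motion signals."""
    g_total = g_syn * np.sum(np.clip(emd_outputs, 0.0, None))
    return (g_total * e_exc) / (g_total + g_leak)

# Doubling or tenfold-increasing the number of active motion detectors does not
# scale the response proportionally:
print(widefield_response(np.full(10, 0.5)))    # ~0.50
print(widefield_response(np.full(100, 0.5)))   # ~0.91, saturating toward e_exc
```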

    Research issues in biological inspired sensors for flying robots

    Biologically inspired robotics is an area of increasing research and development. In spite of all the recent engineering advances, robots still lack capabilities with respect to agility, adaptability, intelligent sensing, fault tolerance, stealth, and utilization of in-situ resources for power when compared to biological organisms. The general premise of bio-inspired engineering is to distill the principles behind successful, nature-tested mechanisms, selecting features and functional behaviors that can be captured through biomechatronic designs and minimalist operating principles drawn from nature's successful strategies. Based on these concepts, robotics researchers are interested in understanding the sensory aspects that would be required to mimic nature's designs with engineering solutions. This paper analyses developments in this area and discusses the research aspects that require further study.

    Exploiting Device Mismatch in Neuromorphic VLSI Systems to Implement Axonal Delays

    Sheik S, Chicca E, Indiveri G. Exploiting Device Mismatch in Neuromorphic VLSI Systems to Implement Axonal Delays. Presented at the International Joint Conference on Neural Networks (IJCNN), Brisbane, Australia.

    Axonal delays are used in neural computation to implement faithful models of biological neural systems, and in spiking neural network models to solve computationally demanding tasks. While there is an increasing number of software simulations of spiking neural networks that make use of axonal delays, only a small fraction of currently existing hardware neuromorphic systems supports them. In this paper we demonstrate a strategy for implementing temporal delays in hardware spiking neural networks distributed across multiple Very Large Scale Integration (VLSI) chips. This is achieved by exploiting the inherent device mismatch present in the analog circuits that implement silicon neurons and synapses inside the chips, and the digital communication infrastructure used to configure the network topology and transmit the spikes across chips. We present an example of a recurrent VLSI spiking neural network that employs axonal delays and demonstrate how the proposed strategy efficiently implements them in hardware.
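
    The strategy can be pictured as each inter-chip connection carrying a fixed but connection-specific transmission delay, with the heterogeneity supplied "for free" by analog device mismatch. The sketch below models only that effect in software, using an assumed delay distribution; it does not reproduce the chips' actual spike-routing infrastructure.

```python
import heapq
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post = 4, 4
# Mismatch-induced "axonal" delays: fixed per connection, heterogeneous across
# connections (mean and spread are illustrative assumptions).
delays = rng.normal(loc=5e-3, scale=1e-3, size=(n_pre, n_post)).clip(min=1e-3)

def deliver_spikes(spike_times_pre):
    """Route every presynaptic spike to all postsynaptic targets with its
    connection-specific delay; return time-ordered (arrival_time, pre, post) events."""
    events = []
    for pre, times in enumerate(spike_times_pre):
        for t in times:
            for post in range(n_post):
                heapq.heappush(events, (t + delays[pre, post], pre, post))
    return [heapq.heappop(events) for _ in range(len(events))]

arrivals = deliver_spikes([[0.00], [0.01], [], [0.02]])
```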