
    Space optical instruments optimisation thanks to CMOS image sensor technology

    Today, both CCD and CMOS sensors can be considered for nearly all visible-light sensors and instruments designed for space needs. Indeed, detectors built with both technologies achieve excellent electro-optical performance, and the selection of the most suitable device is driven by their functional and technological features and limits. The first part of the paper presents electro-optical characterisation results of CMOS Image Sensors (CIS) built with an optimised CMOS process, demonstrating the large improvements in CIS electro-optical performance. The second part reviews the advantages of CMOS technology for space applications, illustrated by examples of CIS developments performed by EADS Astrium and Supaéro/CIMI for current and near-term space programs.
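    The trade-off between CCD and CIS devices ultimately rests on measured electro-optical parameters such as quantum efficiency, dark current, and read noise. As a rough illustration of how these figures combine (a generic sketch, not a result from the paper; all parameter values below are made up), here is the standard shot/dark/read-noise model of per-pixel SNR:

```python
import math

def pixel_snr(photon_flux, qe, t_int, dark_current, read_noise):
    """Per-pixel SNR from the usual shot/dark/read-noise model.

    photon_flux  -- incident photons per pixel per second
    qe           -- quantum efficiency (0..1)
    t_int        -- integration time in seconds
    dark_current -- dark signal in electrons per second
    read_noise   -- read noise in electrons RMS
    """
    signal = photon_flux * qe * t_int   # photo-electrons collected
    shot_var = signal                   # Poisson shot-noise variance
    dark_var = dark_current * t_int     # dark-current shot-noise variance
    return signal / math.sqrt(shot_var + dark_var + read_noise**2)

# Illustrative values only, not measurements from the paper:
print(f"SNR = {pixel_snr(1e5, 0.6, 0.01, 50.0, 10.0):.1f}")  # ~22.7
```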

    An Event-Driven Multi-Kernel Convolution Processor Module for Event-Driven Vision Sensors

    Event-driven vision sensing is a new way of sensing visual reality in a frame-free manner. That is, the vision sensor (camera) does not capture a sequence of still frames, as in conventional video and computer vision systems. In event-driven sensors, each pixel autonomously and asynchronously decides when to send its address out, so the sensor output is a continuous stream of address events that represents the scene dynamically, continuously, and without being constrained to frames. In this paper we present an Event-Driven Convolution Module for computing 2D convolutions on such event streams. The module has been designed so that many of them can be assembled into modular and hierarchical convolutional neural networks for robust shape- and pose-invariant object recognition. The module has multi-kernel capability: it selects the convolution kernel depending on the origin of each event. A proof-of-concept test prototype has been fabricated in a 0.35 μm CMOS process, and extensive experimental results are provided. The Convolution Processor has also been combined with an Event-Driven Dynamic Vision Sensor (DVS) for high-speed recognition examples. The chip can discriminate propellers rotating at 2,000 revolutions per second, detect symbols on a 52-card deck when browsing all cards in 410 ms, or detect and follow the center of a phosphor oscilloscope trace rotating at 5 kHz.
    Funding: Unión Europea 216777 (NABAB); Ministerio de Ciencia e Innovación TEC2009-10639-C04-0
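    As a rough software analogue of the behaviour described above (the real module is mixed-signal hardware, so the names and simplifications here are hypothetical), the sketch below processes address events one at a time, selects a kernel by event origin, and fires an output event whenever an accumulator cell crosses a threshold; leakage/forgetting and other details of the actual chip are omitted:

```python
import numpy as np

class EventConvModule:
    """Toy event-driven convolution with multi-kernel capability."""

    def __init__(self, shape, kernels, threshold):
        self.acc = np.zeros(shape)   # per-pixel accumulators
        self.kernels = kernels       # dict: event origin -> 2D kernel
        self.threshold = threshold

    def on_event(self, x, y, src):
        """Process one address event at (x, y) from source `src`."""
        k = self.kernels[src]        # multi-kernel: pick by event origin
        kh, kw = k.shape
        out = []
        for i in range(kh):
            for j in range(kw):
                xi, yj = x - kh // 2 + i, y - kw // 2 + j
                if 0 <= xi < self.acc.shape[0] and 0 <= yj < self.acc.shape[1]:
                    self.acc[xi, yj] += k[i, j]
                    if abs(self.acc[xi, yj]) >= self.threshold:
                        out.append((xi, yj))      # emit output event
                        self.acc[xi, yj] = 0.0    # reset after firing
        return out
```

    Feeding the output events of one such module into further modules is what allows the modular, hierarchical convolutional networks mentioned in the abstract to be assembled.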

    Independent Motion Detection with Event-driven Cameras

    Unlike standard cameras that send intensity images at a constant frame rate, event-driven cameras asynchronously report pixel-level brightness changes, offering low latency and high temporal resolution (both on the order of microseconds). As such, they have great potential for fast, low-power vision algorithms for robots. Visual tracking, for example, is easily achieved even for very fast stimuli, as only moving objects cause brightness changes. However, a camera mounted on a moving robot is typically non-stationary, and the same tracking problem becomes confounded by background clutter events due to the robot's ego-motion. In this paper, we propose a method for segmenting the motion of an independently moving object with event-driven cameras. Our method detects and tracks corners in the event stream and learns the statistics of their motion as a function of the robot's joint velocities when no independently moving objects are present. During robot operation, independently moving objects are identified by discrepancies between the corner velocities predicted from ego-motion and the measured corner velocities. We validate the algorithm on data collected from the neuromorphic iCub robot. We achieve a precision of ~90% and show that the method is robust to changes in speed of both the head and the target.
    Comment: 7 pages, 6 figures
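    A hypothetical sketch of the core idea, with a plain linear least-squares map from joint velocities to corner image velocity standing in for the learned motion statistics (the paper's actual model and thresholding may differ):

```python
import numpy as np

def fit_ego_model(joint_vels, corner_vels):
    """Fit corner image velocity as a linear function of joint velocities.

    joint_vels  -- (N, J) robot joint velocities, recorded while no
                   independently moving object is present
    corner_vels -- (N, 2) measured corner image velocities
    """
    W, *_ = np.linalg.lstsq(joint_vels, corner_vels, rcond=None)
    return W  # (J, 2) linear map from joint space to image velocity

def is_independent(W, joint_vel, measured_vel, thresh=2.0):
    """Flag a corner whose motion deviates from the ego-motion prediction."""
    predicted = joint_vel @ W
    return np.linalg.norm(measured_vel - predicted) > thresh
```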

    Event-based Vision: A Survey

    Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes and output a stream of events that encode the time, location, and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz), resulting in reduced motion blur. Hence, event cameras have large potential for robotics and computer vision in scenarios that are challenging for traditional cameras, such as those demanding low latency, high speed, and high dynamic range. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras from their working principle, the actual sensors that are available, and the tasks they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
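    To make the event encoding concrete, here is a minimal sketch (generic, not taken from the survey) of an event record and one common way to turn the asynchronous stream into a frame-like representation by accumulating polarities over a time window:

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float  # timestamp (microsecond resolution)
    x: int    # pixel column
    y: int    # pixel row
    p: int    # polarity: +1 brightness increase, -1 decrease

def accumulate(events, width, height, t0, t1):
    """Sum event polarities per pixel over the window [t0, t1)."""
    frame = [[0] * width for _ in range(height)]
    for e in events:
        if t0 <= e.t < t1:
            frame[e.y][e.x] += e.p
    return frame
```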

    The Palomar Testbed Interferometer

    The Palomar Testbed Interferometer (PTI) is a long-baseline infrared interferometer located at Palomar Observatory, California. It was built as a testbed for interferometric techniques applicable to the Keck Interferometer. First fringes were obtained in July 1995. PTI implements a dual-star architecture, tracking two stars simultaneously for phase referencing and narrow-angle astrometry. The three fixed 40-cm apertures can be combined pair-wise to provide baselines of up to 110 m. The interferometer actively tracks the white-light fringe using an array detector at 2.2 μm and active delay lines with a range of +/- 38 m. Laser metrology of the delay lines allows for servo control, and laser metrology of the complete optical path enables narrow-angle astrometric measurements. The instrument is highly automated, using a multiprocessing computer system for instrument control and sequencing.
    Comment: ApJ, in press (Jan 99). Fig. 1 available from http://huey.jpl.nasa.gov/~bode/ptiPicture.html; revised during copy editing
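    For intuition about the delay-line range quoted above, here is a toy calculation (illustrative geometry and numbers, not from the paper) of the geometric optical-path delay B · ŝ that the delay lines must compensate for a given baseline and source direction:

```python
import numpy as np

def geometric_delay(baseline, s_hat):
    """Geometric optical-path delay (m) for unit source vector s_hat."""
    return float(np.dot(baseline, s_hat))

# Toy example: a 110-m east-west baseline and a source 30 deg from zenith.
B = np.array([110.0, 0.0, 0.0])  # baseline vector (east, north, up), meters
theta = np.radians(30.0)
s = np.array([np.sin(theta), 0.0, np.cos(theta)])
print(f"delay = {geometric_delay(B, s):.1f} m")  # 55.0 m for this geometry
```

    Comparing such delays with the +/- 38 m delay-line range gives a rough sense of how the accessible sky depends on baseline length and orientation.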

    The Cosmic Background Imager

    Design and performance details are given for the Cosmic Background Imager (CBI), an interferometer array that is measuring the power spectrum of fluctuations in the cosmic microwave background radiation (CMBR) for multipoles in the range 400 < ℓ < 3500. The CBI is located at an altitude of 5000 m in the Atacama Desert in northern Chile. It is a planar synthesis array with thirteen 0.9-m diameter antennas on a 6-m diameter tracking platform. Each antenna has a cooled, low-noise receiver operating in the 26-36 GHz band. Signals are cross-correlated in an analog filterbank correlator with ten 1-GHz bands, allowing spectral-index measurements that can be used to distinguish CMBR signals from diffuse galactic foregrounds. A 1.2 kHz, 180-degree phase-switching scheme is used to reject cross-talk and low-frequency pick-up in the signal-processing system. The CBI has a 3-axis mount that allows the tracking platform to be rotated about the optical axis, providing improved (u,v) coverage and a powerful discriminant against false signals generated in the receiving electronics. Rotating the tracking platform also permits polarization measurements when some of the antennas are configured for the orthogonal polarization.
    Comment: 14 pages. Accepted for publication in PASP. See also http://www.astro.caltech.edu/~tjp/CBI
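    To illustrate why 180-degree phase switching rejects unswitched contamination, here is a toy lock-in style demodulation (illustrative values, not the CBI signal-processing chain): the correlated signal is modulated by a +/-1 switch pattern, so multiplying by the same pattern and averaging recovers it while constant cross-talk and pick-up average toward zero:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4096
switch = np.where(np.arange(n) % 2 == 0, 1.0, -1.0)  # 180-deg phase switch
signal = 0.5       # true correlated signal (arbitrary units)
crosstalk = 3.0    # unswitched contaminant
data = signal * switch + crosstalk + rng.normal(0.0, 1.0, n)

demod = np.mean(data * switch)  # demodulate with the same pattern
print(f"recovered ~= {demod:.2f}")  # near 0.5; the cross-talk is rejected
```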