
    Computation of Smooth Optical Flow in a Feedback Connected Analog Network

    In 1986, Tanner and Mead \cite{Tanner_Mead86} implemented an interesting constraint satisfaction circuit for global motion sensing in aVLSI. We report here a new and improved aVLSI implementation that provides smooth optical flow as well as global motion in a two-dimensional visual field. The computation of optical flow is an ill-posed problem, which expresses itself as the aperture problem. However, optical flow can be estimated using regularization methods, in which additional constraints are introduced in the form of a global energy functional that must be minimized. We show how the algorithmic constraints of Horn and Schunck \cite{Horn_Schunck81} on computing smooth optical flow can be mapped onto the physical constraints of an equivalent electronic network.
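    The Horn-Schunck scheme the abstract refers to can be sketched in software. The following is a minimal illustrative implementation (not the paper's analog circuit), using the standard Jacobi-style update derived from the Euler-Lagrange equations of the regularized energy; the function name and the periodic boundary handling are choices made here for brevity.

```python
import numpy as np

def horn_schunck(I1, I2, alpha=1.0, n_iter=100):
    """Minimal Horn-Schunck optical flow sketch: minimizes the global
    energy  E = sum (Ix*u + Iy*v + It)^2 + alpha^2*(|grad u|^2 + |grad v|^2),
    the regularization that makes the ill-posed problem solvable."""
    I1 = I1.astype(float)
    I2 = I2.astype(float)
    # simple finite-difference brightness gradients
    Ix = np.gradient(I1, axis=1)
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)

    def avg(f):
        # 4-neighbour average (periodic boundaries for simplicity)
        return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0

    for _ in range(n_iter):
        ubar, vbar = avg(u), avg(v)
        # update from the Euler-Lagrange equations of the energy functional
        num = Ix * ubar + Iy * vbar + It
        den = alpha**2 + Ix**2 + Iy**2
        u = ubar - Ix * num / den
        v = vbar - Iy * num / den
    return u, v
```

    In the aVLSI implementation, the same coupled update is carried out by a resistive network in continuous time rather than by discrete iterations.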

    Integrated 2-D Optical Flow Sensor

    I present a new focal-plane analog VLSI sensor that estimates optical flow in two visual dimensions. The chip significantly improves on previous approaches, both in the applied model of optical flow estimation and in the actual hardware implementation. Its distributed computational architecture consists of an array of locally connected motion units that collectively solve for the unique optimal optical flow estimate. The novel gradient-based motion model assumes visual motion to be translational, smooth and biased. The model guarantees that the estimation problem is computationally well-posed regardless of the visual input. Model parameters can be globally adjusted, leading to a rich output behavior. Varying the smoothness strength, for example, can provide a continuous spectrum of motion estimates, ranging from normal to global optical flow. Unlike approaches that rely on the explicit matching of brightness edges in space or time, the applied gradient-based model assumes spatiotemporal continuity of the visual information. The non-linear coupling of the individual motion units improves the resulting optical flow estimate because it reduces spatial smoothing across large velocity differences. Extended measurements of a 30×30 array prototype sensor under real-world conditions demonstrate the validity of the model and the robustness and functionality of the implementation.
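    The effect of the bias term can be illustrated per pixel. The sketch below is an assumption-laden simplification of the model class described (a single motion unit with no neighbour coupling): a quadratic bias toward a reference motion v0 is added to the brightness-constancy residual, which keeps the estimate well-defined even where the image gradient vanishes, exactly the failure mode of the unregularized aperture problem.

```python
import numpy as np

def biased_flow(Ix, Iy, It, sigma=0.1, v0=(0.0, 0.0)):
    """Per-pixel biased gradient estimate (illustrative, not the chip's
    network equations): minimizes
        (Ix*u + Iy*v + It)^2 + sigma^2*((u - u0)^2 + (v - v0)^2).
    The bias term makes the problem well-posed for any input: where the
    gradient is zero, the estimate simply falls back to v0."""
    u0, w0 = v0
    num = Ix * u0 + Iy * w0 + It      # constraint residual at the bias point
    den = sigma**2 + Ix**2 + Iy**2    # strictly positive, so always solvable
    u = u0 - Ix * num / den
    v = w0 - Iy * num / den
    return u, v
```

    On the chip, the bias corresponds to a leak conductance pulling each motion unit toward a reference motion, and the smoothness strength sets the nearest-neighbour coupling omitted here.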

    Event-based Vision: A Survey

    Event cameras are bio-inspired sensors that differ from conventional frame cameras: Instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes, and output a stream of events that encode the time, location and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz) resulting in reduced motion blur. Hence, event cameras have a large potential for robotics and computer vision in scenarios that are challenging for traditional cameras, such as low latency, high speed, and high dynamic range. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras from their working principle, the actual sensors that are available, and the tasks that they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
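    The working principle and the simplest event representation can both be sketched in a few lines. The code below is a toy model written for this summary (the function names and the fixed contrast threshold are assumptions, not any vendor's API): a pixel emits an event when its log-brightness change exceeds a contrast threshold, and a downstream consumer may accumulate the signed events into a frame.

```python
import numpy as np

def generate_events(log_I_prev, log_I_new, t, C=0.2):
    """Simplified sensing model: a pixel fires one event when its
    log-brightness change exceeds the contrast threshold C. Each event
    is (time, x, y, polarity) with polarity +1/-1 for up/down changes."""
    diff = log_I_new - log_I_prev
    events = []
    for (y, x), d in np.ndenumerate(diff):
        if abs(d) >= C:
            events.append((t, x, y, 1 if d > 0 else -1))
    return events

def events_to_frame(events, width, height):
    """Simplest event representation used by frame-based pipelines:
    accumulate event polarities into a signed image."""
    frame = np.zeros((height, width), dtype=np.int32)
    for t, x, y, p in events:
        frame[y, x] += p
    return frame
```

    Real sensors fire asynchronously per pixel and reset the reference level after each event; this batch formulation only illustrates the thresholded log-change principle.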

    An improved 2D optical flow sensor for motion segmentation

    A functional focal-plane implementation of a 2D optical flow system is presented that detects and preserves motion discontinuities. The system is composed of two different network layers of analog computational units arranged in a retinotopic order. The units in the first layer (the optical flow network) estimate the local optical flow field in two visual dimensions, where the strength of their nearest-neighbor connections determines the amount of motion integration. Whereas in an earlier implementation \cite{Stocker_Douglas99} the connection strength was set constant over the complete image space, it is now \emph{dynamically and locally} controlled by the second network layer (the motion discontinuities network), which is recurrently connected to the optical flow network. The connection strengths in the optical flow network are modulated such that visual motion integration is ideally only facilitated within image areas that are likely to represent common motion sources. Results from an experimental aVLSI chip illustrate the potential of the approach and its functionality under real-world conditions.
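    One simple way to picture the discontinuity gating is a coupling weight that decays with the local velocity difference, so smoothing acts within a coherent motion source but not across its boundary. The sketch below is a hypothetical software analogue of that idea (not the chip's recurrent circuit, whose second layer computes the gating dynamically); only horizontal neighbour couplings are shown.

```python
import numpy as np

def coupling_weights(u, v, tau=1.0):
    """Discontinuity-gated coupling sketch: the connection strength
    between horizontally adjacent motion units decays with their
    velocity difference, so motion integration is strong within a
    common motion source and weak across a motion boundary."""
    du = np.diff(u, axis=1)   # velocity difference between (y,x) and (y,x+1)
    dv = np.diff(v, axis=1)
    return np.exp(-(du**2 + dv**2) / tau)
```

    In the two-layer chip this weight is not a fixed function of the flow estimate but the state of the recurrently coupled motion discontinuities network.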

    Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging

    Low-cost and high-resolution on-chip microscopes are vital for reducing cost and improving efficiency for modern biomedicine and bioscience. Despite the need, the conventional microscope design has proven difficult to miniaturize. Here, we report the implementation and application of two high-resolution (≈0.9 μm for the first and ≈0.8 μm for the second), lensless, and fully on-chip microscopes based on the optofluidic microscopy (OFM) method. These systems abandon the conventional microscope design, which requires expensive lenses and large space to magnify images, and instead utilize microfluidic flow to deliver specimens across array(s) of micrometer-size apertures defined on a metal-coated CMOS sensor to generate direct projection images. The first system utilizes a gravity-driven microfluidic flow for sample scanning and is suited for imaging elongate objects, such as Caenorhabditis elegans; the second system employs an electrokinetic drive for flow control and is suited for imaging cells and other spherical/ellipsoidal objects. As a demonstration of the OFM for bioscience research, we show that the prototypes can be used to perform automated phenotype characterization of different Caenorhabditis elegans mutant strains, and to image spores and single cellular entities. The optofluidic microscope design, readily fabricable with existing semiconductor and microfluidic technologies, offers low-cost and highly compact imaging solutions. More functionalities, such as on-chip phase and fluorescence imaging, can also be readily adapted into OFM systems. We anticipate that the OFM can significantly address a range of biomedical and bioscience needs, and engender new microscope applications.
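    The core imaging principle, that flow past a diagonally offset aperture array turns time traces into image rows, can be illustrated with a toy simulation. The code below makes strong simplifying assumptions not stated in the abstract (one aperture per row, unit pixel offsets, exactly known flow speed) and is meant only to show the time-to-space mapping, not to model the reported devices.

```python
import numpy as np

def ofm_scan(obj, speed=1):
    """Toy optofluidic-microscopy scan: the object flows past a line of
    apertures, aperture k offset by k pixels along the flow direction.
    Each aperture's intensity-vs-time trace is one image row; undoing
    the per-aperture time offset reassembles the projection image."""
    rows, cols = obj.shape
    T = cols + rows                  # timesteps for the object to pass all apertures
    traces = np.zeros((rows, T))
    for k in range(rows):            # record the time trace of aperture k
        for t in range(T):
            x = t * speed - k        # object column currently over aperture k
            if 0 <= x < cols:
                traces[k, t] = obj[k, x]
    # reconstruction: shift each trace back by its aperture offset
    img = np.zeros_like(obj, dtype=float)
    for k in range(rows):
        img[k] = traces[k, k:k + cols]
    return img
```

    The diagonal offsets are what let aperture spacing set the resolution along the flow axis independently of the sensor's pixel pitch.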

    Principles of Neuromorphic Photonics

    In an age overrun with information, the ability to process reams of data has become crucial. The demand for data will continue to grow as smart gadgets multiply and become increasingly integrated into our daily lives. Next-generation industries in artificial intelligence services and high-performance computing are so far supported by microelectronic platforms. These data-intensive enterprises rely on continual improvements in hardware. Their prospects are running up against a stark reality: conventional one-size-fits-all solutions offered by digital electronics can no longer satisfy this need, as Moore's law (exponential hardware scaling), interconnection density, and the von Neumann architecture reach their limits. With its superior speed and reconfigurability, analog photonics can provide some relief to these problems; however, complex applications of analog photonics have remained largely unexplored due to the absence of a robust photonic integration industry. Recently, the landscape for commercially-manufacturable photonic chips has been changing rapidly and now promises to achieve economies of scale previously enjoyed solely by microelectronics. The scientific community has set out to build bridges between the domains of photonic device physics and neural networks, giving rise to the field of \emph{neuromorphic photonics}. This article reviews the recent progress in integrated neuromorphic photonics. We provide an overview of neuromorphic computing, discuss the associated technology (microelectronic and photonic) platforms, and compare their performance metrics. We discuss photonic neural network approaches and challenges for integrated neuromorphic photonic processors, while providing an in-depth description of photonic neurons and a candidate interconnection architecture. We conclude with a future outlook on neuro-inspired photonic processing.
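    Functionally, a photonic neuron of the kind such reviews describe computes the same weighted-sum-plus-nonlinearity as its electronic counterpart, only with wavelength channels, tunable filters, and a photodetector. The sketch below is a purely behavioral abstraction written for this summary (the sigmoid stands in for a laser or modulator transfer function; it is not a device model from the article).

```python
import numpy as np

def photonic_neuron(powers, weights, bias=0.0):
    """Behavioral sketch of a broadcast-and-weight style photonic neuron:
    wavelength channels carry the input signals, tunable filters apply
    the weights, a photodetector sums the weighted optical powers, and
    a nonlinearity (sigmoid stand-in for an electro-optic transfer
    function) produces the output."""
    s = np.dot(weights, powers) + bias   # total detected photocurrent
    return 1.0 / (1.0 + np.exp(-s))      # stand-in neuron nonlinearity
```

    The interest in the photonic version lies not in this arithmetic but in carrying it out at high bandwidth with dense wavelength-multiplexed fan-in.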