3,292 research outputs found

    Form Factor Improvement of Smart-Pixels for Vision Sensors through 3-D Vertically-Integrated Technologies

    While conventional CMOS active pixel sensors embed only the circuitry required for photo-detection, pixel addressing and voltage buffering, smart pixels also incorporate circuitry for data processing, data storage and control of data interchange. This additional circuitry enables data processing to be carried out concurrently with image acquisition, which is instrumental in reducing the amount of data needed to convey the information contained in the images. In this way, more efficient vision systems can be built, at the cost of a larger pixel pitch. Vertically-integrated 3-D technologies make it possible to retain the advantages of smart pixels while improving their form factor. Office of Naval Research N000141110312. Ministerio de Ciencia e Innovación IPT-2011-1625-43000.
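
    As a hypothetical illustration of how in-pixel processing reduces the data that must be read out (the activity measure, frame size and threshold below are assumptions chosen for illustration, not taken from the paper), the following minimal Python sketch reads out only the pixels whose value changed significantly between frames:

        import numpy as np

        def smart_pixel_readout(frame, prev_frame, threshold=10):
            """Hypothetical in-pixel processing model: each pixel compares its
            current value with the previous frame and is read out only if the
            change exceeds a threshold, reducing off-chip data traffic."""
            activity = np.abs(frame.astype(int) - prev_frame.astype(int))
            active = activity > threshold          # per-pixel decision
            rows, cols = np.nonzero(active)        # addresses of active pixels
            values = frame[active]                 # values actually transmitted
            return rows, cols, values

        # Example: two synthetic 64x64 frames with one localized change
        rng = np.random.default_rng(0)
        prev = rng.integers(0, 256, (64, 64), dtype=np.uint8)
        curr = prev.copy()
        curr[10:14, 20:24] = 255                   # small moving feature
        r, c, v = smart_pixel_readout(curr, prev)
        print(f"read out {v.size} of {curr.size} pixels "
              f"({100 * v.size / curr.size:.1f}% of the full frame)")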

    Smart Bolometer: Toward Monolithic Bolometer with Smart Functions

    This chapter deals with uncooled resistive bolometers and the challenge of integrating them into monolithic devices exhibiting smart functions. Uncooled resistive bolometers are the essential constitutive element of the majority of existing uncooled infrared imaging systems; in that type of application, where matrices of such elementary devices are used, they are referred to as microbolometer pixels. Uncooled bolometers represented more than 95% of the market for infrared imaging systems in 2010 (Yole 2010), and infrared imaging systems are required for a growing number of applications.
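
    The sensing principle the chapter builds on is the textbook resistive-bolometer model: absorbed infrared power raises the membrane temperature, and the film's temperature coefficient of resistance (TCR) converts that rise into a resistance change seen by the readout. The sketch below is a minimal first-order version of that model; all numerical values are illustrative assumptions, not figures from the chapter:

        # Minimal first-order model of a resistive (micro)bolometer pixel.
        # All numbers are illustrative assumptions, not values from the chapter.

        ALPHA = -0.02    # TCR of the sensing film, 1/K
        R0 = 100e3       # nominal element resistance, ohms
        G_TH = 1e-7      # thermal conductance to the substrate, W/K
        ETA = 0.8        # fraction of incident IR power absorbed
        I_BIAS = 5e-6    # readout bias current, A

        def bolometer_response(p_incident):
            """Return (temperature rise, resistance change, voltage change)
            for a given incident IR power, using the steady-state relations
            dT = eta * P / G_th and dR = alpha * R0 * dT."""
            d_t = ETA * p_incident / G_TH   # steady-state temperature rise, K
            d_r = ALPHA * R0 * d_t          # resistance change, ohms
            d_v = I_BIAS * d_r              # signal of a current-biased readout, V
            return d_t, d_r, d_v

        d_t, d_r, d_v = bolometer_response(10e-9)   # 10 nW of incident IR power
        print(f"dT = {d_t * 1e3:.1f} mK, dR = {d_r:.1f} ohm, dV = {d_v * 1e6:.1f} uV")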

    A power-saving modulation technique for time-of-flight range imaging sensors

    Time-of-flight range imaging cameras measure distance and intensity simultaneously for every pixel in an image. With the continued advancement of the technology, a wide variety of new depth-sensing applications are emerging; however, a number of these potential applications have stringent electrical power constraints that are difficult to meet with current state-of-the-art systems. Sensor gain modulation contributes a significant proportion of the total image sensor power consumption, and as higher-spatial-resolution range image sensors operating at higher modulation frequencies (to achieve better measurement precision) are developed, this proportion is likely to increase. The authors have developed a new sensor modulation technique using resonant circuit concepts that is more power efficient than the standard mode of operation. With a proof-of-principle system, a 93–96% reduction in modulation drive power was demonstrated across a range of modulation frequencies from 1 to 11 MHz. Finally, an evaluation of the range imaging performance revealed an improvement in measurement linearity in the resonant configuration, due primarily to the more sinusoidal shape of the resonant electrical waveforms, while the average precision values were comparable between the standard and resonant operating modes.
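
    The origin of the saving can be seen from a first-order comparison: a hard-switched driver dissipates roughly C·V²·f charging and discharging the sensor's modulation capacitance, whereas in a resonant drive the reactive energy circulates in an LC tank and only the losses set by the tank's quality factor Q must be replenished. The Python sketch below works this comparison through with assumed values for the capacitance, voltage swing and Q (none of them from the paper); it illustrates the resonant-drive principle rather than the authors' circuit:

        import math

        def direct_drive_power(c_load, v_swing, f_mod):
            """First-order C*V^2*f loss of a hard-switched (square-wave) driver."""
            return c_load * v_swing ** 2 * f_mod

        def resonant_drive_power(c_load, v_swing, f_mod, q_factor):
            """Loss of a resonant drive estimated from the standard Q definition,
            P_loss ~ omega * E_stored / Q, with the capacitor voltage swinging
            sinusoidally with amplitude v_swing / 2 around mid-rail."""
            e_stored = 0.5 * c_load * (v_swing / 2) ** 2
            return 2 * math.pi * f_mod * e_stored / q_factor

        def matching_inductance(c_load, f_mod):
            """Inductance that resonates with the sensor capacitance at f_mod."""
            return 1.0 / ((2 * math.pi * f_mod) ** 2 * c_load)

        # Assumed, illustrative values (not taken from the paper):
        C_SENSOR = 200e-12   # effective modulation-electrode capacitance, F
        V_SWING = 3.3        # modulation voltage swing, V
        Q = 20               # loaded quality factor of the resonant tank

        for f in (1e6, 5e6, 11e6):
            p_std = direct_drive_power(C_SENSOR, V_SWING, f)
            p_res = resonant_drive_power(C_SENSOR, V_SWING, f, Q)
            print(f"f = {f / 1e6:4.0f} MHz: direct {p_std * 1e3:6.2f} mW, "
                  f"resonant {p_res * 1e3:6.2f} mW "
                  f"({100 * (1 - p_res / p_std):.0f}% saving), "
                  f"L = {matching_inductance(C_SENSOR, f) * 1e6:.1f} uH")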

    CMOS Photodetectors

    MusA: Using Indoor Positioning and Navigation to Enhance Cultural Experiences in a Museum

    In recent years there has been a growing interest in the use of multimedia mobile guides in museum environments. Mobile devices have the capability to detect the user's context and to provide pieces of information suited to helping visitors discover and follow the logical and emotional connections that develop during the visit. In this scenario, location-based services (LBS) currently represent an asset, and the choice of the technology used to determine users' position, combined with the definition of methods that can effectively convey information, becomes a key issue in the design process. In this work, we present MusA (Museum Assistant), a general framework for the development of multimedia interactive guides for mobile devices. Its main feature is a vision-based indoor positioning system that enables the provision of several LBS, from way-finding to the contextualized communication of cultural content, aimed at providing a meaningful exploration of exhibits according to visitors' personal interests and curiosity. Starting from a thorough description of the system architecture, the article presents the implementation of two mobile guides, developed to address adults and children respectively, and discusses the evaluation of the user experience and the visitors' appreciation of these applications.

    A 64mW DNN-based Visual Navigation Engine for Autonomous Nano-Drones

    Fully autonomous miniaturized robots (e.g., drones) with artificial intelligence (AI)-based visual navigation capabilities are extremely challenging drivers of Internet-of-Things edge intelligence. Visual navigation based on AI approaches, such as deep neural networks (DNNs), is becoming pervasive for standard-size drones, but is considered out of reach for nano-drones with a size of a few cm². In this work, we present the first (to the best of our knowledge) demonstration of a navigation engine for autonomous nano-drones capable of closed-loop, end-to-end, DNN-based visual navigation. To achieve this goal we developed a complete methodology for parallel execution of complex DNNs directly on board resource-constrained, milliwatt-scale nodes. Our system is based on GAP8, a novel parallel ultra-low-power computing platform, and a 27 g commercial, open-source CrazyFlie 2.0 nano-quadrotor. As part of our general methodology we discuss the software mapping techniques that enable the state-of-the-art deep convolutional neural network presented in [1] to be fully executed on-board within a strict 6 fps real-time constraint with no compromise in terms of flight results, while all processing is done with only 64 mW on average. Our navigation engine is flexible and can be used to span a wide performance range: at its peak performance corner it achieves 18 fps while still consuming on average just 3.5% of the power envelope of the deployed nano-aircraft. Comment: 15 pages, 13 figures, 5 tables, 2 listings, accepted for publication in the IEEE Internet of Things Journal (IEEE IOTJ).
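
    For a sense of scale, the quoted operating points translate directly into energy per processed frame. The short sketch below does that arithmetic from the numbers in the abstract; the peak corner's absolute power is not stated (only its share of the aircraft's power envelope), so the envelope is left as a parameter and the example value used for it is purely hypothetical:

        def energy_per_frame_mj(avg_power_w, fps):
            """Energy spent on visual processing per frame, in millijoules."""
            return avg_power_w / fps * 1e3

        # Operating point quoted in the abstract: 64 mW average at 6 fps.
        print(f"6 fps @ 64 mW -> {energy_per_frame_mj(0.064, 6):.1f} mJ per frame")

        # Peak-throughput corner: 18 fps at 3.5% of the aircraft's power envelope.
        # The envelope itself is not given in the abstract, so it stays a parameter.
        def peak_corner_energy_mj(power_envelope_w, fps=18, share=0.035):
            return energy_per_frame_mj(power_envelope_w * share, fps)

        # Example usage with a hypothetical envelope value (assumption, not from the paper):
        print(f"18 fps, hypothetical 8 W envelope -> "
              f"{peak_corner_energy_mj(8.0):.1f} mJ per frame")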