9 research outputs found

    Imaging Based Beam Steering for Optical Communication and Lidar Applications

    Optical beam steering is a key component in any application that requires dynamic (i.e., real-time) control of beam propagation through free space. Example applications include remote sensing, spectroscopy, laser machining, targeting, Lidar, optical wireless communications (OWC), and more. The pointing-control requirements of many of these applications can be met by traditional mechanical steering techniques; however, those solutions tend to be bulky, slow, expensive, power hungry, and prone to mechanical failures that shorten component lifetimes. Two emerging applications, Lidar imaging and OWC, truly need improved beam-steering capabilities to flourish and, respectively, support the advancement of self-driving cars or relieve the congestion in radio-frequency wireless networks. We consider the novel requirements of these applications during the development of a new beam-steering technology. We introduce imaging-based beam steering (IBBS), which uses an imaging transform between the spatial and directional domains to implement a new method of electronic beam steering. We present this concept with a focus on transmitters (Tx) for OWC, but the pointing-control mechanism is bidirectional, supporting both transmit and receive functionality, even through the same aperture; likewise, the features that make this solution compelling for OWC also benefit Lidar imaging. In IBBS, an array of high-speed sources is positioned at the focal plane of a lens, and the lens passively collects, collimates, and steers each beam into its conjugate direction. Steering is accomplished by selecting which source to use for an OWC link. This gives coarse, pixelated beam-steering control that is well suited to short-range OWC such as indoor communications, and we present a prototype bulb for this application. Notably, multiple sources can be used at once, each steered into its conjugate direction, making IBBS the first beam-steering technology that supports multiple beams out of a single aperture; this feature uniquely supports multiplexed communications and fast, high-resolution Lidar imaging.
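    The abstract above describes the core IBBS pointing relation: a source at the focal plane is collimated by the lens into a conjugate far-field direction, so steering reduces to choosing which emitter to drive. The Python sketch below illustrates that selection logic under a simple thin-lens assumption; the focal length, emitter pitch, and array size are illustrative placeholders, not values taken from the paper.

        # Hedged sketch of the IBBS pointing relation, assuming a thin-lens model:
        # an emitter at focal-plane position (x, y) is collimated into the conjugate
        # direction roughly (-atan(x/f), -atan(y/f)).  The focal length, emitter
        # pitch, and grid size below are illustrative assumptions.
        import math

        FOCAL_LENGTH_MM = 20.0   # assumed lens focal length
        EMITTER_PITCH_MM = 0.5   # assumed spacing of the source array
        GRID_SIZE = 16           # assumed 16 x 16 source array

        def conjugate_direction(ix: int, iy: int) -> tuple[float, float]:
            """Far-field angles (degrees) for the emitter at grid index (ix, iy)."""
            x = (ix - (GRID_SIZE - 1) / 2) * EMITTER_PITCH_MM
            y = (iy - (GRID_SIZE - 1) / 2) * EMITTER_PITCH_MM
            return (math.degrees(math.atan2(-x, FOCAL_LENGTH_MM)),
                    math.degrees(math.atan2(-y, FOCAL_LENGTH_MM)))

        def select_emitter(target_deg: tuple[float, float]) -> tuple[int, int]:
            """Pick the emitter whose conjugate direction is closest to the target."""
            best, best_err = (0, 0), float("inf")
            for ix in range(GRID_SIZE):
                for iy in range(GRID_SIZE):
                    tx, ty = conjugate_direction(ix, iy)
                    err = (tx - target_deg[0]) ** 2 + (ty - target_deg[1]) ** 2
                    if err < best_err:
                        best, best_err = (ix, iy), err
            return best

        if __name__ == "__main__":
            print(select_emitter((5.0, -3.0)))  # index of the source used for that link

    Because each emitter maps to its own conjugate direction, driving several emitters at once yields several independently steered beams from the same aperture, which is the multi-beam property the abstract highlights.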

    Design of large polyphase filters in the Quadratic Residue Number System


    VRCodes: embedding unobtrusive data for new devices in visible light

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2012. Cataloged from PDF version of thesis. Includes bibliographical references (p. 97-101). This thesis envisions a public space populated with active visible surfaces that appear different to a camera than to the human eye. Thus, they can act as general digital interfaces that transmit machine-compatible data and provide relative orientation without being obtrusive. We introduce a personal transceiver peripheral and demonstrate that this visual environment enables human participants to hear sound only from the location they are looking at, authenticate with proximal surfaces, and gather otherwise imperceptible data from an object in sight. We present a design methodology that assumes the availability of many independent and controllable light transmitters, where each individual transmitter produces light at different color wavelengths. Today, controllable light transmitters take the form of digital billboards, signage, and overhead lighting built for human use; light-capturing receivers take the form of mobile cameras and personal video camcorders. Following the software-defined approach, we leverage screens and cameras as parameterized hardware peripherals, allowing flexibility and development of the proposed framework on general-purpose computers in a manner that is unobtrusive to humans. We develop VRCodes, which display spatio-temporally modulated metamers on active screens, conveying digital and positional information to a rolling-shutter camera, and physically modified optical setups, which encode data in a point-spread function, exploiting the camera's wide aperture. These techniques exploit the fact that the camera sees something different from the human eye. We quantify the full potential of the system by characterizing basic bounds of a parameterized transceiver hardware along with the medium in which it operates. Evaluating performance highlights the underutilized temporal, spatial, and frequency dimensions available to the interaction designer concerned with human perception. Results suggest that the one-way point-to-point transmission is good enough to extend the techniques toward a two-way, bidirectional model with realizable hardware devices. The new visual environment contains a second data layer for machines that is synthetic and quantifiable; human interactions serve as the context. by Grace Woo. Ph.D.
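    The rolling-shutter mechanism mentioned above can be illustrated with a short simulation: a screen flickers between two metamers faster than human flicker fusion, and because a rolling-shutter camera exposes each sensor row at a slightly later instant, the temporal flicker appears as row-wise banding that carries data. The Python sketch below is a toy model of that principle, not the thesis implementation; the row readout time, carrier frequency, and on-off keying scheme are assumptions.

        # Hedged toy model of the rolling-shutter principle behind VRCodes:
        # temporal flicker above flicker fusion becomes spatial banding across
        # camera rows.  Rates and the keying scheme are illustrative assumptions.
        import numpy as np

        ROW_READOUT_S = 30e-6   # assumed time offset between consecutive rows
        CARRIER_HZ = 1000.0     # assumed flicker rate, invisible to the eye
        ROWS_PER_BIT = 40       # assumed number of image rows spanning one bit

        def encode(bits):
            """Per-row brightness deviation seen by the camera for a bit sequence."""
            rows = []
            for i in range(len(bits) * ROWS_PER_BIT):
                t = i * ROW_READOUT_S
                carrier = np.sign(np.sin(2 * np.pi * CARRIER_HZ * t))
                bit = bits[i // ROWS_PER_BIT]
                rows.append(carrier if bit else 0.0)  # on-off keying of the carrier
            return np.array(rows)

        def decode(rows):
            """Recover bits by measuring carrier energy in each block of rows."""
            bits = []
            for start in range(0, len(rows), ROWS_PER_BIT):
                block = rows[start:start + ROWS_PER_BIT]
                bits.append(int(np.mean(np.abs(block)) > 0.5))
            return bits

        if __name__ == "__main__":
            message = [1, 0, 1, 1, 0]
            print(decode(encode(message)))  # -> [1, 0, 1, 1, 0]

    To a human observer the two metamers average to the same perceived color, so only the row-sampling camera recovers the modulation; that is the "second data layer" the abstract describes.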

    Temperature aware power optimization for multicore floating-point units


    Aerial manipulator with anthropomorphic flexible-joint arms

    [Abstract] This article presents the first aerial manipulator robot with two anthropomorphic arms, designed for inspection and maintenance tasks in industrial environments that are difficult for human operators to access. The robot consists of a multirotor aerial platform equipped with two ultra-lightweight anthropomorphic arms, together with the integrated control system for the platform and the arms. One of the manipulator's main features is the mechanical flexibility provided at every joint, which increases safety during physical interactions with the environment and protects the robot itself. To this end, a compact and simple spring transmission mechanism has been introduced between the servo shaft and the output link. The aluminum structure of the arms has been carefully designed so that the actuators are isolated from radial and axial loads that could damage them. The developed manipulator has been validated through fixed-base experiments and outdoor flight tests. Ministerio de Economía y Competitividad; DPI2014-5983-C2-1-
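    The spring transmission between the servo shaft and the output link described above behaves like a series-elastic element, so the torque transmitted to the link can be inferred from the measured deflection across the spring. The Python sketch below illustrates that idea; the stiffness and deflection-limit values are assumed for illustration and do not come from the article.

        # Hedged sketch of a series-elastic joint model for the spring transmission
        # described above: transmitted torque is proportional to the deflection
        # between the servo shaft and the output link.  Values are assumptions.
        import math

        SPRING_STIFFNESS_NM_PER_RAD = 2.5      # assumed torsional stiffness
        MAX_DEFLECTION_RAD = math.radians(15)  # assumed mechanical deflection limit

        def joint_torque(theta_servo_rad: float, theta_link_rad: float) -> float:
            """Estimate transmitted torque from the measured spring deflection."""
            deflection = theta_servo_rad - theta_link_rad
            return SPRING_STIFFNESS_NM_PER_RAD * deflection

        def collision_detected(theta_servo_rad: float, theta_link_rad: float) -> bool:
            """Flag an unexpected contact when the deflection nears its limit."""
            return abs(theta_servo_rad - theta_link_rad) > 0.8 * MAX_DEFLECTION_RAD

        if __name__ == "__main__":
            print(joint_torque(math.radians(10), math.radians(4)))        # ~0.26 N*m
            print(collision_detected(math.radians(20), math.radians(4)))  # True

    Monitoring the deflection in this way is one reason flexible joints improve safety: unexpected contact shows up as a rapid rise in deflection before large forces reach the actuators.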