
    Real-Time Panoramic Tracking for Event Cameras

    Event cameras are a paradigm shift in camera technology. Instead of full frames, the sensor captures a sparse set of events caused by intensity changes. Since only the changes are transferred, these cameras can capture quick movements of objects in the scene or of the camera itself. In this work we propose a novel method to perform camera tracking of event cameras in a panoramic setting with three degrees of freedom. We propose a direct camera tracking formulation, similar to the state of the art in visual odometry. We show that the minimal information needed for simultaneous tracking and mapping is the spatial position of events, without using the appearance of the imaged scene point. We verify the robustness to fast camera movements and dynamic objects in the scene on a recently proposed dataset and on self-recorded sequences.

    Comment: Accepted to International Conference on Computational Photography 201
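The core idea, updating a panoramic map from event positions alone under a 3-DoF camera rotation, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the pinhole intrinsics, the equirectangular map layout, and the simple per-pixel event counting are all assumptions:

```python
import numpy as np

def backproject(x, y, fx, fy, cx, cy):
    """Pixel -> unit viewing ray for an assumed pinhole camera model."""
    ray = np.array([(x - cx) / fx, (y - cy) / fy, 1.0])
    return ray / np.linalg.norm(ray)

def to_panorama(ray, width, height):
    """Map a world-frame ray onto an equirectangular panorama pixel."""
    theta = np.arctan2(ray[0], ray[2])            # azimuth in [-pi, pi]
    phi = np.arcsin(np.clip(ray[1], -1.0, 1.0))   # elevation in [-pi/2, pi/2]
    u = int((theta + np.pi) / (2 * np.pi) * width) % width
    v = min(int((phi + np.pi / 2) / np.pi * height), height - 1)
    return u, v

def accumulate_events(events, R, intrinsics, pano):
    """Rotate each event's viewing ray by the current 3-DoF camera
    rotation R and count it into the panoramic event map."""
    for x, y in events:
        ray = R @ backproject(x, y, *intrinsics)
        u, v = to_panorama(ray, pano.shape[1], pano.shape[0])
        pano[v, u] += 1
    return pano
```

Tracking would then estimate `R` by maximizing the consistency of newly observed events with the accumulated map; note that only event positions, never scene appearance, enter the formulation.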

    Guest Editorial Computational and smart cameras


    Multi-aperture foveated imaging

    Foveated imaging, such as that evolved by biological systems to provide high angular resolution with a reduced space–bandwidth product, also offers advantages for man-made task-specific imaging. Foveated imaging systems using exclusively optical distortion are complex, bulky, and high cost, however. We demonstrate foveated imaging using a planar array of identical cameras combined with a prism array and superresolution reconstruction of a mosaicked image with a foveal variation in angular resolution of 5.9:1 and a quadrupling of the field of view. The combination of low-cost, mass-produced cameras and optics with computational image recovery offers enhanced capability of achieving large foveal ratios from compact, low-cost imaging systems.
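As a toy illustration of a foveal resolution profile, the following sketches how angular sampling density might vary across the field. Only the 5.9:1 center-to-periphery ratio comes from the abstract; the smooth quadratic falloff and the function name are assumptions, not the demonstrated optics:

```python
import numpy as np

def foveal_sampling(field_angle_deg, fovea_half_angle_deg, ratio=5.9):
    """Relative angular sampling density versus field angle:
    `ratio` at the center of the fovea, falling smoothly to 1.0
    at the edge of the fovea and beyond (illustrative model only)."""
    t = np.clip(np.abs(field_angle_deg) / fovea_half_angle_deg, 0.0, 1.0)
    return 1.0 + (ratio - 1.0) * (1.0 - t) ** 2
```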

    Video-rate computational super-resolution and integral imaging at longwave-infrared wavelengths

    We report the first computational super-resolved, multi-camera integral imaging at long-wave infrared (LWIR) wavelengths. A synchronized array of FLIR Lepton cameras was assembled, and computational super-resolution and integral-imaging reconstruction were employed to generate video with light-field imaging capabilities, such as 3D imaging and recognition of partially obscured objects, while also providing a four-fold increase in effective pixel count. This approach to high-resolution imaging enables a fundamental reduction in the track length and volume of an imaging system, while also enabling use of low-cost lens materials.

    Comment: Supplementary multimedia material in http://dx.doi.org/10.6084/m9.figshare.530302
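The multi-frame super-resolution step can be sketched with a naive shift-and-add reconstruction. This is a minimal sketch assuming known subpixel shifts between cameras and nearest-neighbour placement; the actual pipeline is more sophisticated:

```python
import numpy as np

def shift_and_add(frames, shifts, scale):
    """Naive shift-and-add super-resolution: place each low-res frame
    onto a high-res grid at its (assumed known) subpixel offset,
    then average where samples overlap."""
    h, w = frames[0].shape
    H, W = h * scale, w * scale
    acc = np.zeros((H, W))
    cnt = np.zeros((H, W))
    for frame, (dy, dx) in zip(frames, shifts):
        # Convert the subpixel shift into a high-res pixel offset.
        oy, ox = int(round(dy * scale)), int(round(dx * scale))
        ys = np.arange(h) * scale + oy
        xs = np.arange(w) * scale + ox
        valid_y = (ys >= 0) & (ys < H)
        valid_x = (xs >= 0) & (xs < W)
        yy, xx = np.meshgrid(ys[valid_y], xs[valid_x], indexing="ij")
        acc[yy, xx] += frame[np.ix_(valid_y, valid_x)]
        cnt[yy, xx] += 1
    cnt[cnt == 0] = 1  # avoid dividing empty cells
    return acc / cnt
```

With four cameras offset by half a low-res pixel in each direction, the union of samples fills a grid with twice the linear resolution, which is the sense in which the camera array quadruples the effective pixel count.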

    Parametrizable cameras for 3D computational steering

    We present a method for the definition of multiple views in 3D interfaces for computational steering. The method uses the concept of a point-based parametrizable camera object. This concept enables a user to create and configure multiple views on a custom 3D interface in an intuitive, graphical manner. Each view can be coupled to objects present in the interface, parametrized to (simulation) data, or adjusted through direct manipulation or user-defined camera controls. Although our focus is on 3D interfaces for computational steering, we think that the concept is valuable for many other 3D graphics applications as well.
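A point-based parametrizable camera can be modelled roughly as follows. This is a hypothetical sketch, not the paper's API; the class and field names are invented. The camera's eye point and look-at target are functions of the current interface or simulation state, so a view stays coupled to an object or to simulation data:

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

Point = Tuple[float, float, float]

@dataclass
class ParametricCamera:
    """A camera whose eye point and look-at target are evaluated
    against the current state, keeping views coupled to interface
    objects or to (simulation) data."""
    eye: Callable[[Dict], Point]
    target: Callable[[Dict], Point]

    def view(self, state: Dict) -> Tuple[Point, Point]:
        return self.eye(state), self.target(state)

# A view that tracks a moving probe object from a fixed vertical offset:
follow_cam = ParametricCamera(
    eye=lambda s: (s["probe"][0], s["probe"][1], s["probe"][2] + 5.0),
    target=lambda s: s["probe"],
)
```

Direct manipulation or user-defined camera controls would then amount to rebinding `eye` and `target`.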

    Autonomous navigation for artificial satellites

    An autonomous navigation system is considered that provides a satellite with sufficient numbers and types of sensors, as well as computational hardware and software, to enable it to track itself. Considered are attitude-type sensors, meteorological cameras and scanners, one-way Doppler, and an image correlator.