
    Panoramic Stereovision and Scene Reconstruction

    With the advancement of research in robotics and computer vision, an increasingly large number of applications require understanding a scene in three dimensions, and a variety of systems have been deployed for this purpose. This thesis explores a novel 3D imaging technique based on catadioptric cameras in a stereoscopic arrangement. A secondary system stabilizes the setup in the event that the cameras become misaligned during operation. The system's main advantage is that it is a cost-effective alternative to current state-of-the-art systems that achieve the same goal of 3D imaging. The compromise lies in the quality of depth estimation, which can be overcome with a different imager and calibration. The result is a panoramic disparity map generated by the system.
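    The end product, a disparity map from a stereo pair, can be illustrated with a minimal sketch using OpenCV's semi-global block matcher on an already-rectified panoramic image pair; the file names and matcher parameters below are assumptions for illustration, not the thesis's implementation.

        # Sketch: disparity from a rectified panoramic stereo pair (illustrative only;
        # file names and matcher parameters are assumptions, not the thesis's setup).
        import cv2

        left = cv2.imread("pano_left.png", cv2.IMREAD_GRAYSCALE)
        right = cv2.imread("pano_right.png", cv2.IMREAD_GRAYSCALE)

        # Semi-global block matching; numDisparities must be a multiple of 16.
        matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
        disparity = matcher.compute(left, right).astype("float32") / 16.0  # fixed-point -> pixels

        # Scale to 8 bits for visualization and save the panoramic disparity map.
        vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
        cv2.imwrite("pano_disparity.png", vis)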

    Omnidirectional Stereo Vision for Autonomous Vehicles

    Environment perception with cameras is an important requirement in many applications for autonomous vehicles and robots. This work presents a stereoscopic omnidirectional camera system for autonomous vehicles which resolves the problem of a limited field of view and provides a 360° panoramic view of the environment. We present a new projection model for these cameras and show that the camera setup overcomes major drawbacks of traditional perspective cameras in many applications.
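    For orientation only, the sketch below shows a generic equirectangular unprojection from panoramic pixel coordinates to a unit viewing ray; this is a common textbook panoramic model and not the new projection model proposed in the paper.

        # Sketch: generic equirectangular unprojection (pixel -> unit viewing ray).
        # A common panoramic model shown for illustration; it is NOT the projection
        # model proposed in the paper above.
        import numpy as np

        def pixel_to_ray(u, v, width, height):
            """Map pixel (u, v) of a width x height equirectangular panorama
            to a unit 3D viewing ray in the camera frame."""
            lon = (u / width) * 2.0 * np.pi - np.pi     # azimuth in [-pi, pi)
            lat = np.pi / 2.0 - (v / height) * np.pi    # elevation in [-pi/2, pi/2]
            return np.array([np.cos(lat) * np.sin(lon),
                             np.sin(lat),
                             np.cos(lat) * np.cos(lon)])

        ray = pixel_to_ray(1024, 512, 2048, 1024)  # image center -> forward-looking ray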

    Enhancing 3D Visual Odometry with Single-Camera Stereo Omnidirectional Systems

    We explore low-cost solutions for efficiently improving the 3D pose estimation of a single camera moving in an unfamiliar environment. The visual odometry (VO) task -- as it is called when using computer vision to estimate egomotion -- is of particular interest to mobile robots as well as humans with visual impairments. The payload capacity of small robots like micro-aerial vehicles (drones) requires the use of portable perception equipment, which is constrained by size, weight, energy consumption, and processing power. Using a single camera as the passive sensor for the VO task satisfies these requirements and motivates the solutions presented in this thesis. To achieve this portability goal with a single off-the-shelf camera, we take two approaches. The first, and the most extensively studied here, revolves around an unorthodox camera-mirrors configuration (catadioptrics) achieving a stereo omnidirectional system (SOS). The second relies on expanding the visual features from the scene into higher dimensionalities to track the pose of a conventional camera in a photogrammetric fashion. The first approach poses many interdependent challenges, which we address as part of this thesis: SOS design, projection model, adequate calibration procedure, and application to VO. We show several practical advantages of the single-camera SOS due to its complete 360-degree stereo views, which other conventional 3D sensors lack because of their limited field of view. Since our omnidirectional stereo (omnistereo) views are captured by a single camera, a truly instantaneous pair of panoramic images is possible for 3D perception tasks. Finally, we address the VO problem as a direct multichannel tracking approach, which increases the pose estimation accuracy of the baseline method (i.e., using only grayscale or color information) with photometric error minimization at the heart of the "direct" tracking algorithm. Currently, this solution has been tested on standard monocular cameras, but it could also be applied to an SOS. We believe the challenges we attempt to solve have not previously been considered with the level of detail needed to successfully perform VO with a single camera, the ultimate goal, in both real-life and simulated scenes.
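    The photometric error minimization mentioned above can be summarized with a short, hedged sketch: a per-pixel intensity residual between a reference image and the current image warped by the pose estimate, summed over channels. The warp producing the current image is a hypothetical placeholder here, since the actual pose parameterization and SOS geometry are defined in the thesis itself.

        # Sketch: multichannel photometric residual for direct tracking (illustrative).
        # The warped current image is assumed to come from a pose-dependent warp that
        # the thesis defines; it is not reproduced here.
        import numpy as np

        def photometric_residual(ref, cur_warped, valid_mask):
            """Per-pixel intensity differences (H x W x C arrays, boolean H x W mask)."""
            diff = cur_warped.astype(np.float64) - ref.astype(np.float64)
            return diff[valid_mask].ravel()   # keep only pixels that warp inside the image

        def photometric_cost(ref, cur_warped, valid_mask):
            """Sum of squared residuals; the pose is chosen to minimize this value."""
            r = photometric_residual(ref, cur_warped, valid_mask)
            return 0.5 * float(np.dot(r, r))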

    Vision Sensors and Edge Detection

    The book Vision Sensors and Edge Detection reflects a selection of recent developments in the area of vision sensors and edge detection. There are two sections. The first presents vision sensors, with applications to panoramic vision sensors, wireless vision sensors, and automated vision sensor inspection; the second covers image processing techniques such as image measurements, image transformations, filtering, and parallel computing.
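    As a generic illustration of the edge detection theme (not tied to any particular chapter), a minimal sketch with standard Sobel gradients and Canny edges via OpenCV follows; the input file name and thresholds are assumptions.

        # Sketch: standard edge detection operators (generic illustration only).
        import cv2

        img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input image

        # Sobel gradients in x and y, combined into a gradient magnitude image.
        gx = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=3)
        magnitude = cv2.magnitude(gx, gy)

        # Canny edge map with hysteresis thresholds.
        edges = cv2.Canny(img, threshold1=50, threshold2=150)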

    Geometrical Calibration for the Panrover: a Stereo Omnidirectional System for Planetary Rover

    A novel panoramic stereo imaging system is proposed in this paper. The system provides 360° stereoscopic vision, useful for autonomous rover driving, and simultaneously captures a high-resolution stereo scene. The core of the concept is a novel "bifocal panoramic lens" (BPL) based on the hyper-hemispheric model (Pernechele et al. 2016). This BPL records a panoramic field of view (FoV) and, simultaneously, an area (belonging to the panoramic FoV) with a given degree of magnification, using a single image sensor. This strategy makes it possible to avoid rotational mechanisms. Using two BPLs mounted along a vertical baseline (a system called PANROVER) allows the surrounding environment to be monitored in stereoscopic (3D) mode while simultaneously capturing high-resolution stereoscopic images for the analysis of scientific cases, making it a new paradigm in the planetary-rover framework. Unlike the majority of Mars systems, which rely on rotational mechanisms for the acquisition of panoramic images (mosaicked on the ground), the PANROVER contains no moving components and can deliver high-rate stereo images of the context panorama. The scope of this work is the geometric calibration of the panoramic acquisition system using omnidirectional calibration methods (Scaramuzza et al. 2006) based on a Zhang calibration grid. The procedures are applied in order to obtain well-rectified, synchronized stereo images available for 3D reconstruction. We also applied a Zhang chessboard-based approach during STC/SIMBIO-SYS stereo camera calibration (Simioni et al. 2014, 2017). In this case the targets of the calibration are the stereo heads (the BPLs) of the PANROVER, with the aim of extracting the intrinsic parameters of the optical systems. Unlike previous pipelines, the extrinsic parameters are estimated using the same data bench.
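    A hedged sketch of the Zhang-grid portion of such a calibration pipeline is shown below: chessboard corners are detected and refined, then intrinsics are estimated. For brevity the sketch uses OpenCV's standard (pinhole) calibration call; the paper itself applies Scaramuzza's omnidirectional model, and the folder path and pattern size are assumptions.

        # Sketch: Zhang-style chessboard detection and a basic intrinsic calibration
        # (illustration only; the paper uses Scaramuzza's omnidirectional model,
        # which this standard pinhole call does not implement).
        import glob
        import cv2
        import numpy as np

        pattern = (9, 6)                                  # inner corners per row/column (assumed)
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

        obj_points, img_points, size = [], [], None
        for path in glob.glob("calib/*.png"):             # hypothetical image folder
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
                corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
                obj_points.append(objp)
                img_points.append(corners)
                size = gray.shape[::-1]

        # Intrinsic parameters estimated from the detected grids.
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)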

    Panomorph Based Panoramic Vision Sensors


    Vision-based Navigation and Mapping Using Non-central Catadioptric Omnidirectional Camera

    Omnidirectional catadioptric cameras find use in navigation and mapping owing to their wide field of view. A wider, potentially 360-degree, field of view allows the user to see and move more freely in the navigation space. A catadioptric camera system is a low-cost system consisting of a mirror and a camera. A calibration method was developed to obtain the relative position and orientation between the two components so that they can be treated as one monolithic system. The position of the system in an environment was determined using conditions derived from the reflective properties of the mirror. Object control points were set up and experiments were performed at different sites to test the mathematical models and the location and mapping accuracy achieved by the system. The obtained positions were then used to map the environment.
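    The "reflective properties of the mirror" referred to above amount to specular reflection of a viewing ray at the mirror surface; the sketch below shows that generic reflection computation only, not the full non-central calibration model developed in the work.

        # Sketch: specular reflection of a viewing ray at a mirror point (generic
        # geometry only; the full non-central catadioptric model is in the thesis).
        import numpy as np

        def reflect(direction, normal):
            """Reflect a ray direction about the mirror surface normal."""
            d = direction / np.linalg.norm(direction)
            n = normal / np.linalg.norm(normal)
            return d - 2.0 * np.dot(d, n) * n

        # A ray arriving from above a horizontal mirror leaves with its z-component flipped.
        incoming = np.array([0.3, 0.0, -1.0])
        outgoing = reflect(incoming, np.array([0.0, 0.0, 1.0]))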