
    Reflectance Intensity Assisted Automatic and Accurate Extrinsic Calibration of 3D LiDAR and Panoramic Camera Using a Printed Chessboard

    This paper presents a novel method for fully automatic and convenient extrinsic calibration of a 3D LiDAR and a panoramic camera using an ordinary printed chessboard. The proposed method is based on estimating the 3D corners of the chessboard from the sparse point cloud generated by a single LiDAR scan. To estimate the corners, we formulate a full-scale model of the chessboard and fit it to the segmented 3D points of the chessboard. The model is fitted by optimizing a cost function that exploits the correlation between the laser reflectance intensity and the color of the chessboard's patterns. Powell's method is introduced to resolve the discontinuity problem in the optimization. The corners of the fitted model are taken as the 3D corners of the chessboard. Once the chessboard corners in the 3D point cloud are estimated, the extrinsic calibration of the two sensors is converted into a 3D-2D matching problem. The corresponding 3D-2D points are used to calculate the absolute pose of the two sensors with the Unified Perspective-n-Point (UPnP) method. These parameters are then used as initial values and refined with the Levenberg-Marquardt method. The performance of the proposed corner detection from the 3D point cloud is evaluated in simulation. Experiments conducted on a Velodyne HDL-32e LiDAR and a Ladybug3 camera, evaluated under the proposed re-projection error metric, qualitatively and quantitatively demonstrate the accuracy and stability of the final extrinsic calibration parameters. Comment: 20 pages, submitted to the journal Remote Sensing.
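
    For illustration only, the 3D-2D step described above can be sketched with OpenCV: an initial pose from a PnP solver, Levenberg-Marquardt refinement, and a re-projection error check. The paper uses UPnP with a panoramic camera model; the sketch below assumes a pinhole camera with known intrinsics, and the function and variable names are ours, not the authors'.

```python
# Illustrative sketch (not the authors' code): once chessboard corners are
# known in the LiDAR frame (3D) and in the image (2D), the extrinsics reduce
# to a PnP problem. The paper uses UPnP on a panoramic camera; here we assume
# a pinhole model with known intrinsics K for simplicity.
import numpy as np
import cv2

def estimate_extrinsics(corners_3d, corners_2d, K, dist=None):
    """corners_3d: (N, 3) chessboard corners in the LiDAR frame,
       corners_2d: (N, 2) the same corners detected in the image."""
    corners_3d = np.asarray(corners_3d, dtype=np.float64)
    corners_2d = np.asarray(corners_2d, dtype=np.float64)
    dist = np.zeros(5) if dist is None else dist

    # Initial estimate (EPnP here; the paper uses UPnP).
    ok, rvec, tvec = cv2.solvePnP(corners_3d, corners_2d, K, dist,
                                  flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        raise RuntimeError("PnP failed")

    # Non-linear refinement with Levenberg-Marquardt, as in the paper.
    rvec, tvec = cv2.solvePnPRefineLM(corners_3d, corners_2d, K, dist,
                                      rvec, tvec)

    # Mean re-projection error, the metric used to evaluate the calibration.
    proj, _ = cv2.projectPoints(corners_3d, rvec, tvec, K, dist)
    err = np.linalg.norm(proj.reshape(-1, 2) - corners_2d, axis=1).mean()
    return rvec, tvec, err
```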

    Real-Time Panoramic Tracking for Event Cameras

    Event cameras are a paradigm shift in camera technology. Instead of full frames, the sensor captures a sparse set of events caused by intensity changes. Since only the changes are transferred, these cameras can capture quick movements of objects in the scene or of the camera itself. In this work we propose a novel method to perform camera tracking of event cameras in a panoramic setting with three degrees of freedom. We propose a direct camera tracking formulation, similar to state-of-the-art approaches in visual odometry. We show that the minimal information needed for simultaneous tracking and mapping is the spatial position of events, without using the appearance of the imaged scene point. We verify the robustness to fast camera movements and dynamic objects in the scene on a recently proposed dataset and on self-recorded sequences. Comment: Accepted to the International Conference on Computational Photography 2017.
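
    As a rough geometric sketch of the 3-DoF panoramic setting (not the paper's direct tracking formulation), each event can be back-projected to a bearing ray, rotated by the current orientation estimate, and placed on an equirectangular panorama. The names below (K_inv, R, pano_w, pano_h) are illustrative assumptions.

```python
# Minimal geometry sketch: with a rotation-only (3-DoF) model, an event at
# pixel (x, y) is back-projected, rotated by the current orientation R, and
# mapped to equirectangular panorama coordinates. The paper builds a map from
# event positions alone and tracks by aligning new events against it.
import numpy as np

def event_to_panorama(x, y, K_inv, R, pano_w, pano_h):
    """Map an event at pixel (x, y) onto an equirectangular panorama."""
    ray = K_inv @ np.array([x, y, 1.0])        # back-project to a bearing ray
    ray = R @ (ray / np.linalg.norm(ray))      # rotate into the map frame
    lon = np.arctan2(ray[0], ray[2])           # azimuth in (-pi, pi]
    lat = np.arcsin(np.clip(ray[1], -1, 1))    # elevation in [-pi/2, pi/2]
    u = (lon / (2 * np.pi) + 0.5) * pano_w
    v = (lat / np.pi + 0.5) * pano_h
    return int(u) % pano_w, min(int(v), pano_h - 1)
```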

    Geometrical Calibration for the Panrover: a Stereo Omnidirectional System for Planetary Rover

    A novel panoramic stereo imaging system is proposed in this paper. The system is able to carry out 360° stereoscopic vision, useful for rover autonomous driving, and simultaneously capture a high-resolution stereo scene. The core of the concept is a novel "bifocal panoramic lens" (BPL) based on the hyper-hemispheric model (Pernechele et al. 2016). This BPL is able to record a panoramic field of view (FoV) and, simultaneously, an area (belonging to the panoramic FoV) with a given degree of magnification using a single image sensor. This strategy makes it possible to avoid rotational mechanisms. Using two BPLs arranged along a vertical baseline (a system called PANROVER) allows monitoring of the surrounding environment in stereoscopic (3D) mode and, simultaneously, capturing high-resolution stereoscopic images for scientific analysis, making it a new paradigm in the planetary rover framework. Unlike the majority of Mars systems, which rely on rotational mechanisms to acquire panoramic images (mosaicked on the ground), the PANROVER contains no moving components and can deliver high-rate stereo images of the context panorama. The scope of this work is the geometric calibration of the panoramic acquisition system using omnidirectional calibration methods (Scaramuzza et al. 2006) based on the Zhang calibration grid. The procedures are applied in order to obtain well-rectified, synchronized stereo images suitable for 3D reconstruction. We applied a Zhang chessboard-based approach also during the STC/SIMBIO-SYS stereo camera calibration (Simioni et al. 2014, 2017). In this case the targets of the calibration are the stereo heads (the BPLs) of the PANROVER, with the aim of extracting the intrinsic parameters of the optical systems. Unlike previous pipelines, the extrinsic parameters are estimated from the same data bench.
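
    To illustrate the chessboard-based procedure, a minimal Zhang-style calibration with OpenCV might look as follows. Note that the PANROVER pipeline fits the Scaramuzza omnidirectional model rather than the pinhole model used here, and the pattern size, square size, and image folder are assumptions.

```python
# Sketch of a Zhang-style chessboard calibration with OpenCV (illustrative
# only; the actual work uses the Scaramuzza omnidirectional camera model).
import glob
import numpy as np
import cv2

PATTERN = (9, 6)          # inner chessboard corners (assumption)
SQUARE = 0.025            # square size in metres (assumption)

# One set of 3D grid points (Z = 0 plane) reused for every view.
grid = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
grid[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib_images/*.png"):   # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_pts.append(grid)
        img_pts.append(corners)
        size = gray.shape[::-1]

# Intrinsics (and per-view extrinsics) from the detected grids.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size,
                                                 None, None)
print("RMS re-projection error:", rms)
```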

    3D modeling of indoor environments by a mobile platform with a laser scanner and panoramic camera

    One major challenge of 3DTV is content acquisition. Here, we present a method to acquire a realistic, visually convincing 3D model of indoor environments based on a mobile platform that is equipped with a laser range scanner and a panoramic camera. The data of the 2D laser scans are used to solve the simultaneous localization and mapping problem and to extract walls. Textures for walls and floor are built from the images of a calibrated panoramic camera. Multiresolution blending is used to hide seams in the generated textures. The scene is further enriched by 3D geometry calculated from a graph cut stereo technique. We present experimental results from a moderately large real environment.
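
    As an illustration of the multiresolution blending step, a Laplacian-pyramid blend of two overlapping texture strips might look like the sketch below; the function name, pyramid depth, and the assumption of same-shaped float32 inputs in [0, 1] are ours, not the paper's.

```python
# Illustrative multiresolution blending: the two textures are combined through
# Laplacian pyramids so that low frequencies blend over a wide region and high
# frequencies over a narrow one, which hides seams between texture patches.
import numpy as np
import cv2

def laplacian_blend(img_a, img_b, mask, levels=5):
    """Blend img_a and img_b (float32 in [0, 1], same shape) using mask in [0, 1]."""
    gp_a, gp_b, gp_m = [img_a], [img_b], [mask]
    for _ in range(levels):                      # Gaussian pyramids
        gp_a.append(cv2.pyrDown(gp_a[-1]))
        gp_b.append(cv2.pyrDown(gp_b[-1]))
        gp_m.append(cv2.pyrDown(gp_m[-1]))

    blended = None
    for i in range(levels, -1, -1):              # coarse-to-fine reconstruction
        if i == levels:
            lap_a, lap_b = gp_a[i], gp_b[i]      # coarsest level: plain Gaussians
        else:
            size = (gp_a[i].shape[1], gp_a[i].shape[0])
            lap_a = gp_a[i] - cv2.pyrUp(gp_a[i + 1], dstsize=size)
            lap_b = gp_b[i] - cv2.pyrUp(gp_b[i + 1], dstsize=size)
        level = gp_m[i] * lap_a + (1 - gp_m[i]) * lap_b
        if blended is None:
            blended = level
        else:
            size = (level.shape[1], level.shape[0])
            blended = cv2.pyrUp(blended, dstsize=size) + level
    return np.clip(blended, 0, 1)
```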

    Omnidirectional underwater surveying and telepresence

    Exploratory dives are traditionally the first step for marine scientists to acquire information on a previously unknown area of scientific interest. Manned submersibles have been the platform of choice for such exploration, as they allow a high level of environmental perception by the scientist on board and the ability to take informed decisions on what to explore next. However, manned submersibles have extremely high operating costs and provide very limited bottom time. Remotely operated vehicles (ROVs) can partially address these two issues, but have operational and cost constraints that restrict their usage. This paper discusses new capabilities to assist scientists operating lightweight hybrid remotely operated vehicles (HROVs) in exploratory missions of mapping and surveying. The new capabilities, under development within the Spanish national project OMNIUS, provide a new layer of autonomy for HROVs by exploring three key concepts: omnidirectional optical sensing for collaborative immersive exploration, proximity safety awareness, and online mapping during mission time. (Peer reviewed)

    Beyond Gr\"obner Bases: Basis Selection for Minimal Solvers

    Many computer vision applications require robust estimation of the underlying geometry, in terms of camera motion and 3D structure of the scene. These robust methods often rely on running minimal solvers in a RANSAC framework. In this paper we show how polynomial solvers based on the action matrix method can be made faster by careful selection of the monomial bases. These monomial bases have traditionally been based on a Gröbner basis for the polynomial ideal. Here we describe how to enumerate all such bases in an efficient way. We also show that going beyond Gröbner bases leads to more efficient solvers in many cases. We present a novel basis sampling scheme that we evaluate on a number of problems.
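
    To give a flavour of the action matrix method, the univariate case reduces to the familiar companion matrix: the matrix of multiplication by x in the quotient ring R[x]/(p), whose eigenvalues are the roots of p. The sketch below is only this one-variable illustration, not the paper's multivariate solver; multivariate minimal solvers generalise this construction, and the choice of monomial basis for the quotient ring is exactly what the paper optimises.

```python
# Minimal illustration of the action-matrix idea (not the paper's solver):
# in the univariate case the multiplication-by-x matrix in R[x]/(p) is the
# companion matrix, and its eigenvalues are the roots of p.
import numpy as np

def action_matrix_roots(coeffs):
    """Roots of p(x) = coeffs[0]*x^n + ... + coeffs[n] via the action matrix."""
    c = np.asarray(coeffs, dtype=float)
    c = c / c[0]                        # make the polynomial monic
    n = len(c) - 1
    M = np.zeros((n, n))
    M[1:, :-1] = np.eye(n - 1)          # shift: x * x^k -> x^(k+1)
    M[:, -1] = -c[1:][::-1]             # reduce x^n modulo p(x)
    return np.linalg.eigvals(M)         # eigenvalues = roots of p

# Example: p(x) = x^2 - 3x + 2 has roots 1 and 2.
print(sorted(action_matrix_roots([1, -3, 2]).real))
```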