
    Camera distortion self-calibration using the plumb-line constraint and minimal Hough entropy

    In this paper we present a simple and robust method for self-correction of camera distortion using single images of scenes which contain straight lines. Since the most common distortion can be modelled as radial distortion, we illustrate the method using the Harris radial distortion model, but the method is applicable to any distortion model. The method is based on transforming the edgels of the distorted image to a 1-D angular Hough space and optimizing the distortion-correction parameters that minimize the entropy of the corresponding normalized histogram. Properly corrected imagery will have fewer curved lines, and therefore less spread in Hough space. Since the method does not rely on any image structure beyond the existence of edgels sharing some common orientations, and does not use edge fitting, it is applicable to a wide variety of image types. For instance, it can be applied equally well to images of texture with weak but dominant orientations, or to images with strong vanishing points. Finally, the method is evaluated on both synthetic and real data, revealing that it is particularly robust to noise.
    Comment: 9 pages, 5 figures. Corrected errors in equation 1.
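
    The optimization loop described above is compact enough to sketch. The following is a minimal illustration, assuming edgels are available as short segments (endpoint pairs) in normalized image coordinates; the one-parameter division-style radial model used here stands in for the Harris model and is an assumption, as are the parameter bounds.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def undistort(points, k):
    """Map distorted points through a one-parameter radial model (assumed form)."""
    r2 = np.sum(points**2, axis=-1, keepdims=True)
    return points / np.sqrt(1.0 + k * r2)

def hough_entropy(k, segments, n_bins=180):
    """Entropy of the 1-D angular histogram over undistorted edgel orientations."""
    p0 = undistort(segments[:, 0], k)
    p1 = undistort(segments[:, 1], k)
    d = p1 - p0
    theta = np.mod(np.arctan2(d[:, 1], d[:, 0]), np.pi)  # orientations in [0, pi)
    hist, _ = np.histogram(theta, bins=n_bins, range=(0.0, np.pi))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))  # entropy of the normalized histogram

def calibrate(segments):
    """Distortion parameter minimizing the angular Hough entropy."""
    res = minimize_scalar(hough_entropy, bounds=(-0.5, 0.5),
                          args=(segments,), method="bounded")
    return res.x
```

    A straight line bent by distortion spreads its edgels over several orientation bins; the correct parameter re-collapses them onto one bin, which is exactly what lowers the entropy.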

    First Evaluation of the CPU, GPGPU and MIC Architectures for Real Time Particle Tracking based on Hough Transform at the LHC

    Recent innovations focused around parallel processing, either through systems containing multiple processors or processors containing multiple cores, hold great promise for enhancing the performance of the trigger at the LHC and extending its physics program. The flexibility of the CMS/ATLAS trigger system allows for easy integration of computational accelerators, such as NVIDIA's Tesla Graphics Processing Unit (GPU) or Intel's Xeon Phi, in the High Level Trigger. These accelerators have the potential to provide faster or more energy-efficient event selection, thus opening up possibilities for new complex triggers that were not previously feasible. At the same time, it is crucial to explore the performance limits achievable on the latest generation of multicore CPUs with the best software optimization methods. In this article, a new tracking algorithm based on the Hough transform is evaluated for the first time on a multi-core Intel Xeon E5-2697v2 CPU, an NVIDIA Tesla K20c GPU, and an Intel Xeon Phi 7120 coprocessor. Preliminary timing performance is presented.
    Comment: 13 pages, 4 figures. Accepted to JINST.
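
    To make the algorithmic content concrete, here is a hedged sketch of a Hough-transform track finder in the transverse plane, using the common small-angle parametrization phi ≈ phi0 + A·r·(q/pT). The constant A, the binning, and the threshold are illustrative assumptions, not values from the article.

```python
import numpy as np

A = 3e-4  # assumed curvature constant absorbing the magnetic field (illustrative)

def hough_vote(hits, n_phi=256, n_qpt=128, qpt_max=1.0):
    """Fill a (phi0, q/pT) accumulator; each hit votes along a line."""
    acc = np.zeros((n_phi, n_qpt), dtype=np.int32)
    qpt = np.linspace(-qpt_max, qpt_max, n_qpt)
    r = np.hypot(hits[:, 0], hits[:, 1])
    phi = np.arctan2(hits[:, 1], hits[:, 0])
    for ri, pi in zip(r, phi):
        # invert phi = phi0 + A*r*(q/pT) for every q/pT bin at once
        phi0 = np.mod(pi - A * ri * qpt, 2 * np.pi)
        bins = (phi0 / (2 * np.pi) * n_phi).astype(int) % n_phi
        acc[bins, np.arange(n_qpt)] += 1
    return acc, qpt

def find_tracks(acc, threshold):
    """Track candidates are accumulator cells with enough hits."""
    return np.argwhere(acc >= threshold)
```

    Each hit votes independently along its own curve in (phi0, q/pT) space, so the accumulation step is embarrassingly parallel, which is what makes this class of algorithm a natural fit for GPU and MIC offloading.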

    Massively Parallel Computing and the Search for Jets and Black Holes at the LHC

    Massively parallel computing could be the next leap necessary to reach an era of new discoveries at the LHC after the Higgs discovery. Scientific computing is a critical component of the LHC experiment, including operation, trigger, the LHC Computing Grid, simulation, and analysis. One way to improve the physics reach of the LHC is to take advantage of the flexibility of the trigger system by integrating coprocessors based on Graphics Processing Units (GPUs) or the Many Integrated Core (MIC) architecture into its server farm. This cutting-edge technology provides not only the means to accelerate existing algorithms, but also the opportunity to develop new algorithms that select events in the trigger that would previously have evaded detection. In this article we describe new algorithms that would allow the trigger to select new topological signatures, including non-prompt jets and black-hole-like objects in the silicon tracker.
    Comment: 15 pages, 11 figures. Submitted to NIM.
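
    As an illustration of why tracker-level signatures such as non-prompt jets become accessible, consider a classic (theta, rho) line Hough over silicon-tracker hits: it finds straight segments whether or not they point back to the beamline, so a displacement cut can be applied afterwards. This is a generic sketch of the idea, not the article's algorithm; all thresholds and units are assumptions.

```python
import numpy as np

def line_hough(hits, n_theta=180, n_rho=200, rho_max=1000.0):
    """Accumulate hits in (theta, rho) space: rho = x*cos(theta) + y*sin(theta)."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=np.int32)
    for x, y in hits:
        rho = x * np.cos(thetas) + y * np.sin(thetas)
        bins = ((rho + rho_max) / (2 * rho_max) * n_rho).astype(int)
        ok = (bins >= 0) & (bins < n_rho)
        acc[np.arange(n_theta)[ok], bins[ok]] += 1
    return acc, thetas

def displaced_candidates(acc, hit_threshold, n_rho=200, rho_max=1000.0):
    """Peaks whose line misses the beamline by more than a cut are displaced."""
    peaks = np.argwhere(acc >= hit_threshold)
    rho = (peaks[:, 1] + 0.5) / n_rho * 2 * rho_max - rho_max
    # |rho| is the distance of the line to the origin; 1.0 mm cut is a placeholder
    return peaks[np.abs(rho) > 1.0]
```

    The beamline-constrained (phi0, q/pT) transform used for prompt tracking cannot represent such segments, which is the sense in which these topologies previously evaded the trigger.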

    Fully coherent follow-up of continuous gravitational-wave candidates: an application to Einstein@Home results

    We characterize and present the details of the follow-up method used on the most significant outliers of the Hough Einstein@Home all-sky search for continuous gravitational waves (arXiv:1207.7176). This follow-up method is based on the two-stage approach introduced in arXiv:1303.2471, consisting of a semicoherent refinement followed by a fully coherent zoom. We quantify the efficiency of the follow-up pipeline using simulated signals in Gaussian noise. This pipeline does not search beyond first-order frequency spindown, and therefore we also evaluate its robustness against second-order spindown. We present the details of the Hough Einstein@Home follow-up (arXiv:1207.7176) on three hardware-injected signals and on the eight most significant outliers of unknown origin.
    Comment: 8 pages, 3 figures, 3 tables.
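
    The remark about second-order spindown can be made quantitative with the standard Taylor phase model phi(t) = 2*pi*(f0*t + f1*t^2/2 + f2*t^3/6): over a long coherent stretch, an unmodelled f2 accumulates phase that degrades the fully coherent zoom. The sketch below uses illustrative numbers only; they are not taken from the search.

```python
import numpy as np

def phase(t, f0, f1, f2=0.0):
    """Accumulated signal phase (radians) up to second-order spindown."""
    return 2 * np.pi * (f0 * t + 0.5 * f1 * t**2 + f2 * t**3 / 6.0)

def max_phase_error(f2, T_coh, n=1000):
    """Worst-case phase offset over a coherent stretch if f2 is ignored."""
    t = np.linspace(0.0, T_coh, n)
    return np.max(np.abs(phase(t, 0.0, 0.0, f2)))

# e.g. a ~50-day coherent zoom with |f2| ~ 1e-19 Hz/s^2 (illustrative values):
T = 50 * 86400.0
print(max_phase_error(1e-19, T))  # several radians of unmodelled phase
```

    Once the unmodelled phase approaches a radian, the coherent detection statistic starts losing signal power, which is why the robustness check matters.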

    3D Reconstruction with Uncalibrated Cameras Using the Six-Line Conic Variety

    We present new algorithms for the recovery of the Euclidean structure from a projective calibration of a set of cameras with square pixels but otherwise arbitrarily varying intrinsic and extrinsic parameters. Our results, based on a novel geometric approach, include a closed-form solution for the case of three cameras and two known vanishing points, and an efficient one-dimensional search algorithm for the case of four cameras and one known vanishing point. In addition, an algorithm for reliable automatic detection of vanishing points in the images is presented. These techniques fit into a 3D reconstruction scheme oriented toward the reconstruction of urban scenes. The satisfactory performance of the techniques is demonstrated with tests on synthetic and real data.
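
    For the automatic vanishing-point detection step, a common approach (sketched here under the assumption that line segments have already been extracted) is RANSAC on pairs of segment lines in homogeneous coordinates; the tolerance and iteration count are placeholders, and this is not necessarily the paper's detector.

```python
import numpy as np

def to_homog_line(seg):
    """Homogeneous line through segment endpoints (x1, y1, x2, y2)."""
    p = np.array([seg[0], seg[1], 1.0])
    q = np.array([seg[2], seg[3], 1.0])
    return np.cross(p, q)

def consistent(vp, seg, tol_deg=2.0):
    """Does the segment point towards the candidate vanishing point?"""
    mid = np.array([(seg[0] + seg[2]) / 2, (seg[1] + seg[3]) / 2])
    d_seg = np.array([seg[2] - seg[0], seg[3] - seg[1]])
    d_vp = vp[:2] / vp[2] - mid if abs(vp[2]) > 1e-9 else vp[:2]  # handle VP at infinity
    cosang = abs(d_seg @ d_vp) / (np.linalg.norm(d_seg) * np.linalg.norm(d_vp))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))) < tol_deg

def ransac_vp(segments, iters=500, rng=np.random.default_rng(0)):
    """Hypothesise a VP from two segments, keep the one with most inliers."""
    best_vp, best_count = None, -1
    for _ in range(iters):
        i, j = rng.choice(len(segments), size=2, replace=False)
        vp = np.cross(to_homog_line(segments[i]), to_homog_line(segments[j]))
        count = sum(consistent(vp, s) for s in segments)
        if count > best_count:
            best_vp, best_count = vp, count
    return best_vp, best_count
```

    Working in homogeneous coordinates lets the same code handle finite vanishing points and points at infinity (parallel image lines) without special cases.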