
    Fast measurement of diffusing patterns and its parameterization

    When light strikes a diffuser surface, the energy is redistributed. The diffusion pattern depends on the material of the diffuser and on how the light reaches it. The aim of this project is to characterize a simple and compact system that allows us to obtain the transmission patterns of different diffusers in order to build a mathematical, experimentally fitted model of them. After defining and constructing the experimental device, a first characterization step is carried out, covering the geometrical, photo-temporal and photo-spatial characterization of the system. With the experimental device well characterized, a set of six different diffusers is analyzed, yielding the equations that describe their diffusion patterns.
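
The abstract describes fitting a mathematical model to measured transmission patterns. A minimal sketch of that kind of fit is shown below, assuming synthetic angular intensity samples and a simple Gaussian model; the paper's actual model and data are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical angular samples of a transmitted intensity pattern (degrees -> counts).
angles = np.linspace(-30, 30, 61)
measured = 1000 * np.exp(-angles**2 / (2 * 12.0**2)) + np.random.normal(0, 10, angles.size)

def gaussian(theta, amplitude, sigma):
    """Simple rotationally symmetric diffusion model; the paper's fitted model may differ."""
    return amplitude * np.exp(-theta**2 / (2 * sigma**2))

params, _ = curve_fit(gaussian, angles, measured, p0=(900, 10))
amplitude, sigma = params
print(f"fitted amplitude = {amplitude:.1f}, angular width sigma = {sigma:.2f} deg")
```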

    WIDGET: System Performance and GRB Prompt Optical Observations

    The WIDeField telescope for Gamma-ray burst Early Timing (WIDGET) is used for a fully automated, ultra-wide-field survey aimed at detecting the prompt optical emission associated with Gamma-ray Bursts (GRBs). WIDGET surveys the HETE-2 and Swift/BAT pointing directions, covering a total field of view of 62 degrees x 62 degrees every 10 seconds with an unfiltered system. This monitoring survey allows exploration of the optical emission before the gamma-ray trigger. The unfiltered magnitude converts well to the SDSS r' system, at the 0.1 mag level. Since 2004, WIDGET has made a total of ten simultaneous and one pre-trigger GRB observations. The efficiency of synchronized observation with HETE-2 is four times better than that with Swift. There has been no bright optical emission similar to that from GRB 080319B, and the statistical analysis implies that GRB 080319B is a rare event. This paper summarizes the design and operation of the WIDGET system and the simultaneous GRB observations obtained with this instrument. Comment: 19 pages, 11 figures, accepted to appear in PAS
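
The abstract notes that the unfiltered magnitudes convert to the SDSS r' system at the 0.1 mag level. A hedged sketch of that kind of photometric transformation follows; the zero point and colour-term coefficient are illustrative placeholders, not WIDGET's calibrated values.

```python
# Minimal sketch of an unfiltered-to-r' photometric transformation of the kind
# described in the abstract; coefficients are placeholders, not WIDGET's values.
def unfiltered_to_r(instrumental_mag, g_minus_r, zero_point=22.5, colour_term=0.1):
    """Convert an unfiltered instrumental magnitude to an approximate SDSS r' magnitude."""
    return instrumental_mag + zero_point + colour_term * g_minus_r

# Example star: instrumental magnitude -8.3, catalogue colour g - r = 0.4.
print(unfiltered_to_r(-8.3, 0.4))
```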

    Graph-Based Classification of Omnidirectional Images

    Omnidirectional cameras are widely used in areas such as robotics and virtual reality, as they provide a wide field of view. Their images are often processed with classical methods, which can lead to non-optimal solutions because these methods are designed for planar images whose geometrical properties differ from omnidirectional ones. In this paper we study the image classification task while taking into account the specific geometry of omnidirectional cameras through graph-based representations. In particular, we extend deep learning architectures to data on graphs and propose a principled way of constructing the graph such that convolutional filters respond similarly to the same pattern at different positions in the image, regardless of lens distortions. Our experiments show that the proposed method outperforms current techniques for the omnidirectional image classification problem.
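
For orientation, a minimal sketch of one graph-convolution step over image pixels treated as graph nodes is shown below. It uses a plain 4-neighbour grid adjacency; the paper's contribution is a graph construction whose edge weights compensate for the omnidirectional lens geometry, which is not reproduced here.

```python
import numpy as np

# Build a 4-neighbour grid graph over a tiny 4x4 "image"; each pixel is one node.
h, w = 4, 4
n = h * w
adj = np.zeros((n, n))
for y in range(h):
    for x in range(w):
        i = y * w + x
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            yy, xx = y + dy, x + dx
            if 0 <= yy < h and 0 <= xx < w:
                adj[i, yy * w + xx] = 1.0

# Symmetric normalisation D^{-1/2} (A + I) D^{-1/2}, as in standard graph convolutions.
adj_hat = adj + np.eye(n)
d_inv_sqrt = np.diag(1.0 / np.sqrt(adj_hat.sum(axis=1)))
norm_adj = d_inv_sqrt @ adj_hat @ d_inv_sqrt

features = np.random.rand(n, 3)   # per-pixel input features (e.g. RGB)
weights = np.random.rand(3, 8)    # learnable filter weights
output = np.maximum(norm_adj @ features @ weights, 0.0)  # one ReLU graph-conv layer
print(output.shape)  # (16, 8)
```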

    Calibration and removal of lateral chromatic aberration in images

    This paper addresses the problem of compensating for lateral chromatic aberration in digital images through colour plane realignment. Two main contributions are made: the derivation of a model for lateral chromatic aberration in images, and the subsequent calibration of this model from a single view of a chess pattern. These advances lead to a practical and accurate alternative for the compensation of lateral chromatic aberration. Experimental results validate the proposed models and calibration algorithm. The effect of colour channel correlations resulting from the camera colour filter array interpolation is examined and found to have a negligible magnitude relative to the chromatic aberration. Results with real data show how the removal of lateral chromatic aberration significantly improves the colour quality of the image.
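
A hedged sketch of colour-plane realignment is given below, assuming a first-order radial model r' = r (1 + k1 r^2); the paper's derived model and the coefficients obtained from the chess-pattern calibration are not reproduced, so the value of k1 here is only a placeholder.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def realign_channel(channel, centre, k1):
    """Warp one colour plane by a radial scaling r' = r * (1 + k1 * r^2),
    a common first-order model for lateral chromatic aberration; in practice
    k1 would come from a calibration such as the chess-pattern one described."""
    h, w = channel.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    dx, dy = xx - centre[0], yy - centre[1]
    scale = 1.0 + k1 * (dx**2 + dy**2)
    src_x = centre[0] + dx * scale
    src_y = centre[1] + dy * scale
    return map_coordinates(channel, [src_y, src_x], order=1, mode='nearest')

# Illustrative use: realign the red plane of a synthetic image (k1 is a placeholder).
image = np.random.rand(480, 640, 3)
image[..., 0] = realign_channel(image[..., 0], centre=(320, 240), k1=1e-8)
```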

    Calibration Methods of Characterization Lens for Head Mounted Displays

    This thesis concerns the calibration, characterization and utilization of the HMD Eye, OptoFidelity’s eye-mimicking optical camera system designed for the HMD IQ, a complete test station for near-eye displays implemented in virtual and augmented reality systems. Its optical architecture provides a 120 degree field of view with high imaging performance and linear radial distortion, ideal for analysis of all possible object fields. The HMD Eye has an external, mechanical entrance pupil of the same size as the human entrance pupil. The spatial frequency response (the modulation transfer function) has been used to develop sensor focus calibration methods and automation system plans. Geometrical distortion and its relation to the angular mapping function and imaging quality of the system are also considered. The nature of the user interface for human eyes, called the eyebox, and the optical properties of head mounted displays are reviewed. Head mounted displays usually consist of two near-eye displays among other components, such as position tracking units. The HMD Eye enables looking inside the device from the eyebox and collecting optical signals (i.e. the virtual image) from the complete field of view of the device under test in a single image. The HMD Eye unit inspected in this thesis is from the ’zero’ batch, i.e. a test unit. The outcome of the calibration was that this HMD Eye unit is focused to 1.6 m with an approximate error margin of ±10 cm. The contrast drops by 50% at an angular frequency of approximately 11 cycles/degree, which is about 40% of the simulated value, prompting improvements in the mechanical design. Geometrical distortion results show that the radial distortion is very linear (maximum error of 1%) and that the tangential distortion has a negligible effect (at most 0.04 degrees of azimuth deviation) within the measurement region.
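
The quoted 1% maximum linearity error of the radial distortion corresponds to the kind of check sketched below: fit an ideal f-theta line to measured image heights and report the worst relative deviation. The field angles and image heights used are synthetic placeholders, not HMD Eye data.

```python
import numpy as np

# Synthetic measured mapping: field angle (degrees) -> image height (mm),
# with a small artificial non-linearity added for illustration.
field_angles = np.linspace(0, 60, 13)
image_heights = 0.05 * field_angles * (1 + 0.01 * np.sin(field_angles / 60 * np.pi))

# Best-fit f-theta slope through the origin (mm per degree).
slope = image_heights @ field_angles / (field_angles @ field_angles)
ideal = slope * field_angles

# Skip the on-axis point to avoid dividing by zero image height.
relative_error = np.abs(image_heights[1:] - ideal[1:]) / ideal[1:]
print(f"maximum radial-distortion linearity error: {relative_error.max() * 100:.2f}%")
```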