
    Visibility computation through image generalization

    This dissertation introduces the image generalization paradigm for computing visibility. The paradigm is based on the observation that an image is a powerful tool for computing visibility. An image can be rendered efficiently with the support of graphics hardware, and each of the millions of pixels in the image reports a visible geometric primitive. However, the visibility solution computed by a conventional image is far from complete. A conventional image has a uniform sampling rate, which can miss visible geometric primitives with a small screen footprint. A conventional image can only find geometric primitives to which there is a direct line of sight from the center of projection (i.e. the eye) of the image; therefore, a conventional image cannot compute the set of geometric primitives that become visible as the viewpoint translates, or as time changes in a dynamic dataset. Finally, like any sample-based representation, a conventional image can only confirm that a geometric primitive is visible; it cannot confirm that a geometric primitive is hidden, as that would require an infinite number of samples to confirm that the primitive is hidden at all of its points.

    The image generalization paradigm overcomes these limitations of conventional images. The paradigm has three elements. (1) Sampling pattern generalization adds sampling locations to the image plane where needed to find visible geometric primitives with a small footprint. (2) Visibility sample generalization replaces the conventional scalar visibility sample with a higher-dimensional sample that records all geometric primitives visible at a sampling location as the viewpoint translates or as time changes in a dynamic dataset; the higher-dimensional visibility sample is computed exactly, by solving visibility event equations, and not through sampling. Another form of visibility sample generalization is to enhance a sample with its trajectory as the geometric primitive it samples moves in a dynamic dataset. (3) Ray geometry generalization redefines a camera ray as the set of 3D points that project to a given image location; this generalization supports rays that are not straight lines, and enables designing cameras with non-linear rays that circumvent occluders to gather samples not visible from a reference viewpoint.

    The image generalization paradigm has been used to develop visibility algorithms for a variety of datasets, visibility parameter domains, and performance-accuracy tradeoff requirements. These include an aggressive from-point visibility algorithm that guarantees finding all geometric primitives with a visible fragment, no matter how small the primitive's image footprint; an efficient and robust exact from-point visibility algorithm that iterates between a sample-based and a continuous visibility analysis of the image plane to quickly converge to the exact solution; a from-rectangle visibility algorithm that uses 2D visibility samples to compute a visible set that is exact under viewpoint translation; a flexible pinhole camera that enables local modulation of the sampling rate over the image plane according to an input importance map; an animated depth image that stores not only color and depth per pixel but also a compact representation of pixel sample trajectories; and a curved ray camera that seamlessly integrates multiple viewpoints into a multiperspective image without the viewpoint-transition distortion artifacts of prior methods.
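    The undersampling limitation described above (each pixel reports one visible primitive at a fixed, uniform sampling location) can be seen in a minimal z-buffer. The sketch below is a generic illustration, not code from the dissertation; the primitives, footprints, and depths are invented for the example.

```python
# Minimal z-buffer: each pixel keeps the nearest primitive sampled at its
# center.  A primitive whose footprint falls between pixel centers is missed,
# which is the undersampling problem that sampling-pattern generalization
# addresses.  Generic illustration only.

W, H = 4, 4
depth = [[float("inf")] * W for _ in range(H)]
prim_id = [[None] * W for _ in range(H)]

def rasterize(pid, x0, y0, x1, y1, z):
    """Record `pid` at every pixel center inside its axis-aligned footprint."""
    for py in range(H):
        for px in range(W):
            cx, cy = px + 0.5, py + 0.5          # uniform sampling locations
            if x0 <= cx <= x1 and y0 <= cy <= y1 and z < depth[py][px]:
                depth[py][px] = z
                prim_id[py][px] = pid

rasterize("big", 0.0, 0.0, 4.0, 4.0, z=10.0)     # covers every pixel center
rasterize("small", 1.6, 1.6, 1.9, 1.9, z=5.0)    # nearer, but between centers

visible = {p for row in prim_id for p in row if p is not None}
print(visible)  # the small primitive is missed: {'big'}
```

    Adding a sampling location inside the small primitive's footprint, as sampling pattern generalization does, would recover it.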

    Towards optical intensity interferometry for high angular resolution stellar astrophysics

    Most neighboring stars are still detected as point sources and are beyond the angular resolution reach of current observatories. Methods to improve our understanding of stars at high angular resolution are investigated. Air Cherenkov telescopes (ACTs), primarily used for Gamma-ray astronomy, enable us to increase our understanding of the circumstellar environment of a particular system. When used as optical intensity interferometers, future ACT arrays will allow us to detect stars as extended objects and image their surfaces at high angular resolution. Optical stellar intensity interferometry (SII) with ACT arrays, composed of nearly 100 telescopes, will provide means to measure fundamental stellar parameters and also open the possibility of model-independent imaging. A data analysis algorithm is developed and permits the reconstruction of high angular resolution images from simulated SII data. The capabilities and limitations of future ACT arrays used for high angular resolution imaging are investigated via Monte-Carlo simulations. Simple stellar objects as well as stellar surfaces with localized hot or cool regions can be accurately imaged. Finally, experimental efforts to measure intensity correlations are expounded. The functionality of analog and digital correlators is demonstrated. Intensity correlations have been measured for a simulated star emitting pseudo-thermal light, resulting in angular diameter measurements. The StarBase observatory, consisting of a pair of 3 m telescopes separated by 23 m, is described.
    Comment: PhD dissertation
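    The angular diameter measurements mentioned above rest on the standard uniform-disk model: an intensity interferometer measures the squared fringe visibility as a function of baseline, and the baseline of the first null fixes the diameter. The sketch below is a generic illustration, not code from the dissertation; the 1 mas target and 400 nm wavelength are illustrative numbers.

```python
import math

def j1(x, n=2000):
    """Bessel function J1 via its integral form (1/pi) * int_0^pi cos(t - x sin t) dt,
    evaluated with the trapezoid rule so the sketch stays stdlib-only."""
    h = math.pi / n
    total = 0.5 * (math.cos(0.0) + math.cos(math.pi))  # endpoint terms
    for k in range(1, n):
        t = k * h
        total += math.cos(t - x * math.sin(t))
    return total * h / math.pi

def squared_visibility(baseline, theta, wavelength):
    """Squared visibility of a uniform stellar disk of angular diameter
    `theta` (rad) at projected `baseline` (m) and `wavelength` (m) --
    the quantity an intensity interferometer measures (Airy pattern)."""
    x = math.pi * baseline * theta / wavelength
    if x == 0.0:
        return 1.0
    return (2.0 * j1(x) / x) ** 2

theta = 4.848e-9   # 1 milliarcsecond in radians (illustrative target)
lam = 4.0e-7       # 400 nm
print(squared_visibility(1.0, theta, lam))    # short baseline: close to 1
print(squared_visibility(100.6, theta, lam))  # near the first null: close to 0
```

    The first null at roughly a 100 m baseline for a milliarcsecond star is why ACT arrays, with their long baselines, are attractive for this purpose.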

    Relevance of accurate Monte Carlo modeling in nuclear medical imaging

    Monte Carlo techniques have become popular in different areas of medical physics with the advantage of powerful computing systems. In particular, they have been extensively applied to simulate processes involving random behavior and to quantify physical parameters that are difficult or even impossible to calculate by experimental measurements. Recent nuclear medical imaging innovations such as single-photon emission computed tomography (SPECT), positron emission tomography (PET), and multiple emission tomography (MET) are ideal for Monte Carlo modeling techniques because of the stochastic nature of radiation emission, transport and detection processes. Factors which have contributed to their wider use include improved models of radiation transport processes, the practicality of application with the development of acceleration schemes, and the improved speed of computers. This paper presents the derivation and methodological basis for this approach and critically reviews its areas of application in nuclear imaging. An overview of existing simulation programs is provided and illustrated with examples of some useful features of such sophisticated tools in connection with common computing facilities and more powerful multiple-processor parallel processing systems. Current and future trends in the field are also discussed.
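    As a toy illustration of why radiation transport suits Monte Carlo treatment, the sketch below estimates photon transmission through a slab by sampling free paths from the exponential attenuation law and checks the estimate against the analytic answer. This is not one of the simulation programs the review covers; the attenuation coefficient is an assumed illustrative value and scattering is ignored.

```python
import math
import random

random.seed(0)     # fixed seed for a reproducible estimate

MU = 0.15          # assumed linear attenuation coefficient [1/cm], illustrative
THICKNESS = 10.0   # slab thickness [cm]
N = 100_000        # photon histories

transmitted = 0
for _ in range(N):
    # Sample a free path from p(s) = MU * exp(-MU * s); absorption only,
    # no Compton scattering in this sketch.
    path = -math.log(1.0 - random.random()) / MU
    if path > THICKNESS:
        transmitted += 1

mc_estimate = transmitted / N
analytic = math.exp(-MU * THICKNESS)   # Beer-Lambert transmission
print(mc_estimate, round(analytic, 4))
```

    Real codes layer photoelectric absorption, Compton and coherent scattering, and detector response onto this same sampling loop, which is where the acceleration schemes mentioned above become important.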

    Computer vision and optimization methods applied to the measurements of in-plane deformations

    Peer reviewed

    Improved velocity data in circular jets using an avalanche photodiode-based 2-component point Doppler velocimeter

    An existing Point Doppler Velocimeter (pDv) has been modified in an effort to improve the RMS velocity results. Improvements have been made by reducing the probe volume size by focusing the incident laser beam, thereby reducing the effects of spatial averaging on the mean and RMS velocity measurements. In particular, the PIN photodetectors used previously were tested using the reduced probe volume, and were then replaced by high-gain, high signal-to-noise ratio large-area avalanche photodetectors (APDs). Comparisons have been made between the data acquired and hot-wire data obtained on the same jet flow, as well as data obtained in previous pDv research and two theoretical profiles. Good agreement was found with the mean and RMS velocity hot-wire results at the exit of the jet, and the RMS velocities downstream have been improved as well. (Abstract shortened by UMI.)

    Holistic simulation of optical systems

    For many years, the design of optical systems mainly comprised a linear arrangement of plane or spherical components, such as lenses, mirrors or prisms, and a geometric-optical description by ray tracing led to an accurate and satisfactory result. Today, many modern optical systems found in a variety of different industrial and scientific applications deviate from this structure. Polarization, diffraction and coherence, or material interactions, such as volume or surface scattering, need to be included when reasonable performance predictions are required. Furthermore, manufacturing and alignment aspects must be considered in the design and simulation of optical systems to ensure that their impact is not damaging to the overall purpose of the corresponding setup. Another important part is the growing field of digital optics. Signal processing algorithms have become an indispensable part of many systems, and an almost unlimited number of current and potential applications exists. Since these algorithms are an essential part of the system, their compatibility with and impact on the completed system are important aspects to consider. In principle, this list of relevant topics and examples can be expanded almost without limit. However, the simulation and optimization of the individual sub-aspects often do not lead to a satisfactory result. The goal of this thesis is to demonstrate that the performance prediction of modern optical systems benefits significantly from an aggregation of the individual models and technological aspects. Present concepts are further enhanced by the development and analysis of new approaches and algorithms, leading to a more holistic description and simulation of complex setups as a whole. The long-term objective of this work is comprehensive virtual and rapid prototyping. From an industrial perspective, this would reduce the risk, time and costs associated with the development of an optical system.
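    The geometric-optical ray tracing mentioned above reduces, at each surface, to Snell's law. The vector form below is a minimal self-contained sketch, not taken from the thesis; the 2D ray, surface normal, and refractive indices are invented for the example.

```python
import math

def refract(incident, normal, n1, n2):
    """Refract a unit direction vector at a surface with unit normal
    (vector form of Snell's law).  Returns None on total internal reflection."""
    cos_i = -sum(i * n for i, n in zip(incident, normal))
    eta = n1 / n2
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None  # total internal reflection
    coef = eta * cos_i - math.sqrt(k)
    return tuple(eta * i + coef * n for i, n in zip(incident, normal))

# 30-degree incidence from air (n = 1.0) into glass (n = 1.5)
theta_i = math.radians(30.0)
d = (math.sin(theta_i), -math.cos(theta_i))   # downward-travelling ray
n = (0.0, 1.0)                                # surface normal pointing up
t = refract(d, n, 1.0, 1.5)
theta_t = math.degrees(math.asin(t[0]))       # Snell: sin(t) = sin(30)/1.5
print(round(theta_t, 2))  # 19.47
```

    A holistic simulation, in the sense of this thesis, wraps such a kernel with polarization, diffraction, coherence, tolerancing, and the downstream signal processing chain.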

    Doctor of Philosophy

    Most neighboring stars are still detected as point sources and are beyond the angular resolution reach of current observatories. Methods to improve our understanding of stars at high angular resolution are investigated. Air Cherenkov telescopes (ACTs), primarily used for Gamma-ray astronomy, enable us to increase our understanding of the circumstellar environment of a particular system. When used as optical intensity interferometers, future ACT arrays will allow us to detect stars as extended objects and image their surfaces at high angular resolution. ACTs are used in gamma-ray astronomy to investigate violent phenomena in the universe. However, this technique can also be used for stellar astrophysics on some isolated sources. Such is the case with the X-ray binary LS I +61°303, which was detected in the TeV range. A gamma-ray attenuation model is developed and applied to this system. This model allows us to place constraints on fundamental properties of the system. However, a much better understanding of this system, and more so of nearby bright stellar systems, could be obtained with high angular resolution techniques. Optical stellar intensity interferometry (SII) with ACT arrays, composed of nearly 100 telescopes, will provide means to measure fundamental stellar parameters and also open the possibility of model-independent imaging. A data analysis algorithm is developed and permits the reconstruction of high angular resolution images from simulated SII data. The capabilities and limitations of future ACT arrays used for high angular resolution imaging are investigated via Monte-Carlo simulations. Simple stellar objects as well as stellar surfaces with localized hot or cool regions can be accurately imaged. Finally, experimental efforts to measure intensity correlations are expounded. The functionality of analog and digital correlators is demonstrated. Intensity correlations have been measured for a simulated star emitting pseudo-thermal light, resulting in angular diameter measurements. The StarBase observatory, consisting of a pair of 3 m telescopes separated by 23 m, is described.
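    As a generic illustration of what a digital correlator computes (not the instrument described above), the sketch below forms the normalized zero-lag correlation g2 = <I1 I2> / (<I1><I2>) of two photocurrents that share a pseudo-thermal intensity fluctuation; the deterministic sinusoidal fluctuation stands in for the real random one, so the numbers are purely illustrative.

```python
import math

# Two detectors viewing the same pseudo-thermal source: both photocurrents
# share the source's intensity fluctuation, so the zero-lag correlation g2
# exceeds 1 ("bunching"); g2 - 1 is what encodes the squared visibility.
N = 10_000
i1, i2 = [], []
for k in range(N):
    fluctuation = 0.5 * math.sin(0.01 * k)   # shared slow intensity variation
    i1.append(1.0 + fluctuation)
    i2.append(1.0 + fluctuation)

mean1 = sum(i1) / N
mean2 = sum(i2) / N
g2 = sum(a * b for a, b in zip(i1, i2)) / (N * mean1 * mean2)
print(round(g2, 3))  # greater than 1 for bunched (thermal) intensities
```

    With independent detector noise added, g2 - 1 shrinks toward zero as the baseline resolves the source, which is how the angular diameter measurement proceeds.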

    Treatise on Hearing: The Temporal Auditory Imaging Theory Inspired by Optics and Communication

    A new theory of mammalian hearing is presented, which accounts for the auditory image in the midbrain (inferior colliculus) of objects in the acoustical environment of the listener. It is shown that the ear is a temporal imaging system that comprises three transformations of the envelope functions: cochlear group-delay dispersion, cochlear time lensing, and neural group-delay dispersion. These elements are analogous to the optical transformations in vision of diffraction between the object and the eye, spatial lensing by the lens, and second diffraction between the lens and the retina. Unlike the eye, it is established that the human auditory system is naturally defocused, so that coherent stimuli are unaffected by the defocus, whereas completely incoherent stimuli are impacted by it and may be blurred by design. It is argued that the auditory system can use this differential focusing to enhance or degrade the images of real-world acoustical objects that are partially coherent. The theory is founded on coherence and temporal imaging theories that were adopted from optics. In addition to the imaging transformations, the corresponding inverse-domain modulation transfer functions are derived and interpreted with consideration to the nonuniform neural sampling operation of the auditory nerve. These ideas are used to rigorously introduce the concepts of sharpness and blur in auditory imaging, auditory aberrations, and auditory depth of field. In parallel, ideas from communication theory are used to show that the organ of Corti functions as a multichannel phase-locked loop (PLL) that constitutes the point of entry for auditory phase locking and hence conserves the signal coherence. It provides an anchor for a dual coherent and noncoherent auditory detection in the auditory brain that culminates in auditory accommodation. Implications for hearing impairments are discussed as well.
    Comment: 603 pages, 131 figures, 13 tables, 1570 references
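    The group-delay dispersion at the heart of the theory obeys the same mathematics as chromatic dispersion of optical pulses. The sketch below applies the standard Gaussian-pulse broadening formula from ultrafast optics; the auditory-scale numbers are purely illustrative assumptions, not values from the treatise.

```python
import math

def broadened_width(t0, gdd):
    """Standard Gaussian-pulse dispersion result: an envelope
    exp(-t^2 / (2 * t0^2)) passing through total group-delay dispersion
    `gdd` emerges with width t0 * sqrt(1 + (gdd / t0**2)**2)."""
    return t0 * math.sqrt(1.0 + (gdd / t0**2) ** 2)

t0 = 1.0e-3    # 1 ms envelope width, an auditory-scale duration (illustrative)
gdd = 2.0e-6   # total group-delay dispersion in s^2 (illustrative)
print(broadened_width(t0, gdd) / t0)  # broadening factor sqrt(1 + 4) ~ 2.236
```

    In the temporal imaging analogy, a blurred auditory image corresponds to exactly this kind of envelope broadening when the cochlear and neural dispersion terms fail to cancel through the time lens.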