
    Compression, Modeling, and Real-Time Rendering of Realistic Materials and Objects

    The realism of a scene depends primarily on the quality of its geometry, illumination, and materials. Whereas many sources for creating three-dimensional geometry exist and numerous algorithms for approximating global illumination have been presented, the acquisition and rendering of realistic materials remain challenging. Realistic materials are important in computer graphics because they describe the reflectance properties of surfaces, which arise from the interaction of light and matter. The real world contains an enormous diversity of materials with very different properties. One important objective in computer graphics is to understand these processes, to formalize them, and finally to simulate them. Various analytical models already exist for this purpose, but their parameterization remains difficult because the number of parameters is usually very high, and they fail for very complex real-world materials. Measured materials, on the other hand, suffer from long acquisition times and huge input data sizes. Although very efficient statistical compression algorithms have been presented, most of them do not allow for editability, such as altering the diffuse color or mesostructure. In this thesis, a material representation is introduced that makes these features editable, so that acquisition results can be re-used to easily and quickly create variations of the original material. These variations may be subtle or substantial, allowing for a wide spectrum of material appearances. The approach presented in this thesis is not based on compression but on a decomposition of the surface into several materials with different reflection properties. Based on a microfacet model, the light-matter interaction is represented by a function that can be stored in an ordinary two-dimensional texture. Additionally, depth information, local rotations, and the diffuse color are stored in these textures. Because the decomposition inevitably loses some of the original information, an algorithm for the efficient simulation of subsurface scattering is presented as well. Another contribution of this work is a novel perception-based simplification metric that takes the material of an object into account. This metric incorporates features of the human visual system, for example trichromatic color perception and reduced resolution. The proposed metric allows for more aggressive simplification in regions where purely geometric metrics would not simplify.
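    As an illustration of storing a microfacet reflection term in an ordinary two-dimensional texture, the sketch below tabulates a simple model over the half angle and difference angle. Every specific choice here (the GGX distribution, the Schlick Fresnel approximation, the 64x64 resolution) is an illustrative assumption, not the thesis' actual decomposition.

```python
import numpy as np

def ggx_ndf(cos_theta_h, alpha):
    """GGX (Trowbridge-Reitz) normal distribution of microfacet orientations."""
    a2 = alpha * alpha
    d = cos_theta_h * cos_theta_h * (a2 - 1.0) + 1.0
    return a2 / (np.pi * d * d)

def bake_texture(res=64, alpha=0.3, f0=0.04):
    """Bake D(theta_h) * Fresnel(theta_d) into a res x res lookup texture."""
    cos_th = np.cos(np.linspace(0.0, np.pi / 2, res))[:, None]  # half angle
    cos_td = np.cos(np.linspace(0.0, np.pi / 2, res))[None, :]  # difference angle
    fresnel = f0 + (1.0 - f0) * (1.0 - cos_td) ** 5  # Schlick approximation
    return ggx_ndf(cos_th, alpha) * fresnel

tex = bake_texture()          # a 64x64 texture a renderer could sample at runtime
assert tex.shape == (64, 64)
```

    At render time such a texture is sampled with the two angles computed per pixel, which is what makes this family of representations attractive for real-time use.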

    Design And Assessment Of Compact Optical Systems Towards Special Effects Imaging

    A main challenge in the field of special effects is to create effects in real time, so that the user can preview the effect before taking the actual picture or movie sequence. Many techniques are currently used to create computer-simulated special effects; however, current computer-graphics techniques do not offer real-time texture synthesis. Thus, while computer graphics is a powerful tool in the field of special effects, it is neither portable nor capable of operating in real time. Real-time special effects may, however, be created optically. Such an approach provides not only real-time image processing at the speed of light but also a preview option, allowing the user or artist to preview the effect on various parts of the object in order to optimize the outcome. The work presented in this dissertation was inspired by the idea of optically created special effects, such as painterly effects, encoded in images captured by photographic or motion-picture cameras. As part of the presented work, compact relay optics was assessed and developed, and a working prototype was built. It was concluded that even though compact relay optics can be achieved, further gains in compactness and cost-effectiveness were impossible within the paradigm of bulk macro-optics systems. Thus, a paradigm for imaging with multi-aperture micro-optics was proposed and demonstrated for the first time, which constitutes one of the key contributions of this work. This new paradigm was further extended to the most general case of magnifying multi-aperture micro-optical systems. Such a paradigm allows an extreme reduction in the size of the imaging optics, by a factor of about 10, and a reduction in weight by a factor of about 500. Furthermore, an experimental quantification of the feasibility of optically created special effects was completed, and raytracing software was consequently developed and later commercialized by SmARTLens(TM). While the art forms created via raytracing were powerful, they did not predict all effects acquired experimentally. Thus, finally, as a key contribution of this work, the principles of scalar diffraction theory were applied to the optical imaging of extended objects under quasi-monochromatic incoherent illumination, in order to provide a path to more accurately model the proposed optical imaging process for special effects obtained in hardware. The existing theoretical framework was generalized to non-paraxial in- and out-of-focus imaging, and results were obtained to verify the generalized framework. In the generalized non-paraxial framework, even the most complex linear systems, without any assumption of shift invariance, can be modeled and analyzed.
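    For context, the textbook shift-invariant special case of incoherent imaging can be sketched as follows: the image intensity is the object intensity convolved with the squared magnitude of the amplitude point spread function derived from the pupil. This is a minimal illustration under standard assumptions; the dissertation's contribution is precisely the generalization beyond this shift-invariant setting.

```python
import numpy as np

def incoherent_image(obj, pupil):
    """Image intensity = object intensity convolved with |amplitude PSF|^2."""
    h = np.fft.ifft2(np.fft.ifftshift(pupil))   # amplitude PSF from the pupil
    psf = np.abs(h) ** 2                        # incoherent (intensity) PSF
    psf /= psf.sum()                            # normalize: conserve energy
    otf = np.fft.fft2(psf)                      # optical transfer function
    return np.real(np.fft.ifft2(np.fft.fft2(obj) * otf))

n = 128
yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
pupil = (xx**2 + yy**2 <= (n // 8) ** 2).astype(float)  # circular aperture
obj = np.zeros((n, n))
obj[n // 2, n // 2] = 1.0                       # a single point source
img = incoherent_image(obj, pupil)              # the imaged point = system PSF
assert abs(img.sum() - 1.0) < 1e-6              # total energy is conserved
```

    Imaging a point source reproduces the system PSF; replacing the circular pupil with a structured mask is the basic mechanism by which an optical element can encode an effect directly in the captured image.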

    Intensity Mapping Functions For HDR Panorama Imaging: Weighted Histogram Averaging

    It is challenging to stitch multiple images with different exposures due to possible color distortion and loss of detail in the brightest and darkest regions of the input images. In this paper, a novel intensity mapping algorithm is first proposed by introducing a new concept of weighted histogram averaging (WHA). The proposed WHA algorithm leverages the correspondence between the histogram bins of two images, which is built up using the non-decreasing property of intensity mapping functions (IMFs). The WHA algorithm is then adopted to synthesize a set of differently exposed panorama images. These intermediate panorama images are finally fused via a state-of-the-art multi-scale exposure fusion (MEF) algorithm to produce the final panorama. Extensive experiments indicate that the proposed WHA algorithm significantly surpasses related state-of-the-art intensity mapping methods. The proposed high dynamic range (HDR) stitching algorithm also preserves details well in the brightest and darkest regions of the input images. The related materials will be publicly accessible at https://github.com/yilun-xu/WHA for reproducible research.
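    A simplified stand-in for IMF estimation can be sketched via cumulative-histogram matching, which rests on the same non-decreasing property the paper exploits; the actual WHA method additionally weights and averages over corresponding histogram bins. All names below are illustrative.

```python
import numpy as np

def estimate_imf(src, ref, bins=256):
    """Map 8-bit intensities of `src` onto the tonal range of `ref`
    by matching their cumulative histograms (a monotone mapping)."""
    h_src, _ = np.histogram(src, bins=bins, range=(0, bins))
    h_ref, _ = np.histogram(ref, bins=bins, range=(0, bins))
    cdf_src = np.cumsum(h_src) / h_src.sum()
    cdf_ref = np.cumsum(h_ref) / h_ref.sum()
    # For each source level, pick the reference level with the nearest CDF.
    imf = np.searchsorted(cdf_ref, cdf_src).clip(0, bins - 1)
    return imf.astype(np.uint8)

short_exp = np.random.randint(0, 128, (64, 64))   # darker exposure
long_exp = (short_exp * 1.9).clip(0, 255)         # simulated brighter exposure
imf = estimate_imf(short_exp, long_exp)
assert np.all(np.diff(imf.astype(int)) >= 0)      # the IMF is non-decreasing
```

    Applying `imf[short_exp]` would re-expose the darker image toward the brighter one, which is the building block the stitching pipeline uses to synthesize a consistent set of differently exposed panoramas.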

    Design of a Miniature Camera System for Interior Vision Automotive Application

    The purpose of this thesis is to describe the design process, goals, and analysis of an interior vision camera for a driver monitoring system. The design minimizes the overall footprint of the system by utilizing smaller, more precise optics, as well as image sensor technologies and packaging with higher quantum efficiency (QE). As a result of this research, prototype cameras were constructed and their performance was analyzed. The analysis shows that Modulation Transfer Function (MTF) performance is stable at extreme hot and cold temperatures, while cost is mitigated by using all-plastic lens elements. New high-QE image sensors are a potential improvement to this design. The mechanical part of the design resulted in the filing of three patents: the first covers the athermalization spacer itself for automotive applications; the second covers the way the lens barrel interacts with the athermalization piece; and the third covers the way the imager assembly accommodates the same Bill of Materials (BOM) components with different customer-required angles.

    A robust patch-based synthesis framework for combining inconsistent images

    Current methods for combining different images produce visible artifacts when the sources have very different textures and structures, are captured from distant viewpoints, or capture dynamic scenes with motion. In this thesis, we propose a patch-based synthesis algorithm to plausibly combine different images that have color, texture, structural, and geometric inconsistencies. For applications such as cloning and stitching, where a gradual blend is required, we present a new method for synthesizing a transition region between two source images, such that inconsistent properties change gradually from one source to the other. We call this process image melding. For gradual blending, we extend the patch-based optimization framework with three key generalizations. First, we enrich the patch search space with additional geometric and photometric transformations. Second, we integrate image gradients into the patch representation and replace the usual color averaging with a screened Poisson equation solver. Third, we propose a new energy based on mixed L2/L0 norms for colors and gradients that produces a gradual transition between sources without sacrificing texture sharpness. Together, these three generalizations enable patch-based solutions to a broad class of image melding problems involving inconsistent sources: object cloning, stitching challenging panoramas, hole filling from multiple photos, and image harmonization. We also demonstrate another application that requires us to address inconsistencies across images: high dynamic range (HDR) reconstruction from sequential exposures. In this application, the results suffer from objectionable artifacts for dynamic scenes if the inconsistencies caused by significant scene motion are not handled properly. We propose a new approach to HDR reconstruction that uses information from all exposures while being more robust to motion than previous techniques. Our algorithm is based on a novel patch-based energy-minimization formulation that integrates alignment and reconstruction in a joint optimization through an equation we call the HDR image synthesis equation. This allows us to produce an HDR result that is aligned to one of the exposures yet contains information from all of them. These two applications (image melding and HDR reconstruction) show that patch-based methods like the one proposed in this dissertation can address inconsistent images and could open the door to many new image editing applications.
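    The screened Poisson step used for blending colors and gradients can be sketched as follows, assuming periodic boundaries and an FFT-based solve of (lambda*I - Laplacian) u = lambda*c - div(g). The variable names and the self-consistency test are illustrative, not the dissertation's implementation.

```python
import numpy as np

def screened_poisson(c, gx, gy, lam=0.2):
    """Solve (lam*I - Laplacian) u = lam*c - div(gx, gy) with periodic BCs."""
    h, w = c.shape
    # Forward-difference divergence, adjoint to the backward-difference
    # gradient used below, so div(grad(c)) equals the 5-point Laplacian.
    div = (np.roll(gx, -1, axis=1) - gx) + (np.roll(gy, -1, axis=0) - gy)
    rhs = lam * c - div
    wx = 2.0 * np.pi * np.fft.fftfreq(w)[None, :]
    wy = 2.0 * np.pi * np.fft.fftfreq(h)[:, None]
    # Eigenvalues of (lam*I - Laplacian) for the periodic 5-point stencil.
    denom = lam + 4.0 - 2.0 * np.cos(wx) - 2.0 * np.cos(wy)
    return np.real(np.fft.ifft2(np.fft.fft2(rhs) / denom))

c = np.random.rand(32, 32)
gx = c - np.roll(c, 1, axis=1)   # backward-difference gradients of c
gy = c - np.roll(c, 1, axis=0)
# With gradients taken from c itself, the solver must reproduce c exactly.
u = screened_poisson(c, gx, gy)
assert np.allclose(u, c, atol=1e-8)
```

    In melding, `c` and `(gx, gy)` come from different patch votes; the screening weight `lam` trades off fidelity to the voted colors against fidelity to the voted gradients.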

    The State of the Art in HDR Image Deghosting and an Objective Deghosting Quality Metric for HDR Images

    Despite the emergence of new HDR acquisition methods, the multiple exposure technique (MET) is still the most popular one. Applying MET to dynamic scenes is challenging due to the diversity of motion patterns and uncontrollable factors such as sensor noise and scene occlusion, as well as performance concerns on platforms with limited computational capability. More than 50 deghosting algorithms have already been proposed for artifact-free HDR imaging of dynamic scenes, and this number is expected to grow. Given the large number of algorithms, conducting subjective experiments to benchmark newly proposed algorithms is difficult and time-consuming. In this thesis, first, a taxonomy of HDR deghosting methods and the key characteristics of each group of algorithms are introduced. Next, the potential artifacts frequently observed in the outputs of HDR deghosting algorithms are defined, and an objective HDR image deghosting quality metric is presented. The proposed metric correlates well with human preferences and may be used as a reference for benchmarking current and future HDR image deghosting algorithms.

    Cardiovascular magnetic resonance artefacts

    The multitude of applications offered by CMR makes it an increasingly popular modality for studying the heart and the surrounding vessels. Nevertheless, the anatomical complexity of the chest, together with cardiac and respiratory motion and fast-flowing blood, presents many challenges that can translate into imaging artefacts. The literature is rich in papers describing specific MR artefacts in great technical detail. In this review we attempt to summarise, in language accessible to a clinical readership, some of the most common artefacts found in CMR applications. The review begins with an introduction to the most common pulse sequences and imaging techniques, followed by a brief section on typical cardiovascular applications. This leads to the main section on common CMR artefacts, with examples, a short description of the mechanisms behind them, and possible solutions.

    MAGNETIC RESONANCE ELASTOGRAPHY FOR APPLICATIONS IN RADIATION THERAPY

    Magnetic resonance elastography (MRE) is an imaging technique that combines mechanical waves and magnetic resonance imaging (MRI) to determine the elastic properties of tissue. Because MRE is non-invasive, there is great potential and interest in its use for the detection of cancer. The first part of this thesis concentrates on parameter optimization and imaging quality of an MRE system. To this end, we developed a customized quality assurance phantom and a series of quality control tests to characterize the MRE system. Our results demonstrated that, by optimizing scan parameters such as frequency and amplitude, MRE can provide a good qualitative elastogram for targets with different elasticity values and dimensions. The second part investigated the feasibility of integrating MRE into the radiation therapy (RT) workflow. With the aid of a tissue-equivalent prostate phantom embedded with three dominant intraprostatic lesions (DILs), an MRE-integrated RT framework was developed. This framework comprises a comprehensive scan protocol, including a computed tomography (CT) scan and combined MRI/MRE scans, and a volumetric modulated arc therapy (VMAT) technique for treatment delivery. The results showed that using this comprehensive information could boost the MRE-defined DILs to 84 Gy while keeping the remainder of the prostate at 78 Gy. Using a VMAT-based technique allowed us to achieve a highly conformal plan (the conformity indices for the prostate and the combined DILs were 0.98 and 0.91, respectively). Based on our feasibility study, we concluded that MRE data can be used for targeted radiation dose escalation. In summary, this thesis demonstrates that MRE is feasible for applications in radiation oncology.
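    The physical relation behind MRE elastograms can be illustrated with a back-of-the-envelope calculation. Assuming a purely elastic, isotropic, locally homogeneous medium, the shear modulus follows from the drive frequency and the measured local shear wavelength; the numbers below are illustrative, not from this thesis.

```python
def shear_modulus(freq_hz, wavelength_m, density=1000.0):
    """Shear modulus in Pa: mu = rho * (f * lambda)^2, i.e. rho * (wave speed)^2."""
    return density * (freq_hz * wavelength_m) ** 2

# e.g. a 60 Hz drive with a measured 2.5 cm local shear wavelength:
mu = shear_modulus(60.0, 0.025)
print(round(mu / 1000.0, 2), "kPa")   # 2.25 kPa
```

    Inverting local wavelength estimates pixel by pixel is, in simplified terms, how a wave image is turned into the qualitative elastogram discussed above.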