150,836 research outputs found

    An approach to the synthesis of biological tissue

    Mathematical phantoms, developed to synthesize realistic complex backgrounds such as those obtained when imaging biological tissue, play a key role in the quantitative assessment of image quality for medical and biomedical imaging. We present a modeling framework for the synthesis of realistic tissue samples, demonstrated here on radiological breast tissue. The model employs a two-component image decomposition consisting of a slowly spatially varying mean background and a residual texture image; each component is synthesized independently. The approach and results presented here constitute an important step towards developing methods for the quantitative assessment of image quality in medical and biomedical imaging, and in image science more generally.
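    The two-component decomposition can be illustrated with a brief sketch: a low-pass filter estimates the slowly varying mean background, and the residual texture is what remains; a new sample can then be formed by recombining a background with synthetic texture. The Gaussian filter, the sigma values, and the filtered-noise texture model below are illustrative assumptions, not the authors' actual phantom model.

    ```python
    # Minimal sketch of a two-component (background + texture) decomposition.
    # The Gaussian low-pass and the noise-based texture model are assumptions
    # made for illustration, not the paper's method.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def decompose(image: np.ndarray, sigma: float = 25.0):
        """Split an image into a slowly varying mean background and a residual texture."""
        background = gaussian_filter(image, sigma=sigma)  # low-pass estimate of the mean background
        texture = image - background                      # residual texture component
        return background, texture

    def synthesize(background: np.ndarray, texture_std: float, sigma_tex: float = 2.0):
        """Recombine a (possibly newly generated) background with synthetic texture."""
        noise = np.random.normal(0.0, 1.0, size=background.shape)
        texture = gaussian_filter(noise, sigma=sigma_tex)          # correlated texture from filtered noise
        texture *= texture_std / (texture.std() + 1e-12)           # match a target texture amplitude
        return background + texture
    ```

    Synthesizing the two components independently, as the abstract describes, then amounts to generating a smooth background field and a texture field separately before adding them.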

    High-Resolution Image Synthesis and Semantic Manipulation with Conditional GANs

    We present a new method for synthesizing high-resolution photo-realistic images from semantic label maps using conditional generative adversarial networks (conditional GANs). Conditional GANs have enabled a variety of applications, but the results are often limited to low resolution and are still far from realistic. In this work, we generate 2048x1024 visually appealing results with a novel adversarial loss, as well as new multi-scale generator and discriminator architectures. Furthermore, we extend our framework to interactive visual manipulation with two additional features. First, we incorporate object instance segmentation information, which enables object manipulations such as removing/adding objects and changing the object category. Second, we propose a method to generate diverse results given the same input, allowing users to edit the object appearance interactively. Human opinion studies demonstrate that our method significantly outperforms existing methods, advancing both the quality and the resolution of deep image synthesis and editing. (Comment: v2, CVPR camera-ready; adds more results for the edge-to-photo example.)
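    As a rough illustration of the multi-scale discriminator idea mentioned above, the following PyTorch sketch runs the same PatchGAN-style conditional discriminator on progressively downsampled (label map, image) pairs. Layer widths, kernel sizes, and the number of scales are assumptions for illustration only, not the paper's exact architecture.

    ```python
    # Sketch of a multi-scale conditional discriminator, assuming PyTorch.
    # Channel counts and the number of scales are illustrative choices.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class PatchDiscriminator(nn.Module):
        """Small PatchGAN-style discriminator on concatenated (label map, image) input."""
        def __init__(self, in_channels: int):  # in_channels = label channels + image channels
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(in_channels, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
                nn.Conv2d(128, 1, 4, stride=1, padding=1),  # per-patch real/fake score map
            )

        def forward(self, label_map, image):
            return self.net(torch.cat([label_map, image], dim=1))

    class MultiScaleDiscriminator(nn.Module):
        """Apply the same discriminator design at several image scales."""
        def __init__(self, in_channels: int, num_scales: int = 3):
            super().__init__()
            self.discriminators = nn.ModuleList(
                [PatchDiscriminator(in_channels) for _ in range(num_scales)]
            )

        def forward(self, label_map, image):
            outputs = []
            for d in self.discriminators:
                outputs.append(d(label_map, image))
                # Downsample both inputs before the next, coarser-scale discriminator.
                label_map = F.avg_pool2d(label_map, 3, stride=2, padding=1)
                image = F.avg_pool2d(image, 3, stride=2, padding=1)
            return outputs
    ```

    The coarser scales push the generator toward globally consistent structure, while the finest scale penalizes local artifacts; the adversarial loss is then summed over the per-scale outputs.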

    Photometric image-based rendering for virtual lighting image synthesis

    A concept named Photometric Image-Based Rendering (PIBR) is introduced for seamless augmented reality. PIBR is defined as image-based rendering that covers appearance changes caused by changes in the lighting conditions, while Geometric Image-Based Rendering (GIBR) is defined as image-based rendering that covers appearance changes caused by changes in the viewpoint. PIBR can be applied to image synthesis to keep photometric consistency between virtual objects and real scenes under arbitrary lighting conditions. We analyze conventional IBR algorithms and formalize PIBR within the overall IBR framework. A specific algorithm for realizing PIBR is also presented. Photometric linearization provides a controllable framework for PIBR, which consists of four processes: (1) separation of environmental illumination effects, (2) estimation of lighting directions, (3) separation of specular reflections and cast shadows, and (4) linearization of self-shadows. After the photometric linearization of the input images, we can synthesize realistic images that include not only diffuse reflections but also self-shadows, cast shadows, and specular reflections. Experimental results show that realistic images can be successfully synthesized while keeping photometric consistency.
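    The linearization step rests on the standard Lambertian result that a shadow- and specular-free image of a scene under a distant point light can be written as a linear combination of three basis images. The sketch below shows only that linear-combination core under this assumption; the function names and the plain least-squares fit (without the robust outlier handling that separating specular reflections and shadows requires) are illustrative, not the paper's algorithm.

    ```python
    # Minimal sketch of the linear-combination core behind photometric linearization.
    # Assumes three linearized (diffuse-only) basis images; names are illustrative.
    import numpy as np

    def fit_coefficients(target: np.ndarray, basis: list) -> np.ndarray:
        """Least-squares coefficients expressing `target` as a combination of basis images."""
        A = np.stack([b.ravel() for b in basis], axis=1)        # (num_pixels, 3)
        c, *_ = np.linalg.lstsq(A, target.ravel(), rcond=None)  # a robust fit would first mask
        return c                                                # specular / shadow pixels

    def relight(basis: list, coeffs: np.ndarray) -> np.ndarray:
        """Synthesize a diffuse image under a new (virtual) lighting condition."""
        return sum(c * b for c, b in zip(coeffs, basis))
    ```

    In the full method, the non-diffuse pixels identified during fitting are what get separated out as specular reflections and cast shadows before the final composition.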