
    Non-Stationary Texture Synthesis by Adversarial Expansion

    The real world exhibits an abundance of non-stationary textures. Examples include textures with large-scale structures, as well as spatially variant and inhomogeneous textures. While existing example-based texture synthesis methods can cope well with stationary textures, non-stationary textures still pose a considerable challenge, which remains unresolved. In this paper, we propose a new approach for example-based non-stationary texture synthesis. Our approach uses a generative adversarial network (GAN), trained to double the spatial extent of texture blocks extracted from a specific texture exemplar. Once trained, the fully convolutional generator is able to expand the size of the entire exemplar, as well as of any of its sub-blocks. We demonstrate that this conceptually simple approach is highly effective for capturing large-scale structures, as well as other non-stationary attributes of the input exemplar. As a result, it can cope with challenging textures, which, to our knowledge, no other existing method can handle.
    Comment: Accepted to SIGGRAPH 2018
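    The doubling objective described in the abstract implies a simple training-data sampling step: each target is a square region of the exemplar and each source is its central sub-block at half the spatial extent. A minimal sketch of that sampling (a hypothetical illustration in numpy, not the authors' code; the function name and block size are assumptions):

    ```python
    import numpy as np

    def sample_expansion_pair(exemplar, block=32, rng=None):
        """Crop a (source, target) training pair for the 2x-expansion
        objective. The target is a (2*block, 2*block) region of the
        exemplar; the source is its central (block, block) sub-region,
        so a generator trained on such pairs learns to double the
        spatial extent of a texture block."""
        if rng is None:
            rng = np.random.default_rng()
        h, w = exemplar.shape[:2]
        t = 2 * block
        # Random top-left corner of the target region.
        y = rng.integers(0, h - t + 1)
        x = rng.integers(0, w - t + 1)
        target = exemplar[y:y + t, x:x + t]
        # The source is the center crop of the target.
        off = block // 2
        source = target[off:off + block, off:off + block]
        return source, target
    ```

    At test time, because the generator is fully convolutional, the same expansion can be applied to the whole exemplar or any sub-block, not just blocks of the training size.
    
    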

    SEAN: Image Synthesis with Semantic Region-Adaptive Normalization

    We propose semantic region-adaptive normalization (SEAN), a simple but effective building block for Generative Adversarial Networks conditioned on segmentation masks that describe the semantic regions in the desired output image. Using SEAN normalization, we can build a network architecture that can control the style of each semantic region individually, e.g., we can specify one style reference image per region. SEAN is better suited to encode, transfer, and synthesize style than the best previous method in terms of reconstruction quality, variability, and visual quality. We evaluate SEAN on multiple datasets and report better quantitative metrics (e.g., FID, PSNR) than the current state of the art. SEAN also pushes the frontier of interactive image editing. We can interactively edit images by changing segmentation masks or the style for any given region. We can also interpolate styles from two reference images per region.
    Comment: Accepted as a CVPR 2020 oral paper. The interactive demo is available at https://youtu.be/0Vbj9xFgoU
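    The core of region-adaptive normalization, as the abstract describes it, is to normalize activations and then modulate them with a per-region scale and shift broadcast through the segmentation mask. A minimal numpy sketch of that step (an assumption-laden illustration, not the authors' implementation; in SEAN the per-region gamma/beta would be predicted from per-region style codes, which is omitted here):

    ```python
    import numpy as np

    def sean_normalize(x, mask, gamma, beta, eps=1e-5):
        """Region-adaptive normalization sketch.
        x:     (C, H, W) feature activations
        mask:  (H, W) integer region labels
        gamma: (num_regions, C) per-region scales
        beta:  (num_regions, C) per-region shifts
        Activations are standardized per channel, then each pixel is
        modulated with the scale/shift of the region it belongs to."""
        mu = x.mean(axis=(1, 2), keepdims=True)
        sigma = x.std(axis=(1, 2), keepdims=True)
        x_hat = (x - mu) / (sigma + eps)
        # Scatter each region's (gamma, beta) over the pixels it covers.
        g = gamma[mask].transpose(2, 0, 1)  # (H, W, C) -> (C, H, W)
        b = beta[mask].transpose(2, 0, 1)
        return g * x_hat + b
    ```

    Because the modulation is indexed by the mask, changing the style code of one region changes only that region's output, which is what enables the per-region editing the abstract describes.
    
    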

    Texture Mixer: A Network for Controllable Synthesis and Interpolation of Texture

    This paper addresses the problem of interpolating visual textures. We formulate this problem by requiring (1) by-example controllability and (2) realistic and smooth interpolation among an arbitrary number of texture samples. To solve it we propose a neural network trained simultaneously on a reconstruction task and a generation task, which can project texture examples onto a latent space where they can be linearly interpolated and projected back onto the image domain, thus ensuring both intuitive control and realistic results. We show that our method outperforms a number of baselines according to a comprehensive suite of metrics as well as a user study. We further show several applications based on our technique, which include texture brush, texture dissolve, and animal hybridization.
    Comment: Accepted to CVPR'19
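    The interpolation step the abstract describes, linear blending in a learned latent space, reduces to a convex combination of latent codes, one per texture sample, which is then decoded back to the image domain. A minimal sketch of just that blending step (the encoder/decoder networks are omitted; function and argument names are hypothetical):

    ```python
    import numpy as np

    def interpolate_latents(codes, weights):
        """Blend an arbitrary number of latent texture codes.
        codes:   (n_samples, latent_dim) latent codes, e.g. from an encoder
        weights: (n_samples,) non-negative blending weights
        Returns the convex combination of the codes, which would then be
        decoded back to a texture image."""
        codes = np.asarray(codes, dtype=float)
        weights = np.asarray(weights, dtype=float)
        weights = weights / weights.sum()  # normalize to a convex combination
        return np.tensordot(weights, codes, axes=1)
    ```

    Spatially varying weights over such codes would give effects like the texture brush and texture dissolve applications mentioned above.
    
    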

    Texture Synthesis for Surface Inspection

    Automated visual surface inspection planning is an important part of quality assurance in automated custom product manufacturing. Visual surface inspection planning tackles image acquisition design and defect detection. Both tasks greatly benefit from realistic, automated image synthesis of the inspected object. The realism of synthesized images depends heavily on the object's material, whose appearance is largely determined by texture. In this work, we focus on parametric texture synthesis and its application to visual surface inspection planning. We start by analyzing the texture present on physical samples and introduce the requirements for texture synthesis models in visual surface inspection. Based on observation and surface characterization standards, we present a model capable of reproducing the texture of physical samples. This approach is then generalized, and further models are presented with respect to these requirements. Finally, we highlight the importance of surface texture from the visual inspection planning perspective.
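    A parametric surface-texture model of the kind described above can be illustrated with a toy example: a random heightfield whose parameters follow surface characterization conventions, here a target RMS roughness (Rq) and a correlation length. This is a sketch for illustration only, not the model from the paper; the parameterization and filtering are assumptions:

    ```python
    import numpy as np

    def synthesize_rough_surface(shape, rms, corr_len, rng=None):
        """Toy parametric surface texture: correlated Gaussian heightfield.
        shape:    (H, W) of the synthesized heightfield
        rms:      target RMS roughness (Rq) of the surface heights
        corr_len: approximate lateral correlation length, in pixels
        White noise is smoothed with a box filter of width ~corr_len along
        both axes, then recentered and rescaled to the target roughness."""
        if rng is None:
            rng = np.random.default_rng(0)
        z = rng.standard_normal(shape)
        k = max(1, int(corr_len))
        kernel = np.ones(k) / k
        for axis in range(2):
            z = np.apply_along_axis(
                lambda m: np.convolve(m, kernel, mode="same"), axis, z)
        z -= z.mean()          # zero-mean heights
        z *= rms / z.std()     # enforce the target Rq exactly
        return z
    ```

    Rendering such a heightfield under candidate camera and lighting configurations is what would make it useful for image acquisition design in inspection planning.
    
    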