Region-controlled Style Transfer
Image style transfer is a challenging task in computer vision. Existing
algorithms transfer the color and texture of style images by controlling the
neural network's feature layers. However, they fail to control the strength of
textures in different regions of the content image. To address this issue, we
propose a training method that uses a loss function to constrain the style
intensity in different regions. This method guides the transfer strength of
style features in different regions based on the gradient relationship between
style and content images. Additionally, we introduce a novel feature fusion
method that linearly transforms content features to resemble style features
while preserving their semantic relationships. Extensive experiments have
demonstrated the effectiveness of our proposed approach.
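The "linear transformation of content features to resemble style features" described above can be illustrated with an AdaIN-style re-normalization, a well-known linear feature transform; this is a sketch of the general idea, not the paper's exact fusion method, and the function name and shapes are assumptions:

```python
import numpy as np

def adain_like_fusion(content_feat, style_feat, eps=1e-5):
    """Linearly shift and scale content features (C, H, W) so each channel
    matches the mean and standard deviation of the style features, while
    the spatial (semantic) layout of the content is preserved.
    AdaIN-style illustration, not the paper's exact transform."""
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True)
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True)
    # per-channel affine map: normalize content stats, re-apply style stats
    return (content_feat - c_mean) / (c_std + eps) * s_std + s_mean
```

Because the map is a per-channel affine transform, relative spatial structure within each channel is unchanged, which is what keeps the semantic relationships of the content intact.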
Photorealistic Style Transfer with Screened Poisson Equation
Recent work has shown impressive success in transferring painterly style to
images. These approaches, however, fall short of photorealistic style transfer.
Even when both the input and reference images are photographs, the output still
exhibits distortions reminiscent of a painting. In this paper we propose an
approach that takes as input a stylized image and makes it more photorealistic.
It relies on the Screened Poisson Equation, maintaining the fidelity of the
stylized image while constraining the gradients to those of the original input
image. Our method is fast, simple, fully automatic and shows positive progress
in making a stylized image photorealistic. Our results exhibit finer details
and are less prone to artifacts than the state-of-the-art.
Comment: presented at BMVC 201
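The screened Poisson step described above amounts to solving, per color channel, u - λΔu = s - λΔc, which keeps the output u close to the stylized image s while pulling its gradients toward those of the content image c. A minimal sketch under periodic boundary conditions, solved in the Fourier domain (the weight `lam` is a hypothetical trade-off parameter, not a value from the paper):

```python
import numpy as np

def screened_poisson(stylized, content, lam=5.0):
    """Solve u - lam*Lap(u) = s - lam*Lap(c) per channel via FFT.
    stylized, content: float arrays of shape (H, W, C).
    Larger lam pushes gradients closer to the content image."""
    h, w = stylized.shape[:2]
    ky = np.fft.fftfreq(h)
    kx = np.fft.fftfreq(w)
    # eigenvalues of the negated discrete Laplacian (periodic boundary)
    K = 4 - 2 * np.cos(2 * np.pi * ky)[:, None] - 2 * np.cos(2 * np.pi * kx)[None, :]
    out = np.empty_like(stylized, dtype=np.float64)
    for ch in range(stylized.shape[2]):
        s_hat = np.fft.fft2(stylized[..., ch])
        c_hat = np.fft.fft2(content[..., ch])
        # in Fourier space: (1 + lam*K) * u_hat = s_hat + lam*K * c_hat
        u_hat = (s_hat + lam * K * c_hat) / (1 + lam * K)
        out[..., ch] = np.real(np.fft.ifft2(u_hat))
    return out
```

With `lam=0` the solver returns the stylized image unchanged; as `lam` grows the result increasingly inherits the content image's gradients, which is the mechanism the abstract relies on to suppress painterly distortions.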