NeAT: Neural Artistic Tracing for Beautiful Style Transfer
Style transfer is the task of reproducing the semantic contents of a source
image in the artistic style of a second target image. In this paper, we present
NeAT, a new state-of-the-art feed-forward style transfer method. We
re-formulate feed-forward style transfer as image editing, rather than image
generation, resulting in a model which improves over the state-of-the-art in
both preserving the source content and matching the target style. An important
component of our model's success is identifying and fixing "style halos", a
commonly occurring artefact across many style transfer techniques. In addition
to training and testing on standard datasets, we introduce the BBST-4M dataset,
a new, large-scale, high-resolution dataset of 4M images. As part of
curating this data, we present a novel model able to classify whether an image
is stylistic. We use BBST-4M to improve and measure the generalization of NeAT
across a huge variety of styles. Not only does NeAT offer state-of-the-art
quality and generalization, it is also designed and trained for fast inference
at high resolution.
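The abstract does not specify NeAT's architecture, so the following is only a minimal, hypothetical PyTorch sketch of the general "style transfer as image editing" formulation it describes: a network predicts a residual edit over the content image, conditioned on global style statistics, instead of generating pixels from scratch. All module names, shapes, and the modulation scheme are illustrative assumptions, not the authors' design.

```python
import torch
import torch.nn as nn

class EditingStyleTransfer(nn.Module):
    """Hypothetical sketch: predict an edit (residual) applied to the content
    image, conditioned on a style embedding, rather than generating the output
    image directly."""
    def __init__(self, channels: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
        )
        # The style image is summarized into a single global embedding (assumption).
        self.style_pool = nn.AdaptiveAvgPool2d(1)
        self.decoder = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        c = self.encoder(content)
        s = self.style_pool(self.encoder(style))   # global style statistics
        delta = self.decoder(c * (1 + s))           # style-modulated content features
        return torch.clamp(content + delta, 0, 1)   # edit the source, don't regenerate it

model = EditingStyleTransfer()
content = torch.rand(1, 3, 256, 256)
style = torch.rand(1, 3, 256, 256)
print(model(content, style).shape)  # torch.Size([1, 3, 256, 256])
```

The point of the residual formulation is that the source pixels are the default output, so content is preserved unless the style signal explicitly calls for a change.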
DIFF-NST: Diffusion Interleaving For deFormable Neural Style Transfer
Neural Style Transfer (NST) is the field of study applying neural techniques
to modify the artistic appearance of a content image to match the style of a
reference style image. Traditionally, NST methods have focused on texture-based
image edits, affecting mostly low-level information and keeping most image
structures the same. However, style-based deformation of the content is
desirable for some styles, especially in cases where the style is abstract or
the primary concept of the style is in its deformed rendition of some content.
With the recent introduction of diffusion models, such as Stable Diffusion, we
can access far more powerful image generation techniques, enabling new
possibilities. In our work, we propose using this new class of models to
perform deformable style transfer, an elusive capability in previous models.
We show how leveraging the priors of these
models can expose new artistic controls at inference time, and we document our
findings in exploring this new direction for the field of style transfer.
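As a rough illustration of the kind of diffusion-based pipeline this direction builds on (not the DIFF-NST method itself, whose interleaving and inference-time controls are not reproduced here), the sketch below uses the Hugging Face diffusers image-to-image pipeline with Stable Diffusion. The text prompt stands in for style conditioning, which is an assumption for this sketch; DIFF-NST conditions on a reference style image. The strength parameter loosely controls how far the result may deform away from the content image.

```python
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

# Load a pretrained Stable Diffusion image-to-image pipeline.
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

content = Image.open("content.jpg").convert("RGB").resize((512, 512))

# Describe the target style in text (a stand-in for image-based style conditioning).
stylized = pipe(
    prompt="a cubist painting, fractured geometric forms",
    image=content,
    strength=0.6,        # higher values permit stronger, more deformable edits
    guidance_scale=7.5,
).images[0]
stylized.save("stylized.png")
```

Raising strength toward 1.0 lets the model re-imagine structure (deformation), while lower values behave more like a texture-level restyling of the original content.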