TET-GAN: Text Effects Transfer via Stylization and Destylization
Text effects transfer technology automatically makes text dramatically more
impressive. However, previous style transfer methods either learn a model for
general styles, which cannot handle the highly structured text effects that
follow the glyph, or require manually designed, subtle matching criteria for
text effects. In this paper, we exploit the powerful representation abilities
of deep neural features for text effects transfer. For this purpose,
we propose a novel Texture Effects Transfer GAN (TET-GAN), which consists of a
stylization subnetwork and a destylization subnetwork. The key idea is to train
our network to accomplish both the objective of style transfer and style
removal, so that it can learn to disentangle and recombine the content and
style features of text effects images. To support the training of our network,
we propose a new text effects dataset with as many as 64 professionally
designed styles on 837 characters. We show that the disentangled feature
representations enable us to transfer or remove all these styles on arbitrary
glyphs using one network. Furthermore, the flexible network design empowers
TET-GAN to efficiently extend to a new text style via one-shot learning where
only one example is required. We demonstrate the superiority of the proposed
method in generating high-quality stylized text over the state-of-the-art
methods.

Comment: Accepted by AAAI 2019. Code and dataset will be available at
http://www.icst.pku.edu.cn/struct/Projects/TETGAN.htm
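The disentangle-and-recombine idea behind the dual stylization/destylization objective can be illustrated with toy linear operators. Everything below is an illustrative assumption (an additive style residual and explicit style input), not the paper's actual convolutional subnetworks:

```python
import numpy as np

# Toy sketch of "disentangle and recombine": style is modeled as an
# additive residual on top of glyph content. The real TET-GAN learns
# this separation with convolutional subnetworks and adversarial losses,
# and its destylization does not receive the style explicitly.

rng = np.random.default_rng(0)

def stylize(content, style):
    """Stand-in for the stylization subnetwork: recombine content and style."""
    return content + style

def destylize(stylized, style):
    """Stand-in for the destylization subnetwork: remove style, recover content."""
    return stylized - style

glyph = rng.random((8, 8))        # plain glyph image (content)
flame_style = rng.random((8, 8))  # a "text effect" texture (style)

styled = stylize(glyph, flame_style)
recovered = destylize(styled, flame_style)

# Training both objectives jointly forces shared features to separate
# content from style, so either part can be swapped independently.
assert np.allclose(recovered, glyph)
```

Training one network on both directions is what lets a single model transfer any of the 64 styles onto arbitrary glyphs, or strip a style back to the plain glyph.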
Generating Handwritten Chinese Characters using CycleGAN
Handwriting of Chinese has long been an important skill in East Asia.
However, automatic generation of handwritten Chinese characters poses a great
challenge due to the large number of characters. Various machine learning
techniques have been used to recognize Chinese characters, but few works have
studied the handwritten Chinese character generation problem, especially with
unpaired training data. In this work, we formulate the Chinese handwritten
character generation as a problem that learns a mapping from an existing
printed font to a personalized handwritten style. We further propose DenseNet
CycleGAN to generate Chinese handwritten characters. Our method is applied not
only to commonly used Chinese characters but also to calligraphy work with
aesthetic values. Furthermore, we propose content accuracy and style
discrepancy as the evaluation metrics to assess the quality of the handwritten
characters generated. We then use our proposed metrics to evaluate the
generated characters from the CASIA dataset as well as our newly introduced
Lanting calligraphy dataset.

Comment: Accepted at WACV 201
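The paper defines its evaluation metrics via trained networks; as a hedged illustration, style discrepancy can be thought of as a distance between style-feature statistics. The Gram-matrix statistic and the function name below are assumptions for illustration only:

```python
import numpy as np

def style_discrepancy(feats_a, feats_b):
    """Illustrative style-discrepancy stand-in: mean squared distance
    between Gram matrices of feature maps (a common style statistic;
    the paper's exact metric may differ)."""
    def gram(f):  # f: (channels, pixels)
        return f @ f.T / f.shape[1]
    ga, gb = gram(feats_a), gram(feats_b)
    return float(np.mean((ga - gb) ** 2))

rng = np.random.default_rng(1)
real = rng.random((16, 64))   # features of a real handwritten character
generated = real.copy()       # a perfect generation for demonstration
print(style_discrepancy(real, generated))  # 0.0 for identical features
```

Content accuracy would analogously be the accuracy of a pretrained character classifier on the generated glyphs: high accuracy means the character identity survived the style transfer.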
Few-shot Font Generation with Localized Style Representations and Factorization
Automatic few-shot font generation is a practical and widely studied problem
because manual designs are expensive and sensitive to the expertise of
designers. Existing few-shot font generation methods aim to learn to
disentangle the style and content element from a few reference glyphs, and
mainly focus on a universal style representation for each font style. However,
such an approach limits the model's ability to represent diverse local styles,
and thus makes it unsuitable for the most complex writing systems, e.g.,
Chinese, whose characters consist of a varying number of components (often
called "radicals") with highly complex structures. In this paper, we propose a novel font
generation method by learning localized styles, namely component-wise style
representations, instead of universal styles. The proposed style
representations enable us to synthesize complex local details in text designs.
However, learning component-wise styles solely from reference glyphs is
infeasible in the few-shot font generation scenario, when a target script has a
large number of components, e.g., over 200 for Chinese. To reduce the number of
reference glyphs, we simplify component-wise styles by a product of component
factor and style factor, inspired by low-rank matrix factorization. Thanks to
the combination of strong representation and a compact factorization strategy,
our method shows remarkably better few-shot font generation results (with only
8 reference glyph images) than other state-of-the-art methods, without
utilizing strong locality supervision, e.g., location of each component, skeleton, or
strokes. The source code is available at https://github.com/clovaai/lffont.

Comment: Accepted at AAAI 2021, 12 pages, 11 figures, the first two authors
contributed equally
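The factorization step, expressing component-wise styles as a product of a component factor and a style factor, can be sketched with plain matrices. The shapes and rank below are illustrative choices, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(42)

n_components, style_dim, rank = 200, 32, 6  # illustrative sizes

# Shared component factor (learned once, reused across all fonts) ...
component_factor = rng.random((n_components, rank))
# ... times a small per-font style factor yields every component-wise style.
style_factor = rng.random((rank, style_dim))

component_styles = component_factor @ style_factor  # shape (200, 32)

# Per-font unknowns shrink from n_components*style_dim values to
# rank*style_dim, which is why a handful of reference glyphs can suffice.
full_params = n_components * style_dim
factored_params = rank * style_dim
print(full_params, factored_params)  # prints: 6400 192
```

This is the low-rank intuition: a few reference glyphs cannot pin down 200 independent component styles, but they can pin down a rank-6 combination of shared factors.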