Maximum entropy methods for texture synthesis: theory and practice
Recent years have seen the rise of convolutional neural network techniques in
exemplar-based image synthesis. These methods often rely on the minimization of
some variational formulation on the image space for which the minimizers are
assumed to be the solutions of the synthesis problem. In this paper we
investigate, both theoretically and experimentally, another framework to deal
with this problem using an alternate sampling/minimization scheme. First, we
use results from information geometry to show that our method yields a
probability measure of maximum entropy under a set of constraints in
expectation. Then, we turn to the analysis of the method and show, using
recent results from the Markov chain literature, that its error can be
explicitly bounded with constants which depend polynomially on the
dimension, even in the non-convex setting. This includes the case where
the constraints are defined via a differentiable neural network.
Finally, we present an extensive experimental study of the model,
including a comparison with state-of-the-art methods and an extension to
style transfer.
Macrocanonical Models for Texture Synthesis
In this article we consider macrocanonical models for texture synthesis. In these models samples are generated given an input texture image and a set of features which should be matched in expectation. It is known that if the images are quantized, macrocanonical models are given by Gibbs measures, using the maximum entropy principle. We study conditions under which this result extends to real-valued images. If these conditions hold, finding a macrocanonical model amounts to minimizing a convex function and sampling from an associated Gibbs measure. We analyze an algorithm which alternates between sampling and minimizing. We present experiments with neural network features and study the drawbacks and advantages of using this sampling scheme.
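In the quantized case the abstract's claim can be made concrete: the macrocanonical model is a Gibbs measure, and fitting it reduces to minimizing a convex dual function. A minimal sketch on a hypothetical four-level "image" with one scalar feature (the state space, feature, and target value are all illustrative assumptions):

```python
import numpy as np

# Toy macrocanonical model on quantized values (illustrative
# assumption: a single pixel x in {0, 1, 2, 3} with scalar feature
# f(x) = x and constraint E[f(x)] = 1.0).  By the maximum entropy
# principle the model is the Gibbs measure p(x) ∝ exp(-theta * f(x)),
# and theta minimizes the convex dual L(theta) = log Z(theta) + theta*c.
states = np.arange(4.0)
target = 1.0

def gibbs(theta):
    w = np.exp(-theta * states)
    return w / w.sum()

theta = 0.0
for _ in range(5000):
    # gradient of the dual: c - E_theta[f(x)]
    grad = target - gibbs(theta) @ states
    theta -= 0.1 * grad

p = gibbs(theta)
print(theta, p @ states)   # fitted parameter; E[f] matches the target
```

Because the state space is finite, the expectation under the Gibbs measure is computed exactly here; the real difficulty addressed in the article is that with real-valued images and neural-network features this expectation must itself be estimated by sampling, which is what motivates the alternating scheme.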