In the past decade, exemplar-based texture synthesis algorithms have seen
strong gains in performance by matching statistics of deep convolutional neural
networks. However, these algorithms require regularization terms or user-added
spatial tags to capture long-range constraints in images. A user-added spatial
tag is not available in every situation, and regularization terms can be
difficult to tune. An ideal algorithm would avoid both drawbacks. Thus, we
propose a new set of statistics for exemplar-based texture synthesis based on
Sliced Wasserstein Loss and create a multi-scale algorithm to synthesize
textures without a user-added spatial tag. Lastly, we study the ability of our
proposed algorithm to capture long-range constraints in images and compare our
results to other exemplar-based neural texture synthesis algorithms.

Comment: Submitted to IEEE for possible publication