Deleting and Testing Forbidden Patterns in Multi-Dimensional Arrays
Understanding the local behaviour of structured multi-dimensional data is a
fundamental problem in various areas of computer science. As the amount of data
is often huge, it is desirable to obtain sublinear time algorithms, and
specifically property testers, to understand local properties of the data.
We focus on the natural local problem of testing pattern freeness: given a
large d-dimensional array A and a fixed d-dimensional pattern P over a
finite alphabet, we say that A is P-free if it does not contain a copy of
the forbidden pattern P as a consecutive subarray. The distance of A to
P-freeness is the fraction of entries of A that need to be modified to make
it P-free. For any epsilon in (0, 1] and any large enough pattern P over
any alphabet, other than a very small set of exceptional patterns, we design a
tolerant tester that distinguishes between the case that the distance is at
least epsilon and the case that it is at most a_d * epsilon, with query
complexity and running time c_d / epsilon, where a_d < 1 and c_d
depend only on d.
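To illustrate the sampling flavour of such testers on a 2-D array (this is only a sketch of a one-sided sampling check, not the paper's tolerant tester, and the helper `contains_pattern` is a name introduced here for illustration):

```python
import random

def contains_pattern(arr, pat, top, left):
    """Check whether pat occurs in arr with its top-left corner at (top, left)."""
    h, w = len(pat), len(pat[0])
    if top + h > len(arr) or left + w > len(arr[0]):
        return False
    return all(arr[top + i][left + j] == pat[i][j]
               for i in range(h) for j in range(w))

def sample_tester(arr, pat, epsilon, trials=None):
    """Accept (return True) if no sampled window contains pat.

    If at least an epsilon fraction of positions start a copy of pat,
    O(1/epsilon) uniform samples find one with constant probability.
    (The paper's distance notion counts modified entries, which is
    related to, but not identical with, the fraction of copies.)
    """
    n, m = len(arr), len(arr[0])
    if trials is None:
        trials = max(1, int(4 / epsilon))
    for _ in range(trials):
        i, j = random.randrange(n), random.randrange(m)
        if contains_pattern(arr, pat, i, j):
            return False  # reject: found a copy of the forbidden pattern
    return True  # accept: no copy seen in the sample
```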
To analyze the testers we establish several combinatorial results, including
the following d-dimensional modification lemma, which might be of independent
interest: for any large enough pattern P over any alphabet (excluding a small
set of exceptional patterns for the binary case), and any array A containing
a copy of P, one can delete this copy by modifying one of its locations
without creating new P-copies in A.
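The statement of this lemma can be checked by brute force in one dimension: find a copy of the pattern and search for a single-entry change that destroys it without creating new copies (an illustrative sketch, not the paper's constructive proof):

```python
def find_copies(arr, pat):
    """All start indices where pat occurs as a consecutive subarray of arr."""
    k = len(pat)
    return [i for i in range(len(arr) - k + 1) if arr[i:i + k] == pat]

def delete_copy(arr, pat, start, alphabet):
    """Try to destroy the copy of pat at `start` by changing one entry,
    without creating any new copy of pat elsewhere. Returns the modified
    array, or None if no single-entry fix exists (an 'exceptional' case)."""
    before = set(find_copies(arr, pat))
    k = len(pat)
    for pos in range(start, start + k):
        for sym in alphabet:
            if sym == arr[pos]:
                continue
            cand = arr[:pos] + [sym] + arr[pos + 1:]
            after = set(find_copies(cand, pat))
            if start not in after and after <= before:
                return cand
    return None
```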
Our results address an open question of Fischer and Newman, who asked whether
there exist efficient testers for properties related to tight substructures in
multi-dimensional structured data. They serve as a first step towards a general
understanding of local properties of multi-dimensional arrays, as any such
property can be characterized by a fixed family of forbidden patterns.
Generative Adversarial Networks via a Composite Annealing of Noise and Diffusion
A generative adversarial network (GAN) is a framework for generating fake data
from a set of real examples. However, GANs are unstable in the training stage.
To stabilize GANs, noise injection has been used to enlarge the overlap of the
real and fake distributions, at the cost of increased variance. Diffusion (or
smoothing) may reduce the intrinsic underlying dimensionality of the data, but
it suppresses the ability of GANs to learn high-frequency information during
training. Based on these observations, we propose a data representation for
GAN training, called noisy scale-space (NSS), that recursively applies
smoothing with a balanced amount of noise to the data, replacing high-frequency
information with random data and leading to a coarse-to-fine training of GANs.
We experiment with NSS using DCGAN and StyleGAN2 on benchmark datasets, where
the NSS-based GANs outperform the state of the art in most cases.
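A minimal sketch of the recursive smooth-and-noise idea on a 1-D signal, assuming a simple [0.25, 0.5, 0.25] low-pass kernel and additive Gaussian noise (the paper's exact smoothing operator and noise balance may differ):

```python
import numpy as np

def noisy_scale_space(x, levels, sigma=0.1, rng=None):
    """Build an NSS-like pyramid: repeatedly smooth the signal and add
    noise, so high frequencies are progressively replaced by random data.
    Illustrative sketch only."""
    rng = np.random.default_rng(0) if rng is None else rng
    kernel = np.array([0.25, 0.5, 0.25])  # simple low-pass filter
    pyramid = [x]
    cur = x
    for _ in range(levels):
        smoothed = np.convolve(cur, kernel, mode="same")   # suppress high freq.
        cur = smoothed + sigma * rng.standard_normal(cur.shape)  # re-inject noise
        pyramid.append(cur)
    return pyramid  # coarse-to-fine training: present the last level first
```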
Time-Space Trade-offs for Triangulating a Simple Polygon
An s-workspace algorithm is an algorithm that has read-only access to the input, write-only access to the output, and uses only O(s) additional words of space. We give a randomized s-workspace algorithm for triangulating a simple polygon P of n vertices, for any s up to n. The algorithm runs in O(n^2/s + n(log s)log^5(n/s)) expected time using O(s) variables. In particular, the algorithm runs in O(n^2/s) expected time for most values of s.
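For contrast with the sublinear-workspace result, the classical ear-clipping baseline uses O(n) extra space and O(n^2) time. A sketch, assuming a simple polygon given in counter-clockwise order with no degenerate collinear ears (this is the textbook baseline, not the paper's algorithm):

```python
def triangulate(poly):
    """Ear-clipping triangulation of a simple polygon given as a CCW list of
    (x, y) vertices. Returns n - 2 triangles as index triples."""
    def cross(o, a, b):
        # z-component of (a - o) x (b - o); positive iff o,a,b turn left
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    def in_triangle(p, a, b, c):
        # p inside (or on the boundary of) CCW triangle abc
        return cross(a, b, p) >= 0 and cross(b, c, p) >= 0 and cross(c, a, p) >= 0
    idx = list(range(len(poly)))
    tris = []
    while len(idx) > 3:
        for k in range(len(idx)):
            i, j, l = idx[k-1], idx[k], idx[(k+1) % len(idx)]
            a, b, c = poly[i], poly[j], poly[l]
            if cross(a, b, c) <= 0:
                continue  # reflex vertex: not an ear
            if any(in_triangle(poly[m], a, b, c)
                   for m in idx if m not in (i, j, l)):
                continue  # another vertex lies inside: not an ear
            tris.append((i, j, l))
            idx.pop(k)  # clip the ear and continue with the smaller polygon
            break
    tris.append(tuple(idx))
    return tris
```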