Deleting and Testing Forbidden Patterns in Multi-Dimensional Arrays
Understanding the local behaviour of structured multi-dimensional data is a
fundamental problem in various areas of computer science. As the amount of data
is often huge, it is desirable to obtain sublinear time algorithms, and
specifically property testers, to understand local properties of the data.
We focus on the natural local problem of testing pattern freeness: given a
large d-dimensional array A and a fixed d-dimensional pattern P over a
finite alphabet, we say that A is P-free if it does not contain a copy of
the forbidden pattern P as a consecutive subarray. The distance of A to
P-freeness is the fraction of entries of A that need to be modified to make
it P-free. For any dimension d and any large enough pattern P over any
alphabet, other than a very small set of exceptional patterns, we design a
tolerant tester that distinguishes between the case that the distance is at
least ε and the case that it is at most a constant fraction of ε, with query
complexity and running time depending only on ε and on constants that are
functions of d alone.
To analyze the testers we establish several combinatorial results, including
the following d-dimensional modification lemma, which might be of independent
interest: for any large enough pattern P over any alphabet (excluding a small
set of exceptional patterns for the binary case), and any array A containing
a copy of P, one can delete this copy by modifying one of its locations
without creating new P-copies in A.
Our results address an open question of Fischer and Newman, who asked whether
there exist efficient testers for properties related to tight substructures in
multi-dimensional structured data. They serve as a first step towards a general
understanding of local properties of multi-dimensional arrays, as any such
property can be characterized by a fixed family of forbidden patterns.
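To make the testing task concrete, the following sketch shows a naive one-sided sampling tester for the 2-D case: sample random pattern-sized windows and accept only if no copy of the pattern is seen. This is purely illustrative (function names are ours, and the paper's tolerant tester is far more refined); it conveys what "query complexity independent of the array size" means in practice.

```python
import random

def contains_pattern(window, pattern):
    """Check whether `pattern` occurs as a consecutive subarray of `window` (2-D)."""
    wr, wc = len(window), len(window[0])
    pr, pc = len(pattern), len(pattern[0])
    for i in range(wr - pr + 1):
        for j in range(wc - pc + 1):
            if all(window[i + a][j + b] == pattern[a][b]
                   for a in range(pr) for b in range(pc)):
                return True
    return False

def naive_pattern_tester(array, pattern, samples=200):
    """One-sided sampling tester: sample random pattern-sized windows of `array`
    and accept iff no copy of `pattern` is found. Illustrative sketch only;
    not the tolerant tester of the paper."""
    n, m = len(array), len(array[0])
    pr, pc = len(pattern), len(pattern[0])
    for _ in range(samples):
        i = random.randrange(n - pr + 1)
        j = random.randrange(m - pc + 1)
        window = [row[j:j + pc] for row in array[i:i + pr]]
        if contains_pattern(window, pattern):
            return False  # found a copy: array is not pattern-free
    return True
```

Note that the number of queries depends only on the sample count and pattern size, never on the array dimensions.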
Can fusion coefficients be calculated from the depth rule?
The depth rule is a level truncation of tensor product coefficients expected
to be sufficient for the evaluation of fusion coefficients. We reformulate the
depth rule in a precise way, and show how, in principle, it can be used to
calculate fusion coefficients. However, we argue that the computation of the
depth itself, in terms of which the constraints on tensor product coefficients
are formulated, is problematic. Indeed, the elements of the basis of states
convenient for calculating tensor product coefficients do not have a
well-defined depth! We proceed by showing how one can calculate the depth in an
`approximate' way and derive accurate lower bounds for the minimum level at
which a coupling appears. It turns out that this method yields exact results
in certain cases, and constitutes an efficient and simple algorithm for
computing fusion coefficients.
Comment: 27 pages
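As a point of reference for what such an algorithm must reproduce: for affine su(2) at level k the fusion coefficients are known in closed form as the level-truncated su(2) tensor product rule. A minimal sketch (the function name and the spin convention, with j given as multiples of 1/2, are our own):

```python
def su2_fusion(j1, j2, j3, k):
    """Fusion coefficient N_{j1 j2}^{j3} for affine su(2) at level k.
    Nonzero iff the ordinary su(2) tensor product rule holds
    (|j1 - j2| <= j3 <= j1 + j2, integer total spin) and the
    level-truncation constraint j1 + j2 + j3 <= k is satisfied."""
    tensor_ok = (abs(j1 - j2) <= j3 <= j1 + j2) and (j1 + j2 + j3) % 1 == 0
    level_ok = j1 + j2 + j3 <= k
    return 1 if (tensor_ok and level_ok) else 0
```

For example, the coupling (1/2) x (1/2) -> 1 is forbidden at level 1 but allowed at level 2, which is exactly the kind of minimum-level information the depth rule aims to capture for general algebras.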
Space-efficient detection of unusual words
Detecting all the strings that occur in a text more frequently or less
frequently than expected according to an IID or a Markov model is a basic
problem in string mining, yet current algorithms are based on data structures
that are either space-inefficient or incur large slowdowns, and current
implementations cannot scale to genomes or metagenomes in practice. In this
paper we engineer an algorithm based on the suffix tree of a string to use just
a small data structure built on the Burrows-Wheeler transform, and a stack of
bits whose size depends on n, the length of the string, and on σ, the size of
the alphabet. The stack stays small except for very large values of σ. We
further improve the algorithm by removing its time dependency on σ, by
reporting only a subset of the maximal repeats and of the minimal rare words
of the string, and by detecting and scoring candidate under-represented
strings that do not occur in the string. Our
algorithms are practical and work directly on the BWT, thus they can be
immediately applied to a number of existing datasets that are available in this
form, returning this string mining problem to a manageable scale.
Comment: arXiv admin note: text overlap with arXiv:1502.0637
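The scoring side of the problem can be illustrated with a naive sketch: score how over- or under-represented a single word is in a text under an IID model estimated from the text itself. This is not the paper's algorithm (which enumerates all candidates in compressed space via the BWT); the function name and the z-score normalization are our own choices.

```python
from collections import Counter
from math import sqrt, prod

def iid_zscore(text, word):
    """Score how over/under-represented `word` is in `text` under an IID
    model with character probabilities estimated from `text`. Large positive
    z means over-represented; large negative z means under-represented.
    Naive single-word sketch, not the paper's BWT-based enumeration."""
    n, m = len(text), len(word)
    freq = Counter(text)
    p = {c: freq[c] / n for c in freq}
    # Expected count of `word` among the n - m + 1 windows under IID.
    expected = (n - m + 1) * prod(p.get(c, 0.0) for c in word)
    observed = sum(1 for i in range(n - m + 1) if text[i:i + m] == word)
    if expected == 0:
        return float('inf') if observed > 0 else 0.0
    return (observed - expected) / sqrt(expected)
```

In "abababab", the word "ab" is over-represented (positive score) while "aa", which never occurs despite both letters being frequent, is under-represented (negative score) — the latter is exactly the kind of absent-but-expected candidate mentioned above.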
Nonlinear optics and light localization in periodic photonic lattices
We review the recent developments in the field of photonic lattices
emphasizing their unique properties for controlling linear and nonlinear
propagation of light. We draw some important links between optical lattices and
photonic crystals pointing towards practical applications in optical
communications and computing, beam shaping, and bio-sensing.
Comment: to appear in Journal of Nonlinear Optical Physics & Materials (JNOPM)
Generating constrained random graphs using multiple edge switches
The generation of random graphs using edge swaps provides a reliable method
to draw uniformly random samples of sets of graphs respecting some simple
constraints, e.g. degree distributions. However, in general, it is not
necessarily possible to access all graphs obeying some given constraints
through a classical switching procedure calling on pairs of edges. We therefore
propose to get round this issue by generalizing this classical approach through
the use of higher-order edge switches. This method, which we denote by "k-edge
switching", makes it possible to progressively improve the covered portion of
a set of constrained graphs, thereby providing an increasing, asymptotically
certain confidence on the statistical representativeness of the obtained
sample.
Comment: 15 pages
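The classical pairwise switch that the k-edge generalization builds on can be sketched as follows: repeatedly pick two edges (a,b) and (c,d) and rewire them to (a,d) and (c,b), rejecting any swap that would create a self-loop or a parallel edge. Degrees are preserved by construction. This sketch (names ours) implements only the classical 2-edge case, not the higher-order switches proposed in the paper.

```python
import random

def double_edge_swap(edges, attempts=1000, rng=random):
    """Classical 2-edge switch on an undirected simple graph given as a list
    of edges: pick two edges (a,b),(c,d) and rewire to (a,d),(c,b), rejecting
    swaps that would create self-loops or parallel edges. The degree sequence
    is preserved. (k-edge switching rewires k edges at once.)"""
    edge_set = {frozenset(e) for e in edges}
    edges = [tuple(e) for e in edges]
    for _ in range(attempts):
        i, j = rng.randrange(len(edges)), rng.randrange(len(edges))
        if i == j:
            continue
        a, b = edges[i]
        c, d = edges[j]
        if len({a, b, c, d}) < 4:
            continue  # shared vertex: swap could create a self-loop
        new1, new2 = frozenset((a, d)), frozenset((c, b))
        if new1 in edge_set or new2 in edge_set:
            continue  # would create a parallel edge
        edge_set -= {frozenset((a, b)), frozenset((c, d))}
        edge_set |= {new1, new2}
        edges[i], edges[j] = (a, d), (c, b)
    return edges
```

Because some constrained graph sets are not connected under this pairwise move, chains of such swaps can miss whole regions of the sample space — which is the gap that higher-order (k-edge) switches are meant to close.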