
    Truly Subquadratic-Time Extension Queries and Periodicity Detection in Strings with Uncertainties

    Strings with don't-care symbols, also called partial words, and more general indeterminate strings are a natural representation of strings containing uncertain symbols. A considerable effort has been made to obtain efficient algorithms for pattern matching and periodicity detection in such strings. Among those, a number of algorithms have been proposed that behave well on random data, but whose worst-case running time is still Theta(n^2). We present the first truly subquadratic-time solutions for a number of such problems on partial words, which can also be adapted to indeterminate strings over a constant-sized alphabet. We show that n longest common compatible prefix queries (which correspond to longest common extension queries in regular strings) can be answered on-line in O(n * sqrt(n * log(n))) time after O(n * sqrt(n * log(n)))-time preprocessing. We also present O(n * sqrt(n * log(n)))-time algorithms for computing the prefix array and two types of border array of a partial word.
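
    For intuition, here is a minimal Python sketch of the naive baseline (not the paper's subquadratic construction): two positions of a partial word are compatible when their symbols agree or either one is the don't-care symbol, and a longest-common-compatible-prefix query simply scans forward until compatibility fails, which is what yields the Theta(n^2) worst case over n queries.

    ```python
    # Naive longest-common-compatible-prefix (lccp) queries on a partial word,
    # with '*' as the don't-care symbol. This is the O(n)-per-query baseline,
    # not the paper's O(n * sqrt(n * log(n))) data structure.

    DONT_CARE = "*"

    def compatible(a: str, b: str) -> bool:
        """Two symbols are compatible if they agree or either is a don't care."""
        return a == b or a == DONT_CARE or b == DONT_CARE

    def lccp(word: str, i: int, j: int) -> int:
        """Length of the longest compatible prefix of word[i:] and word[j:]."""
        k = 0
        while i + k < len(word) and j + k < len(word) and compatible(word[i + k], word[j + k]):
            k += 1
        return k

    if __name__ == "__main__":
        w = "ab*ba*b"
        print(lccp(w, 0, 4))  # positions 0..2 are pairwise compatible with 4..6 -> 3
    ```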

    Deciding Equivalence of Linear Tree-to-Word Transducers in Polynomial Time

    We show that the equivalence of deterministic linear top-down tree-to-word transducers is decidable in polynomial time. Linear tree-to-word transducers are non-copying but not necessarily order-preserving, and can be used to express XML and other document transformations. The result is based on a partial normal form that provides a basic characterization of the languages produced by linear tree-to-word transducers.
    Comment: a short version of this paper will be published in the proceedings of the 20th Conference on Developments in Language Theory (DLT 2016), Montreal, Canada.
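
    To make the object concrete, here is an illustrative sketch of a deterministic linear tree-to-word transducer: each rule reads the root symbol of a subtree, emits output words, and visits every child exactly once (non-copying), possibly out of order. The rule set and input tree below are invented for this example and are not taken from the paper.

    ```python
    # Rules: state -> { input symbol -> template }, where a template mixes output
    # strings with (state, child_index) calls and uses each child exactly once
    # (linear, non-copying); children may be visited out of their original order.
    RULES = {
        "q0": {
            "f": ["<f>", ("q0", 1), ("q0", 0), "</f>"],  # emits the children in swapped order
            "a": ["a"],
            "b": ["b"],
        },
    }

    def run(state, tree):
        """Apply the transducer from `state` to `tree` = (symbol, [children])."""
        symbol, children = tree
        out = []
        for item in RULES[state][symbol]:
            if isinstance(item, str):
                out.append(item)                      # output word
            else:
                next_state, child = item
                out.append(run(next_state, children[child]))
        return "".join(out)

    if __name__ == "__main__":
        t = ("f", [("a", []), ("b", [])])
        print(run("q0", t))  # '<f>ba</f>' -- children serialized in reversed order
    ```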

    Adaptive Investment Strategies For Periodic Environments

    In this paper, we present an adaptive investment strategy for environments with periodic returns on investment. We consider an investment model in which the agent decides at every time step the proportion of wealth to invest in a risky asset, keeping the rest of the budget in a risk-free asset. Every investment is evaluated in the market via a stylized return-on-investment function (RoI), modeled by a stochastic process with unknown periodicities and levels of noise. For comparison, we present two reference strategies representing agents with zero knowledge and with complete knowledge of the dynamics of the returns. We also consider an investment strategy based on technical analysis, which forecasts the next return by fitting a trend line to previously received returns. To assess the performance of the different strategies, we perform computer experiments that measure the average budget each strategy obtains over a given number of time steps. To ensure fair comparisons, we first tune the parameters of each strategy and then compare their performance for RoIs with different periodicities and levels of noise.
    Comment: Paper submitted to Advances in Complex Systems (November 2007), 22 pages, 9 figures.
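
    A minimal sketch of the setting as described: at each time step the agent commits a fraction q of its budget to a risky asset whose return follows a noisy periodic RoI, while the rest stays in the risk-free asset. Keeping q fixed corresponds to the zero-knowledge reference strategy. The sinusoidal RoI and all parameter values below are illustrative assumptions, not the paper's exact specification.

    ```python
    import math
    import random

    def periodic_roi(t: int, period: int = 12, amplitude: float = 0.05, noise: float = 0.01) -> float:
        """Stylized return on investment: a sine wave of the given period plus
        Gaussian noise (an assumed shape; the paper's RoI process may differ)."""
        return amplitude * math.sin(2 * math.pi * t / period) + random.gauss(0.0, noise)

    def simulate(q: float, steps: int = 200, budget: float = 1.0) -> float:
        """Zero-knowledge reference strategy: invest a constant fraction q of the
        budget in the risky asset each step; the rest earns a risk-free rate of 0."""
        for t in range(steps):
            budget += q * budget * periodic_roi(t)
        return budget

    if __name__ == "__main__":
        random.seed(0)
        for q in (0.0, 0.5, 1.0):
            print(f"q={q:.1f}  final budget ~ {simulate(q):.3f}")
    ```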

    Linear Compressed Pattern Matching for Polynomial Rewriting (Extended Abstract)

    This paper is an extended abstract of an analysis of term rewriting in which the terms in the rewrite rules, as well as the term to be rewritten, are compressed by a singleton tree grammar (STG). This form of compression is more general than node sharing or representing terms as dags, since partial trees (contexts) can also be shared. In the first part, efficient but complex algorithms for detecting applicability of a rewrite rule under STG-compression are constructed and analyzed. The second part applies these results to term rewriting sequences. The main result for submatching is that finding a redex of a left-linear rule can be performed in polynomial time under STG-compression. The main implications for rewriting and (single-position or parallel) rewriting steps are: (i) under STG-compression, n rewriting steps can be performed in nondeterministic polynomial time; (ii) under STG-compression and for left-linear rewrite rules, a sequence of n rewriting steps can be performed in polynomial time; and (iii) for compressed rewrite rules whose left-hand sides are either dag-compressed or ground and STG-compressed, and an STG-compressed target term, n rewriting steps can be performed in polynomial time.
    Comment: In Proceedings TERMGRAPH 2013, arXiv:1302.599
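
    As a rough illustration of the compression format only (not of the paper's matching or rewriting algorithms), the sketch below models a singleton tree grammar: every nonterminal has exactly one production, and context nonterminals (trees with a single hole) can be shared, which is what makes STGs more succinct than plain dag sharing. The representation and the names used are assumptions made for this example.

    ```python
    # A tiny STG-like structure: '_' marks the unique hole of a context.
    productions = {
        "C": ("f", ["a", "_"]),    # context nonterminal: C[x] = f(a, x)
        "A": ("plug", "C", "b"),   # A = C[b] = f(a, b)
        "T": ("plug", "C", "A"),   # T = C[A] = f(a, f(a, b)); the context C is shared
    }

    def expand(symbol):
        """Decompress a nonterminal (or a terminal symbol) into an explicit term."""
        if symbol not in productions:
            return symbol                           # terminal leaf such as 'a', 'b' or '_'
        rule = productions[symbol]
        if rule[0] == "plug":                       # plug an argument into a context
            _, ctx, arg = rule
            return substitute(expand(ctx), expand(arg))
        fun, args = rule                            # plain node production
        return (fun, [expand(a) for a in args])

    def substitute(tree, arg):
        """Replace the unique hole '_' of a context by `arg`."""
        if tree == "_":
            return arg
        if isinstance(tree, tuple):
            fun, args = tree
            return (fun, [substitute(a, arg) for a in args])
        return tree

    if __name__ == "__main__":
        print(expand("T"))  # ('f', ['a', ('f', ['a', 'b'])])
    ```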

    Linear recurrence sequences and periodicity of multidimensional continued fractions

    Multidimensional continued fractions generalize classical continued fractions with the aim of providing periodic representations of algebraic irrationalities by means of integer sequences. However, no algorithm is known that provides a periodic multidimensional continued fraction when algebraic irrationalities are given as input. In this paper, we provide a characterization of the periodicity of the Jacobi-Perron algorithm by means of linear recurrence sequences. In particular, we prove that the partial quotients of a multidimensional continued fraction are periodic if and only if the numerators and denominators of the convergents are linear recurrence sequences, generalizing similar results that hold for classical continued fractions.
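
    For concreteness, the sketch below iterates one common formulation of the two-dimensional Jacobi-Perron map and collects the pairs of partial quotients. Conventions differ between authors, so this particular formulation is an assumption, and the rational inputs serve only to show the mechanics in exact arithmetic; periodicity questions concern algebraic irrational inputs.

    ```python
    from fractions import Fraction
    from math import floor

    def jacobi_perron(alpha1, alpha2, steps=8):
        """Two-dimensional Jacobi-Perron iteration (one common formulation, assumed
        here): take the integer parts (a1, a2) as the partial quotient pair, then
        map (alpha1, alpha2) -> ((alpha2 - a2) / (alpha1 - a1), 1 / (alpha1 - a1))."""
        quotients = []
        for _ in range(steps):
            a1, a2 = floor(alpha1), floor(alpha2)
            quotients.append((a1, a2))
            frac = alpha1 - a1
            if frac == 0:            # first component became an integer: stop
                break
            alpha1, alpha2 = (alpha2 - a2) / frac, 1 / frac
        return quotients

    if __name__ == "__main__":
        print(jacobi_perron(Fraction(22, 7), Fraction(17, 5)))  # [(3, 3), (2, 7), (0, 1)]
    ```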

    Multi-dimensional Boltzmann Sampling of Languages

    This paper addresses the uniform random generation of words from a context-free language (over an alphabet of size k), while constraining every letter to a targeted frequency of occurrence. Our approach is a multidimensional extension of Boltzmann samplers [Duchon2004]. We show that, under mostly strong-connectivity hypotheses, our samplers return a word of size in [(1-epsilon)n, (1+epsilon)n] with exact letter frequencies in O(n^{1+k/2}) expected time. Moreover, if we accept tolerance intervals of width Omega(sqrt(n)) for the number of occurrences of each letter, our samplers perform approximate-size generation of words in O(n) expected time. We illustrate these techniques on the generation of Tetris tessellations with uniform statistics over the different types of tetrominoes.
    Comment: 12 pages
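
    The sketch below illustrates only the rejection mechanics, on a toy language (all words over {a, b}) rather than a general context-free grammar: letters carry Boltzmann weights tuned toward a target size and letter frequency, and samples are rejected until both land in their tolerance windows. The tuning formulas are specific to this toy model and are not the paper's general sampler.

    ```python
    import random

    def boltzmann_word(x_a: float, x_b: float) -> str:
        """Boltzmann sampler for the toy language {a,b}*: a word w is drawn with
        probability proportional to x_a**count_a(w) * x_b**count_b(w).
        Requires x_a + x_b < 1."""
        total = x_a + x_b
        out = []
        while random.random() < total:                # continue with probability x_a + x_b
            out.append("a" if random.random() < x_a / total else "b")
        return "".join(out)

    def sample(n: int, freq_a: float, eps: float = 0.1, tol: float = 0.1) -> str:
        """Rejection loop: tune the weights toward expected size n and letter
        frequency freq_a, then resample until the size lies in [(1-eps)n, (1+eps)n]
        and the observed 'a' frequency is within tol of the target."""
        t = n / (n + 1.0)                             # expected size t / (1 - t) equals n
        x_a, x_b = freq_a * t, (1.0 - freq_a) * t
        while True:
            w = boltzmann_word(x_a, x_b)
            if not ((1 - eps) * n <= len(w) <= (1 + eps) * n):
                continue
            if abs(w.count("a") / len(w) - freq_a) <= tol:
                return w

    if __name__ == "__main__":
        random.seed(1)
        w = sample(200, freq_a=0.3)
        print(len(w), round(w.count("a") / len(w), 3))
    ```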