    Polynomial Languages with Finite Antidictionaries

    We tackle the problem of studying which kinds of functions can occur as complexity functions of formal languages of a certain type. We prove that an important narrow subclass of rational languages contains languages of polynomial complexity of any integer degree over any non-trivial alphabet. © 2008 EDP Sciences. The author is grateful to O. Karyakina for the idea of the web-like automaton. Special thanks to J. Karhumäki and to the referee for valuable remarks on the paper.
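
    As a minimal, hypothetical illustration of the objects studied here (not the paper's construction), the sketch below computes the complexity function of a language with a finite antidictionary: for each length n it counts the words over the alphabet avoiding every forbidden factor. The function name and encoding are illustrative only.

```python
# Sketch: complexity function of a factorial language given by a finite
# antidictionary (set of forbidden factors). Since the language is factorial,
# a one-letter extension of a word in the language can only introduce a
# forbidden factor as a suffix, so it suffices to check suffixes.

def complexity(alphabet, antidictionary, max_len):
    """Return [C(0), ..., C(max_len)], where C(n) is the number of words of
    length n containing no factor from `antidictionary`."""
    forbidden = set(antidictionary)
    level = [""]                      # all words of the current length
    counts = [len(level)]
    for _ in range(max_len):
        nxt = []
        for w in level:
            for a in alphabet:
                u = w + a
                if not any(u.endswith(f) for f in forbidden):
                    nxt.append(u)
        level = nxt
        counts.append(len(level))
    return counts

# Example: over {a, b} with the single forbidden factor "ba", the language is
# {a^i b^j}, whose complexity C(n) = n + 1 is a degree-1 polynomial.
print(complexity("ab", ["ba"], 6))   # [1, 2, 3, 4, 5, 6, 7]
```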

    On intermediate Factorial Languages

    We prove that factorial languages defined over non-trivial finite alphabets under some natural conditions have intermediate complexity functions, i.e., the number of words in such a language grows faster than any polynomial but slower than any exponential function. © 2008 Elsevier B.V. All rights reserved. The author is grateful to J. Karhumäki for valuable remarks on the paper. The author was supported by the Federal Science and Innovation Agency of Russia under the grants RI-111.0/002/075 and 2227.2003.01, by the Russian Foundation for Basic Research under the grant 05-01-00540, and by the Federal Education Agency of Russia under the grant 49123.

    Growth Rates of Complexity of Power-Free Languages

    We present a new fast algorithm for calculating the growth rate of complexity for regular languages. Using this algorithm we develop a space- and time-efficient method to approximate growth rates of complexity of arbitrary power-free languages over finite alphabets. Through extensive computer-assisted studies we substantially improve all known upper bounds for growth rates of such languages, obtain many new bounds, and discover some general regularities. © 2010 Elsevier B.V.
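
    As a hedged illustration (not the algorithm of the paper), the growth rate of the complexity of a regular language equals the spectral radius of the adjacency matrix of a trim deterministic automaton recognizing it; a plain power iteration, sketched below, approximates it. The automaton encoding and names are assumptions for the example.

```python
# Sketch: approximate the growth rate of a regular language as the spectral
# radius of the adjacency matrix of a trim DFA (power iteration; assumes the
# iteration converges, e.g. when the underlying graph is primitive).

def growth_rate(num_states, edges, iterations=1000):
    """edges: (source, target) pairs, one per DFA transition."""
    adj = [[0] * num_states for _ in range(num_states)]
    for s, t in edges:
        adj[s][t] += 1
    v = [1.0] * num_states
    rate = 0.0
    for _ in range(iterations):
        w = [sum(adj[i][j] * v[j] for j in range(num_states)) for i in range(num_states)]
        rate = max(w) or 1.0          # dominant-eigenvalue estimate
        v = [x / rate for x in w]
    return rate

# Example: binary words avoiding the square "11"; the number of such words of
# length n grows like the Fibonacci numbers, so the growth rate is the golden
# ratio ~ 1.618.
print(growth_rate(2, [(0, 0), (0, 1), (1, 0)]))
```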

    On the Growth Rates of Complexity of Threshold Languages

    Threshold languages, which are the (k/(k-1))+-free languages over k-letter alphabets with k ≥ 5, are the minimal infinite power-free languages according to Dejean's conjecture, which is now proved for all alphabets. We study the growth properties of these languages. On the basis of the obtained structural properties and computer-assisted studies we conjecture that the growth rate of complexity of the threshold language over k letters tends to a constant α̌ ≈ 1.242 as k tends to infinity. © 2010 EDP Sciences. The authors heartily thank the referees for their valuable comments and remarks.

    Comparing Complexity Functions of a Language and Its Extendable Part

    The right (left, two-sided) extendable part of a language consists of all words having infinitely many right (resp. left, two-sided) extensions within the language. We prove that for an arbitrary factorial language each of these parts has the same growth rate of complexity as the language itself. On the other hand, we exhibit a factorial language which grows superpolynomially, while its two-sided extendable part grows only linearly. © 2008 EDP Sciences.
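
    A minimal sketch (assumed DFA encoding, not taken from the paper): in a trim deterministic automaton of a factorial regular language, a word lies in the right-extendable part exactly when arbitrarily long continuations exist from the state it reaches, i.e., when that state can reach a cycle. The helper below finds such "live" states by repeatedly discarding dead ends.

```python
# Sketch: states of a trim DFA from which arbitrarily long paths start.
# A word of the language is right-extendable iff it leads to such a state.

def live_states(num_states, transitions):
    """transitions: dict mapping (state, letter) -> state."""
    alive = set(range(num_states))
    changed = True
    while changed:
        changed = False
        for s in list(alive):
            # discard s if no transition from s stays inside the live set
            if not any(t in alive for (p, _), t in transitions.items() if p == s):
                alive.discard(s)
                changed = True
    return alive

# Example: the factorial language of factors of the words a^n b. A word ending
# in b (state 1) has no right extensions, so state 1 is dead and only the
# words a^i (state 0) form the right-extendable part.
trans = {(0, "a"): 0, (0, "b"): 1}
print(live_states(2, trans))   # {0}
```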

    Presentations of constrained systems with unconstrained positions

    We give a polynomial-time construction of the set of sequences that satisfy a finite-memory constraint defined by a finite list of forbidden blocks, with a specified set of bit positions unconstrained. Such a construction can be used to build modulation/error-correction (ECC) codes like the ones defined by the Immink-Wijngaarden scheme, in which certain bit positions are reserved for ECC parity. We give a linear-time construction of a finite-state presentation of a constrained system defined by a periodic list of forbidden blocks. These systems, called periodic-finite-type systems, were introduced by Moision and Siegel. Finally, we present a linear-time algorithm for constructing the minimal periodic forbidden blocks of a finite sequence for a given period.
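
    As a hedged sketch (the standard De Bruijn-style construction, not the authors' polynomial-time algorithm, and ignoring unconstrained positions), a binary finite-memory constraint given by forbidden blocks of length at most m can be presented by a graph whose states are the allowed blocks of length m-1 and whose edges append one bit without creating a forbidden block. Function names and the encoding are illustrative.

```python
# Sketch: De Bruijn-style presentation of a finite-memory binary constraint
# defined by a finite list of forbidden blocks.
from itertools import product

def presentation(forbidden, m):
    """Return (states, edges); edges are (source, bit, target) triples."""
    bad = set(forbidden)
    def ok(w):
        # w is allowed if it contains no forbidden block
        return not any(w[i:i + len(f)] == f for f in bad for i in range(len(w)))
    states = ["".join(p) for p in product("01", repeat=m - 1) if ok("".join(p))]
    edges = []
    for s in states:
        for b in "01":
            w = s + b
            if ok(w) and w[1:] in states:
                edges.append((s, b, w[1:]))
    return states, edges

# Example: forbid the block "111" (no run of three ones); the presentation has
# the four states "00", "01", "10", "11" and every edge except "11" --1--> "11".
print(presentation(["111"], 3))
```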

    Substring Complexity in Sublinear Space

    Shannon's entropy is a definitive lower bound for statistical compression. Unfortunately, no such clear measure exists for the compressibility of repetitive strings. Thus, ad hoc measures are employed to estimate the repetitiveness of strings, e.g., the size z of the Lempel–Ziv parse or the number r of equal-letter runs of the Burrows–Wheeler transform. A more recent one is the size γ of a smallest string attractor. Let T be a string of length n. A string attractor of T is a set of positions of T capturing the occurrences of all the substrings of T. Unfortunately, Kempa and Prezza [STOC 2018] showed that computing γ is NP-hard. Kociumaka et al. [LATIN 2020] considered a new measure of compressibility that is based on the function S_T(k) counting the number of distinct substrings of length k of T, also known as the substring complexity of T. This new measure is defined as δ = sup{S_T(k)/k : k ≥ 1} and lower bounds all the relevant ad hoc measures previously considered. In particular, δ ≤ γ always holds and δ can be computed in O(n) time using Θ(n) working space. Kociumaka et al. showed that one can construct an O(δ log(n/δ))-sized representation of T supporting efficient direct access and efficient pattern matching queries on T. Given that for highly compressible strings δ is significantly smaller than n, it is natural to pose the following question: Can we compute δ efficiently using sublinear working space? It is straightforward to show that in the comparison model, any algorithm computing δ using O(b) space requires Ω(n^{2-o(1)}/b) time through a reduction from the element distinctness problem [Yao, SIAM J. Comput. 1994]. We thus wanted to investigate whether we can indeed match this lower bound. We address this algorithmic challenge by showing the following bounds to compute δ:
    - O((n^3 log b)/b^2) time using O(b) space, for any b ∈ [1,n], in the comparison model.
    - Õ(n^2/b) time using Õ(b) space, for any b ∈ [√n,n], in the word RAM model.
    This gives an Õ(n^{1+ε})-time and Õ(n^{1-ε})-space algorithm to compute δ, for any 0 < ε ≤ 1/2. Let us remark that our algorithms compute S_T(k), for all k, within the same complexities.
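
    As a simple illustration (a quadratic, in-memory computation, not the sublinear-space algorithms of the paper), the sketch below computes S_T(k) for every k and takes δ as the maximum of S_T(k)/k.

```python
# Sketch: substring complexity S_T(k) for all k and the measure
# delta = max_k S_T(k)/k (the sup is attained for some k <= n, since
# S_T(k) = 0 for k > n).

def substring_complexity(T):
    """Return (delta, [S_T(1), ..., S_T(n)]) for a string T of length n."""
    n = len(T)
    S = [len({T[i:i + k] for i in range(n - k + 1)}) for k in range(1, n + 1)]
    delta = max(S[k - 1] / k for k in range(1, n + 1))
    return delta, S

# Example: for the highly repetitive string "abababab", S_T(k) <= 2 for every k,
# so delta = 2, attained at k = 1.
print(substring_complexity("abababab"))
```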