421 research outputs found

    Subword balance, position indices and power sums

    In this paper, we investigate various ways of characterizing words, mainly over a binary alphabet, using information about the positions of occurrences of letters in words. We introduce two new measures associated with words, the position index and sum of position indices. We establish some characterizations, connections with Parikh matrices, and connections with power sums. One particular emphasis concerns the effect of morphisms and iterated morphisms on words.
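
    The abstract does not spell out how the position index or the sum of position indices is defined, so the sketch below only illustrates the general idea of position-based measures: collect the occurrence positions of each letter of a binary word and form power sums of those positions. The function names and the 1-based indexing are illustrative assumptions, not the paper's definitions.

    ```python
    from collections import defaultdict

    def occurrence_positions(word):
        """Map each letter to the (1-based) positions at which it occurs."""
        positions = defaultdict(list)
        for i, letter in enumerate(word, start=1):
            positions[letter].append(i)
        return dict(positions)

    def power_sum(positions, k):
        """Sum of k-th powers of a list of positions."""
        return sum(p ** k for p in positions)

    w = "abbaba"
    pos = occurrence_positions(w)                         # {'a': [1, 4, 6], 'b': [2, 3, 5]}
    print({c: power_sum(p, 1) for c, p in pos.items()})   # {'a': 11, 'b': 10}
    print({c: power_sum(p, 2) for c, p in pos.items()})   # {'a': 53, 'b': 38}
    ```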

    Relations on words

    In the first part of this survey, we present classical notions arising in combinatorics on words: growth function of a language, complexity function of an infinite word, pattern avoidance, periodicity and uniform recurrence. Our presentation tries to set up a unified framework with respect to a given binary relation. In the second part, we mainly focus on abelian equivalence, $k$-abelian equivalence, combinatorial coefficients and associated relations, Parikh matrices and $M$-equivalence. In particular, some new refinements of abelian equivalence are introduced.
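
    As a small aside, the two relations most central to the second part are easy to state in code: two words are abelian equivalent when each letter occurs equally often in both, and $k$-abelian equivalent when every factor of length at most $k$ does. The sketch below uses these standard definitions; the helper names are mine, and none of the survey's finer refinements are reproduced.

    ```python
    from collections import Counter

    def factor_counts(word, k):
        """Count occurrences of every factor (contiguous block) of length <= k."""
        counts = Counter()
        for length in range(1, k + 1):
            for i in range(len(word) - length + 1):
                counts[word[i:i + length]] += 1
        return counts

    def k_abelian_equivalent(u, v, k):
        """u ~_k v iff every factor of length <= k occurs equally often in both;
        k = 1 is ordinary abelian equivalence."""
        return factor_counts(u, k) == factor_counts(v, k)

    print(k_abelian_equivalent("abab", "baba", 1))  # True: same letter counts
    print(k_abelian_equivalent("abab", "baba", 2))  # False: 'ab' occurs twice in one, once in the other
    ```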

    Joint Reconstruction of Multi-channel, Spectral CT Data via Constrained Total Nuclear Variation Minimization

    We explore the use of the recently proposed "total nuclear variation" (TNV) as a regularizer for reconstructing multi-channel, spectral CT images. This convex penalty is a natural extension of the total variation (TV) to vector-valued images and has the advantage of encouraging common edge locations and a shared gradient direction among image channels. We show how it can be incorporated into a general, data-constrained reconstruction framework and derive update equations based on the first-order, primal-dual algorithm of Chambolle and Pock. Early simulation studies based on the numerical XCAT phantom indicate that the inter-channel coupling introduced by the TNV leads to better preservation of image features at high levels of regularization, compared to independent, channel-by-channel TV reconstructions. Comment: Submitted to Physics in Medicine and Biology.
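
    The reconstruction framework itself is not reproduced here, but the penalty is easy to sketch. Assuming TNV takes its usual form, the sum over pixels of the nuclear norm of the per-pixel Jacobian whose rows are the channel-wise gradients, a rough NumPy illustration looks like the following; the forward differences and replicated-boundary handling are my assumptions, not the paper's discretization.

    ```python
    import numpy as np

    def total_nuclear_variation(img):
        """Sketch of TNV for a multi-channel image of shape (C, H, W): sum over
        pixels of the nuclear norm (sum of singular values) of the C x 2 matrix
        of per-channel forward-difference gradients."""
        gx = np.diff(img, axis=2, append=img[:, :, -1:])          # horizontal differences, (C, H, W)
        gy = np.diff(img, axis=1, append=img[:, -1:, :])          # vertical differences, (C, H, W)
        jac = np.stack([gx, gy], axis=-1).transpose(1, 2, 0, 3)   # per-pixel Jacobian, (H, W, C, 2)
        sigma = np.linalg.svd(jac, compute_uv=False)              # singular values per pixel, (H, W, 2)
        return sigma.sum()

    rng = np.random.default_rng(0)
    print(total_nuclear_variation(rng.standard_normal((3, 16, 16))))
    ```

    Coupling the channels through a single per-pixel nuclear norm is what rewards common edge locations and shared gradient directions, in contrast to summing independent per-channel TV terms.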

    A Characterization for Decidable Separability by Piecewise Testable Languages

    The separability problem for word languages of a class $\mathcal{C}$ by languages of a class $\mathcal{S}$ asks, for two given languages $I$ and $E$ from $\mathcal{C}$, whether there exists a language $S$ from $\mathcal{S}$ that includes $I$ and excludes $E$, that is, $I \subseteq S$ and $S \cap E = \emptyset$. In this work, we assume some mild closure properties for $\mathcal{C}$ and study for which such classes separability by a piecewise testable language (PTL) is decidable. We characterize these classes in terms of decidability of (two variants of) an unboundedness problem. From this, we deduce that separability by PTL is decidable for a number of language classes, such as the context-free languages and languages of labeled vector addition systems. Furthermore, it follows that separability by PTL is decidable if and only if one can compute, for any language of the class, its downward closure with respect to the scattered substring ordering (i.e., if the set of scattered substrings of any language of the class is effectively regular). The obtained decidability results contrast with some undecidability results. In fact, for all (non-regular) language classes that we present as examples with decidable separability, it is undecidable whether a given language is a PTL itself. Our characterization involves a result of independent interest, which states that for any kind of languages $I$ and $E$, non-separability by PTL is equivalent to the existence of common patterns in $I$ and $E$.
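
    The downward-closure characterization rests on the scattered substring (subsequence) ordering that underlies piecewise testable languages: membership in a PTL depends only on which scattered substrings up to some fixed length a word contains. The sketch below, not taken from the paper, simply implements that ordering and the bounded scattered-substring sets.

    ```python
    def is_scattered_substring(u, v):
        """True iff u can be obtained from v by deleting letters,
        i.e. u is below v in the scattered substring ordering."""
        it = iter(v)
        return all(letter in it for letter in u)

    def scattered_substrings_up_to(w, k):
        """All scattered substrings of w of length at most k; membership in a
        piecewise testable language depends only on this set for some fixed k."""
        subs = {""}
        for letter in w:
            subs |= {s + letter for s in subs if len(s) < k}
        return subs

    print(is_scattered_substring("ace", "abcde"))        # True
    print(is_scattered_substring("aec", "abcde"))        # False
    print(sorted(scattered_substrings_up_to("aba", 2)))  # ['', 'a', 'aa', 'ab', 'b', 'ba']
    ```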

    Low-Rank Inducing Norms with Optimality Interpretations

    Optimization problems with rank constraints appear in many diverse fields such as control, machine learning and image analysis. Since the rank constraint is non-convex, these problems are often approximately solved via convex relaxations. Nuclear norm regularization is the prevailing convexifying technique for dealing with these types of problems. This paper introduces a family of low-rank inducing norms and regularizers which includes the nuclear norm as a special case. A posteriori guarantees on solving an underlying rank-constrained optimization problem with these convex relaxations are provided. We evaluate the performance of the low-rank inducing norms on three matrix completion problems. In all examples, the nuclear norm heuristic is outperformed by convex relaxations based on other low-rank inducing norms. For two of the problems, there exist low-rank inducing norms that succeed in recovering the partially unknown matrix, while the nuclear norm fails. These low-rank inducing norms are shown to be representable as semi-definite programs. Moreover, these norms have cheaply computable proximal mappings, which makes it possible to also solve problems of large size using first-order methods.
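
    The paper's general family of norms and their proximal mappings are not reproduced here. As an illustration of the nuclear-norm special case only, its proximal mapping is singular-value soft-thresholding, shown in the short NumPy sketch below; the function name and test matrix are mine.

    ```python
    import numpy as np

    def prox_nuclear_norm(X, t):
        """Proximal mapping of t * (nuclear norm of X): soft-threshold the
        singular values of X by t. This covers only the nuclear-norm special
        case, not the paper's general low-rank inducing norms."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

    rng = np.random.default_rng(1)
    X = rng.standard_normal((6, 4))
    Y = prox_nuclear_norm(X, 1.0)
    print(np.linalg.matrix_rank(X), np.linalg.matrix_rank(Y))  # thresholding typically lowers the rank
    ```

    Cheap proximal mappings like this one are what make the first-order methods mentioned in the abstract practical for large problem sizes.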

    Tree transducers, L systems, and two-way machines

    A relationship between parallel rewriting systems and two-way machines is investigated. Restrictions on the “copying power” of these devices endow them with rich structuring and give insight into the issues of determinism, parallelism, and copying. Among the parallel rewriting systems considered are the top-down tree transducer, the generalized syntax-directed translation scheme, and the ET0L system, and among the two-way machines are the tree-walking automaton, the two-way finite-state transducer, and (generalizations of) the one-way checking stack automaton. The relationship of these devices to macro grammars is also considered. An effort is made to provide a systematic survey of a number of existing results.
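
    None of the machine models is defined in the abstract, but the parallel rewriting that distinguishes L systems is easy to illustrate. The sketch below shows a D0L system, the deterministic single-table special case of the ET0L systems mentioned above, iterating the Fibonacci morphism; it is an illustrative example, not material from the survey.

    ```python
    def d0l_step(word, rules):
        """One parallel rewriting step: every letter of the word is rewritten
        simultaneously by the morphism given in `rules`."""
        return "".join(rules[letter] for letter in word)

    def d0l_iterate(axiom, rules, n):
        """Apply the morphism n times, starting from the axiom."""
        word = axiom
        for _ in range(n):
            word = d0l_step(word, rules)
        return word

    # Fibonacci morphism: a -> ab, b -> a
    fib_rules = {"a": "ab", "b": "a"}
    print(d0l_iterate("a", fib_rules, 5))  # 'abaababaabaab'
    ```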