    Networks of picture processors

    The goal of this work is to survey in a systematic and uniform way the main results regarding different computational aspects of networks of picture processors viewed as rectangular picture accepting devices. We first consider networks with evolutionary picture processors only and discuss their computational power as well as a partial solution to the picture matching problem. Two variants of these networks, differentiated by the protocol of communication, are also surveyed: networks with filtered connections and networks with polarized processors. Then we consider networks having both types of processors, i.e., evolutionary processors and hiding processors, and provide a complete solution to the picture matching problem. Several results which follow from this solution are then presented. Finally, we discuss some possible directions for further research.
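
    To fix the notion for readers unfamiliar with picture languages (an illustrative sketch, not a construction from the survey): a picture is a rectangular array of symbols, and the picture matching problem asks, roughly, whether a pattern picture occurs as a contiguous sub-picture of a larger one. A brute-force check in Python:

        def occurs(pattern, picture):
            """True if the rectangular pattern appears as a contiguous sub-picture."""
            ph, pw = len(pattern), len(pattern[0])
            h, w = len(picture), len(picture[0])
            return any(all(picture[i + r][j:j + pw] == pattern[r] for r in range(ph))
                       for i in range(h - ph + 1) for j in range(w - pw + 1))

        print(occurs(["ab", "ba"], ["aab", "aba", "bba"]))   # True: pattern occurs at row 0, column 1
        print(occurs(["bb", "bb"], ["aab", "aba", "bba"]))   # False

    The networks surveyed above address this problem with parallel evolutionary operations rather than a sequential scan like this one.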

    On the Degree of Extension of Some Models Defining Non-Regular Languages

    This work is a survey of the main results reported on the degree of extension of two models defining non-regular languages, namely the context-free grammar and the extended finite automaton over groups. More precisely, we recall the main results regarding the degree of non-regularity of a context-free grammar as well as the degree of extension of finite automata over groups. Finally, we consider a similar measure for finite automata with translucent letters and present some preliminary results. This measure could be considered for many mechanisms that extend a less expressive one. (In Proceedings AFL 2023, arXiv:2309.0112)
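
    As an illustration of the last model mentioned (a sketch of the standard mechanism; the exact acceptance condition varies slightly between variants in the literature): a finite automaton with translucent letters does not read its input strictly left to right. In each state some letters are translucent and are skipped, and the automaton reads and erases the leftmost visible letter. The hypothetical toy automaton below accepts the non-regular language of words over {a, b} containing equally many a's and b's.

        TRANSLUCENT = {"q0": set(), "qa": {"a"}, "qb": {"b"}}
        DELTA = {("q0", "a"): "qa", ("q0", "b"): "qb", ("qa", "b"): "q0", ("qb", "a"): "q0"}
        FINAL = {"q0"}

        def accepts(word: str) -> bool:
            state, rest = "q0", list(word)
            while True:
                # index of the leftmost letter that is visible (not translucent) in this state
                visible = next((i for i, x in enumerate(rest) if x not in TRANSLUCENT[state]), None)
                if visible is None:
                    return state in FINAL and not rest   # here: accept only if nothing is left
                if (state, rest[visible]) not in DELTA:
                    return False
                state = DELTA[(state, rest[visible])]
                del rest[visible]                        # the letter just read is erased

        assert accepts("abba") and accepts("aabb") and accepts("")
        assert not accepts("aab") and not accepts("b")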

    Extended finite automata over groups

    We consider a simple and natural extension of a finite automaton, namely one in which an element of a given group is associated with each configuration. An input string is accepted if and only if the neutral element of the group is associated with a final configuration reached by the automaton. The accepting power is smaller for abelian groups than for non-abelian ones, and we prove that this is due to commutativity: each language accepted by a finite automaton over an abelian group is actually an unordered vector language. We obtain a new characterization of the context-free languages when the considered group is the free group with two generators; this result does not carry over to the deterministic case. Some remarks about other groups are also presented.
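
    A minimal sketch of the model (my example, not one from the paper): take the additive group (Z, +), which is abelian. Each transition also adds a group element, and a word is accepted if a final state is reached with accumulated value 0, the neutral element. Already this single abelian group yields the non-regular language { a^n b^n : n >= 1 }.

        TRANSITIONS = {
            # (state, symbol) -> (next state, group element added)
            ("q0", "a"): ("q0", +1),
            ("q0", "b"): ("q1", -1),
            ("q1", "b"): ("q1", -1),
        }
        FINAL = {"q1"}

        def accepts(word: str) -> bool:
            state, value = "q0", 0          # start state, neutral element of (Z, +)
            for symbol in word:
                if (state, symbol) not in TRANSITIONS:
                    return False
                state, delta = TRANSITIONS[(state, symbol)]
                value += delta              # the group operation
            return state in FINAL and value == 0

        assert accepts("aaabbb") and accepts("ab")
        assert not accepts("aabbb") and not accepts("ba")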

    Fairness in grammar systems

    Accepting networks of evolutionary picture processors

    We extend the study of networks of evolutionary processors accepting words to a similar model processing rectangular pictures. To this end, we introduce accepting networks of evolutionary picture processors and investigate their computational power. We show that these networks can accept the complement of any local picture language as well as picture languages that are not recognizable. Finally, some open problems regarding decidability issues and closure properties are discussed.
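
    For reference, the local picture languages mentioned above can be described by 2x2 tiles (this is the standard definition, not code from the paper): a picture belongs to a local language iff every 2x2 sub-block of the picture, once surrounded by a border symbol '#', is among the allowed tiles. A small membership checker:

        def tiles(picture):
            """All 2x2 sub-blocks of the '#'-bordered picture, as 4-character strings."""
            width = len(picture[0]) + 2
            rows = ["#" * width] + ["#" + r + "#" for r in picture] + ["#" * width]
            return {rows[i][j:j + 2] + rows[i + 1][j:j + 2]
                    for i in range(len(rows) - 1) for j in range(width - 1)}

        def in_local_language(picture, allowed):
            """A picture is in the local language iff all of its tiles are allowed."""
            return tiles(picture) <= allowed

        # Toy local language: the tiles of one valid picture (first row b's, a's below)
        ALLOWED = tiles(["bbb", "aaa", "aaa"])
        print(in_local_language(["bbbb", "aaaa"], ALLOWED))   # True
        print(in_local_language(["abab", "aaaa"], ALLOWED))   # False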

    Deciding Regularity of Hairpin Completions of Regular Languages in Polynomial Time

    The hairpin completion is an operation on formal languages that has been inspired by hairpin formation in DNA biochemistry and by DNA computing. In this paper we investigate the hairpin completion of regular languages. It is well known that hairpin completions of regular languages are linear context-free and not necessarily regular. As regularity of a (linear) context-free language is not decidable, the question arose whether regularity of a hairpin completion of regular languages is decidable. We prove that this problem is decidable and provide a polynomial-time algorithm. Furthermore, we prove that the hairpin completion of regular languages is an unambiguous linear context-free language and, as such, has an effectively computable growth function. Moreover, we show that the growth of the hairpin completion is exponential if and only if the growth of the underlying languages is exponential, and that if the hairpin completion is regular, then it and the underlying languages have the same growth indicator.
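
    A rough sketch of one common formulation of the operation (my rendering over the DNA alphabet; the paper's exact definition may differ in details such as length constraints): writing rc for the reversed Watson-Crick complement, if a word factors as w = gamma alpha beta rc(alpha) with |alpha| = k, then the right k-hairpin completion extends w by rc(gamma); the left completion is symmetric.

        COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

        def rc(w):
            """Reversed complement of a DNA word."""
            return "".join(COMPLEMENT[x] for x in reversed(w))

        def right_hairpin_completions(w, k):
            """All right k-hairpin completions of w, one per admissible factorisation."""
            results = set()
            for i in range(1, len(w)):                 # gamma = w[:i], non-empty
                gamma, rest = w[:i], w[i:]
                alpha = rest[:k]
                if len(rest) >= 2 * k and rest.endswith(rc(alpha)):
                    results.add(w + rc(gamma))
            return results

        # w = "G" + "AC" + "T" + rc("AC"): the completion appends rc("G") = "C"
        print(right_hairpin_completions("GACTGT", 2))   # {'GACTGTC'}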

    Structuring grammar systems by priorities and hierarchies

    A grammar system is a finite set of grammars that cooperate to generate a language. We consider two generalizations of grammar systems: (1) adding a priority relation between single grammar components, and (2) considering hierarchical components which are themselves grammar systems. The generative power of these generalized grammar systems is investigated and compared with the generative power of ordinary grammar systems and of some well-known types of grammars with regulated rewriting (such as matrix grammars). We prove that for many cooperation strategies the use of a priority relation increases the generative capacity; however, this is not the case for the maximal mode of derivation (an important case, because it gives a characterization of the ET0L languages). We also demonstrate that in many cases the use of hierarchical components does not increase the generative power.
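
    A small illustration of the cooperation effect such systems exploit (a toy example of mine, not one from the paper): four context-free components working in the maximal ("t") mode, in which the active component must keep rewriting until none of its rules applies, together generate { a^n b^n c^n : n >= 1 }, a language no single context-free component could produce.

        import random

        COMPONENTS = [
            {"S": ["AB"]},                       # start component
            {"A": ["aXb"], "B": ["cY"]},         # grow one a..b pair and one c (X, Y are primed copies)
            {"X": ["A"], "Y": ["B"]},            # hand control back for another round
            {"A": ["ab"], "B": ["c"]},           # terminate
        ]

        def t_mode_step(form, rules):
            """Rewrite with one component until none of its rules applies (maximal mode)."""
            changed = True
            while changed:
                changed = False
                for nt, rhss in rules.items():
                    if nt in form:
                        i = form.index(nt)
                        form = form[:i] + random.choice(rhss) + form[i + 1:]
                        changed = True
            return form

        def derive(max_rounds=12):
            form = "S"
            for _ in range(max_rounds):
                form = t_mode_step(form, random.choice(COMPONENTS))
                if form.islower():               # only terminal symbols left
                    return form
            return None                          # did not finish within the bound

        random.seed(1)
        words = {w for w in (derive() for _ in range(300)) if w}
        print(sorted(words, key=len))            # e.g. ['abc', 'aabbcc', 'aaabbbccc', ...]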

    Transducers based on networks of evolutionary processors

    We consider a new type of transducer that does not scan the input word sequentially. Instead, it consists of a directed graph whose nodes are processors that work in parallel, each specialized in just one type of very simple evolutionary operation: inserting a symbol, deleting a symbol, or substituting one symbol by another. The computation on an input word starts with this word placed in a designated node of the network, the input node, and alternates evolutionary and communication steps. The computation halts as soon as another designated node, the output node, is non-empty. The translation of the input word is the set of words present in the output node when the computation halts. We prove that these transducers can simulate the work of generalized sequential machines on every input. Furthermore, all words obtained by a given generalized sequential machine by the shortest computations on a given word can also be computed by the new transducers. Unlike the case of generalized sequential machines, every recursively enumerable language can be obtained as the transduction, defined by the new transducer, of a very simple regular language. The same idea may be used to prove that these transducers can simulate the shortest computations of an arbitrary Turing machine, used as a transducer, on every input word. Finally, we consider a restricted variant, namely pure NEP transducers, and prove that there are still regular languages whose pure NEP transductions are not semilinear.
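
    A toy sketch of the mechanism (my simplification, not the constructions used in the paper's proofs): every node is specialised in one operation, an evolutionary step rewrites the words held by each node, and a communication step sends copies along the edges to the nodes whose entry filter they pass; the run halts as soon as the output node becomes non-empty, and its contents are the translation. Real NEP transducers use input and output filters and move rather than copy words, so treat this only as an approximation.

        def substitute_one(word, a, b):
            """All words obtained by replacing a single occurrence of a with b."""
            return {word[:i] + b + word[i + 1:] for i, x in enumerate(word) if x == a} or {word}

        # node -> (evolutionary operation, entry filter for incoming words)
        NETWORK = {
            "in":  (lambda w: substitute_one(w, "a", "b"), lambda w: True),
            "mid": (lambda w: substitute_one(w, "b", "c"), lambda w: "a" not in w),
            "out": (lambda w: {w},                         lambda w: "b" not in w),
        }
        EDGES = {"in": ["mid"], "mid": ["out"], "out": []}

        def transduce(word, max_steps=50):
            contents = {n: set() for n in NETWORK}
            contents["in"] = {word}                      # the input word starts in the input node
            for _ in range(max_steps):
                # evolutionary step: every node rewrites each word it currently holds
                contents = {n: {v for w in ws for v in NETWORK[n][0](w)}
                            for n, ws in contents.items()}
                # communication step: copies travel along edges if they pass the target's filter
                delivered = [(m, w) for n, ws in contents.items()
                             for w in ws for m in EDGES[n] if NETWORK[m][1](w)]
                for m, w in delivered:
                    contents[m].add(w)
                if contents["out"]:                      # halt: the output node is non-empty
                    return contents["out"]
            return set()

        print(transduce("aa"))   # every a rewritten to b, then every b to c -> {'cc'}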