
    Automata with Nested Pebbles Capture First-Order Logic with Transitive Closure

    String languages recognizable in (deterministic) log-space are characterized either by two-way (deterministic) multi-head automata, or, following Immerman, by first-order logic with (deterministic) transitive closure. Here we elaborate this result, and match the number of heads to the arity of the transitive closure. More precisely, first-order logic with k-ary deterministic transitive closure has the same power as deterministic automata walking on their input with k heads, additionally using a finite set of nested pebbles. This result is valid for strings, ordered trees, and in general for families of graphs having a fixed automaton that can be used to traverse the nodes of each of the graphs in the family. Other examples of such families are grids, toruses, and rectangular mazes. For nondeterministic automata, the logic is restricted to positive occurrences of transitive closure. The special case k=1 for trees shows that single-head deterministic tree-walking automata with nested pebbles are characterized by first-order logic with unary deterministic transitive closure. This refines our earlier result that placed these automata between first-order and monadic second-order logic on trees. Comment: Paper for Logical Methods in Computer Science, 27 pages, 1 figure.
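
    The automaton side of the correspondence can be made concrete with a toy. The sketch below (plain Python, purely illustrative; it is not a construction from the paper) simulates a single-head deterministic string-walking automaton that drops one pebble to remember a position while it probes ahead and then returns to it. The nested-pebble discipline is trivial here because only one pebble is ever placed, and the property checked is an arbitrary illustrative choice.

        # Illustrative sketch only: a deterministic single-head walking automaton
        # on a string that uses one pebble to remember a position.  The property
        # checked here: does the input contain two letters 'a' exactly three
        # positions apart?  A finite-state head cannot store a position in its
        # state, which is what the pebble is for.

        def has_a_pair_at_distance_3(word: str) -> bool:
            head = 0
            pebble = None                      # position of the single pebble, if dropped
            while head < len(word):
                if word[head] == 'a':
                    pebble = head              # drop the pebble to mark this 'a'
                    probe = head + 3           # walk three steps to the right
                    if probe < len(word) and word[probe] == 'a':
                        return True            # accept
                    head = pebble              # walk back to the pebble ...
                    pebble = None              # ... and lift it (nesting is trivial here)
                head += 1                      # continue scanning to the right
            return False                       # reject

        assert has_a_pair_at_distance_3("bbabba")       # 'a' at positions 2 and 5
        assert not has_a_pair_at_distance_3("ababab")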

    Logic programming in the context of multiparadigm programming: the Oz experience

    Oz is a multiparadigm language that supports logic programming as one of its major paradigms. A multiparadigm language is designed to support different programming paradigms (logic, functional, constraint, object-oriented, sequential, concurrent, etc.) with equal ease. This article has two goals: to give a tutorial of logic programming in Oz and to show how logic programming fits naturally into the wider context of multiparadigm programming. Our experience shows that there are two classes of problems, which we call algorithmic and search problems, for which logic programming can help formulate practical solutions. Algorithmic problems have known efficient algorithms. Search problems do not have known efficient algorithms but can be solved with search. The Oz support for logic programming targets these two problem classes specifically, using the concepts needed for each. This is in contrast to the Prolog approach, which targets both classes with one set of concepts, resulting in less than optimal support for each class. To explain the essential difference between algorithmic and search programs, we define the Oz execution model. This model subsumes both concurrent logic programming (committed-choice style) and search-based logic programming (Prolog style). Instead of Horn clause syntax, Oz has a simple, fully compositional, higher-order syntax that accommodates the abilities of the language. We conclude with lessons learned from this work, a brief history of Oz, and many entry points into the Oz literature. Comment: 48 pages, to appear in the journal "Theory and Practice of Logic Programming".
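
    To make the article's distinction concrete, here is a small sketch in plain Python (not Oz syntax; the two example problems are illustrative choices, not taken from the article): an algorithmic problem is solved by a fixed deterministic computation, while a search problem is solved by exploring choice points and backtracking.

        # Algorithmic problem: a known efficient algorithm exists, so the program
        # is a single deterministic computation -- e.g. Euclid's gcd.
        def gcd(a: int, b: int) -> int:
            while b:
                a, b = b, a % b
            return a

        # Search problem: no efficient algorithm is known, so the program explores
        # choice points and backtracks -- e.g. placing n non-attacking queens.
        def queens(n: int, placed=()):
            if len(placed) == n:
                yield placed                        # one complete solution
                return
            row = len(placed)
            for col in range(n):                    # each column is a choice point
                if all(col != c and abs(col - c) != row - r
                       for r, c in enumerate(placed)):
                    yield from queens(n, placed + (col,))
                    # falling out of the recursion here is the backtrack

        print(gcd(252, 105))       # -> 21
        print(next(queens(6)))     # -> first placement found, (1, 3, 5, 0, 2, 4)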

    Distributed Recovery in Applicative Systems

    Applicative systems are promising candidates for achieving high-performance computing through aggregation of processors. This paper studies fault recovery problems in a class of applicative systems. The concept of functional checkpointing is proposed as the nucleus of a distributed recovery mechanism. This entails incrementally building a resilient structure as the evaluation of an applicative program proceeds. A simple rollback algorithm is suggested to regenerate the corrupted structure by redoing the most effective functional checkpoints. Another algorithm, which attempts to recover intermediate results, is also presented. The parent of a faulty task reproduces a functional twin of the failed task. The regenerated task inherits all offspring of the faulty task so that partial results can be salvaged.
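
    A loose sketch of the idea, in Python and with details that are assumptions rather than the paper's actual algorithms: because applicative evaluation is side-effect free, a finished subcomputation can be checkpointed by value, and a parent that re-issues a functional twin of a failed task can salvage the checkpointed results of the task's finished offspring instead of recomputing them.

        checkpoints = {}                      # subtree -> value of finished subtasks
        faults = {('+', 3, 4)}                # this subtask fails on its first attempt

        def fail_once(expr):
            if expr in faults:
                faults.remove(expr)
                return True
            return False

        def evaluate(expr, may_fail=lambda e: False):
            """expr is a leaf value or a nested tuple ('+', left, right)."""
            if expr in checkpoints:
                return checkpoints[expr]      # salvage a partial result
            if not isinstance(expr, tuple):
                return expr
            _, left, right = expr
            value = evaluate(left, may_fail) + evaluate(right, may_fail)
            if may_fail(expr):                # simulate a fault in this task
                raise RuntimeError(f"task {expr} failed")
            checkpoints[expr] = value         # functional checkpoint for this subtree
            return value

        def evaluate_with_recovery(expr, may_fail):
            try:
                return evaluate(expr, may_fail)
            except RuntimeError:              # the parent re-issues a functional twin;
                return evaluate(expr)         # finished offspring come from `checkpoints`

        print(evaluate_with_recovery(('+', ('+', 1, 2), ('+', 3, 4)), fail_once))   # -> 10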

    Pattern matching in compilers

    In this thesis we develop tools for effective and flexible pattern matching. We introduce a new pattern matching system called amethyst. Amethyst is not only a generator of parsers for programming languages, but can also serve as an alternative to tools for matching regular expressions. Our framework also produces dynamic parsers. Its intended use is in the context of IDEs (accurate syntax highlighting and error detection on the fly). Amethyst offers pattern matching of general data structures. This makes it a useful tool for implementing compiler optimizations such as constant folding, instruction scheduling, and dataflow analysis in general. The parsers produced are essentially top-down parsers. Linear time complexity is obtained by introducing the novel notions of structured grammars and regularized regular expressions. Amethyst uses techniques known from compiler optimizations to produce effective parsers. Comment: master thesis.
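
    For orientation, the toy Python combinators below show the kind of top-down, PEG-style matching such a system builds on. They are not amethyst, do not implement its structured grammars or regularized regular expressions, and therefore carry no linear-time guarantee.

        # Minimal PEG-style combinators: each parser takes (text, position) and
        # returns the new position on success or None on failure.

        def lit(s):
            return lambda text, i: i + len(s) if text.startswith(s, i) else None

        def seq(*ps):                              # match the parsers in sequence
            def parse(text, i):
                for p in ps:
                    i = p(text, i)
                    if i is None:
                        return None
                return i
            return parse

        def alt(*ps):                              # ordered choice, as in PEGs
            def parse(text, i):
                for p in ps:
                    j = p(text, i)
                    if j is not None:
                        return j
                return None
            return parse

        # balanced ::= "(" balanced ")" balanced | ""
        def balanced(text, i):
            return alt(seq(lit("("), balanced, lit(")"), balanced), lit(""))(text, i)

        print(balanced("(()())", 0) == len("(()())"))   # True: whole input matched
        print(balanced("(()", 0))                       # 0: only the empty prefix matches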

    Towards a Parallel Topological Watershed: First Results

    Computer assisted audiometric evaluation system

    A computer-based audiometric evaluation system has been developed. The system makes use of an IBM PC/XT/AT compatible personal computer to perform pure tone and speech tests and comprises a plug-in card and custom software. The card contains pure tone and masking noise generators, together with amplifiers for a set of headphones and a bone conduction transducer, patient and audiologist microphone amplifiers and a hand-held infra-red remote-control unit. A voice-operated gain-adjusting device on the audiologist's microphone eliminates the need for a sound pressure level meter during speech tests. The software-based user interface makes use of overlaid pop-up menus, context-sensitive assistance and a text editor on a graphics screen. Pure tone and speech data are acquired and displayed on a dynamic audiogram and speech discrimination gram respectively. This data may be stored in and later retrieved from a patient database. Further audiometric tests may be incorporated at a later stage.

    A survey on algorithmic aspects of modular decomposition

    Modular decomposition is a technique that applies to, but is not restricted to, graphs. The notion of a module naturally appears in the proofs of many graph-theoretic theorems. Computing the modular decomposition tree is an important preprocessing step for solving a large number of combinatorial optimization problems. Since the first polynomial-time algorithm in the early 1970s, the algorithmics of modular decomposition has seen important developments. This paper surveys the ideas and techniques that arose from this line of research.
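
    For readers new to the notion: a module of a graph is a vertex set M such that every vertex outside M is adjacent either to all of M or to none of it. The brute-force check below (plain Python, exponential, purely for illustration; the surveyed algorithms compute the modular decomposition tree in polynomial or even linear time) enumerates the non-trivial modules of a small example graph.

        from itertools import combinations

        def is_module(adj, m):
            # every vertex outside m sees all of m or none of m
            return all(set(m) <= adj[v] or not (set(m) & adj[v])
                       for v in adj if v not in m)

        def nontrivial_modules(adj):
            # brute force over all vertex subsets of size 2 .. n-1
            vs = list(adj)
            return [set(c) for k in range(2, len(vs))
                           for c in combinations(vs, k)
                           if is_module(adj, c)]

        # A diamond (K4 minus the edge a-d): edges a-b, a-c, b-c, b-d, c-d.
        # Its non-trivial modules are {a,d}, {b,c}, {a,b,d} and {a,c,d}.
        adj = {'a': {'b', 'c'}, 'b': {'a', 'c', 'd'},
               'c': {'a', 'b', 'd'}, 'd': {'b', 'c'}}
        print(nontrivial_modules(adj))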

    Data structures

    We discuss data structures and their methods of analysis. In particular, we treat the unweighted and weighted dictionary problem, self-organizing data structures, persistent data structures, the union-find-split problem, priority queues, the nearest common ancestor problem, the selection and merging problem, and dynamization techniques. The methods of analysis are worst-case, average-case and amortized analysis.
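
    As a small illustration of one of the topics listed (the union part of the union-find-split problem), here is a standard union-find with union by rank and path compression in Python. Its interest from the survey's perspective is the amortized analysis: a sequence of m operations on n elements costs O(m alpha(n)), with alpha the very slowly growing inverse Ackermann function.

        class UnionFind:
            def __init__(self, n):
                self.parent = list(range(n))
                self.rank = [0] * n

            def find(self, x):
                while self.parent[x] != x:
                    self.parent[x] = self.parent[self.parent[x]]   # path halving
                    x = self.parent[x]
                return x

            def union(self, x, y):
                rx, ry = self.find(x), self.find(y)
                if rx == ry:
                    return
                if self.rank[rx] < self.rank[ry]:
                    rx, ry = ry, rx
                self.parent[ry] = rx            # attach the smaller-rank root below
                if self.rank[rx] == self.rank[ry]:
                    self.rank[rx] += 1

        uf = UnionFind(6)
        uf.union(0, 1); uf.union(1, 2); uf.union(4, 5)
        print(uf.find(0) == uf.find(2), uf.find(3) == uf.find(5))   # True False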