    Characterizations of Polynomial Complexity Classes with a Better Intensionality

    In this paper, we study characterizations of polynomial complexity classes using first-order functional programs, and we try to improve their intensionality, that is, the number of natural algorithms captured. We use polynomial assignments over the reals. The assignments are inspired by the notions of quasi-interpretation and sup-interpretation, and are decidable when considering polynomials of bounded degree ranging over real numbers. Contrary to quasi-interpretations, the considered assignments are not required to have the subterm property. Consequently, they capture a strictly larger number of natural algorithms (including quotient, gcd, and duplicate elimination from a list) than previous characterizations using quasi-interpretations.
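
    As a worked illustration (our example, not one spelled out above): take truncated subtraction, a building block of quotient and gcd, defined by the rules minus(x,0) -> x and minus(s(x),s(y)) -> minus(x,y). The polynomial assignment

        \theta(0) = 0,  \theta(s)(X) = X + 1,  \theta(minus)(X,Y) = X

    validates both rules (in particular \theta(minus)(X+1,Y+1) = X + 1 \geq X) and bounds every output of minus by the size of its first argument. It is not a quasi-interpretation: the subterm property would demand \theta(minus)(X,Y) \geq Y, which fails whenever Y > X, and dropping that demand is precisely what lets such assignments handle quotient and gcd.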

    Quasi-friendly sup-interpretations

    In a previous paper, the sup-interpretation method was proposed as a new tool for controlling, by static analysis, the memory resources of first-order functional programs with pattern matching. Basically, a sup-interpretation provides an upper bound on the size of function outputs. In that work, a criterion, applicable to terminating as well as non-terminating programs, was developed in order to bound the stack-frame size polynomially. In this paper, we suggest a new criterion which captures more algorithms computing values polynomially bounded in the size of the inputs. Since this work is related to quasi-interpretations, we compare the two notions and obtain two main features. The first is that, given a program, we have heuristics for finding a sup-interpretation when we consider polynomials of bounded degree. The other is a characterization of the sets of functions computable in polynomial time and in polynomial space.
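
    To make the notion concrete (a minimal sketch of ours, not taken from the paper): for the program double(0) -> 0, double(s(x)) -> s(s(double(x))), the assignment

        \theta(0) = 0,  \theta(s)(X) = X + 1,  \theta(double)(X) = 2X

    is a sup-interpretation, since \theta(double)(X+1) = 2X + 2 equals \theta(s(s(double(x)))); hence the value of double(v) has size at most twice the size of v, a bound that holds without any termination assumption.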

    12th International Workshop on Termination (WST 2012), February 19–23, 2012, Obergurgl, Austria / ed. by Georg Moser

    This volume contains the proceedings of the 12th International Workshop on Termination (WST 2012), to be held February 19–23, 2012, in Obergurgl, Austria. The goal of the Workshop on Termination is to be a venue for the presentation and discussion of all topics in and around termination. In this way, the workshop tries to bridge the gaps between the different communities interested and active in research in and around termination. The 12th International Workshop on Termination in Obergurgl continues the successful workshops held in St. Andrews (1993), La Bresse (1995), Ede (1997), Dagstuhl (1999), Utrecht (2001), Valencia (2003), Aachen (2004), Seattle (2006), Paris (2007), Leipzig (2009), and Edinburgh (2010). WST 2012 welcomed contributions on all aspects of termination and complexity analysis. Contributions from the imperative, constraint, functional, and logic programming communities, and papers investigating applications of complexity or termination (for example in program transformation or theorem proving), were particularly welcome. We received 18 submissions, all of which were accepted; each paper was assigned two reviewers. In addition to these 18 contributed talks, WST 2012 will host three invited talks by Alexander Krauss, Martin Hofmann, and Fausto Spoto.

    Synthesis of sup-interpretations: a survey

    In this paper, we survey the complexity of distinct methods that allow the programmer to synthesize a sup-interpretation, a function providing an upper bound on the size of the output values computed by a program. It is a static space-analysis tool that leaves time consumption out of consideration. Although clearly related, sup-interpretation is independent of termination, since it only provides an upper bound on the terminating computations. First, we study some undecidable properties of sup-interpretations from a theoretical point of view. Next, we fix term rewriting systems as our computational model and show that a sup-interpretation can be obtained through the use of a well-known termination technique, polynomial interpretations. The drawback is that such a method only applies to total functions (strongly normalizing programs). To overcome this problem, we also study sup-interpretations through the notion of quasi-interpretation. Quasi-interpretations also suffer from a drawback that lies in the subterm property, which drastically restricts the shape of the considered functions. Again we overcome this problem by introducing a new notion of interpretation based mainly on the dependency pairs method. We study the decidability and complexity of the sup-interpretation synthesis problem for all three of these tools over sets of polynomials. Finally, we build on previous work on termination and runtime complexity to infer sup-interpretations.
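
    As a hedged illustration of the polynomial-interpretation route (our example, not from the survey): addition, given by add(0,y) -> y and add(s(x),y) -> s(add(x,y)), is proved terminating by the strictly monotonic interpretation

        [0] = 1,  [s](X) = X + 1,  [add](X,Y) = 2X + Y

    since [add](1,Y) = Y + 2 > Y and [add](X+1,Y) = 2X + Y + 2 > 2X + Y + 1 = [s]([add](X,Y)). Because the interpretation of a value also bounds its size, the same polynomial doubles as a sup-interpretation: |add(u,v)| \leq 2|u| + |v|.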

    Lower Complexity Bounds for Lifted Inference

    One of the big challenges in the development of probabilistic relational (or probabilistic logical) modeling and learning frameworks is the design of inference techniques that operate on the level of the abstract model representation language, rather than on the level of ground, propositional instances of the model. Numerous approaches for such "lifted inference" techniques have been proposed. While it has been demonstrated that these techniques lead to significantly more efficient inference on some specific models, there are only very recent and still quite restricted results that show the feasibility of lifted inference on certain syntactically defined classes of models. Lower complexity bounds that imply some limitations for the feasibility of lifted inference on more expressive model classes were established early on in (Jaeger 2000). However, it is not immediate that these results also apply to the type of modeling languages that currently receive the most attention, i.e., weighted, quantifier-free formulas. In this paper we extend these earlier results and show that, under the assumption that NETIME ≠ ETIME, there is no polynomial lifted inference algorithm for knowledge bases of weighted, quantifier- and function-free formulas. Further strengthening earlier results, this is also shown to hold for approximate inference and for knowledge bases not containing the equality predicate.
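
    For context (the standard semantics of weighted formulas in the Markov logic style, summarized by us rather than quoted from the paper): a knowledge base {(w_i, F_i)} of weighted, quantifier- and function-free formulas over a domain of size n induces the partition function

        Z(n) = \sum_{\omega} \exp\left( \sum_i w_i \, n_i(\omega) \right)

    where \omega ranges over ground worlds and n_i(\omega) counts the true groundings of F_i in \omega. Polynomial lifted inference asks for Z(n), and hence for marginal probabilities, in time polynomial in n; the result above rules this out for the full class unless NETIME = ETIME.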

    Reasoning about Minimal Belief and Negation as Failure

    We investigate the problem of reasoning in the propositional fragment of MBNF, the logic of minimal belief and negation as failure introduced by Lifschitz, which can be considered a unifying framework for several nonmonotonic formalisms, including default logic, autoepistemic logic, circumscription, epistemic queries, and logic programming. We characterize the complexity and provide algorithms for reasoning in propositional MBNF. In particular, we show that entailment in propositional MBNF lies at the third level of the polynomial hierarchy; it is therefore harder than reasoning in all the above-mentioned propositional formalisms for nonmonotonic reasoning. We also prove the exact correspondence between negation as failure in MBNF and negative introspection in Moore's autoepistemic logic.
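
    To fix intuitions (a standard encoding as we recall it, not an example from the paper): MBNF combines a minimal-belief modality B with a negation-as-failure modality not, so the default "birds fly unless known otherwise" can be written as

        B(bird) \wedge not(\neg fly) \supset B(fly)

    read: if bird is believed and \neg fly is not derivable, then fly is believed. Entailment asks whether such a conclusion holds in all preferred models of the theory; this is the problem located above at the third level of the polynomial hierarchy.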

    Intensional properties of polygraphs

    We present polygraphic programs, a subclass of Albert Burroni's polygraphs, as a computational model, showing how these objects can be seen as first-order functional programs. We prove that the model is Turing complete. We use polygraphic interpretations, a termination proof method introduced by the second author, to characterize polygraphic programs that compute in polynomial time. We conclude with a characterization of polynomial-time functions and non-deterministic polynomial-time functions.

    Resource control of object-oriented programs

    A sup-interpretation is a tool which provides an upper bound on the size of a value computed by some symbol of a program. Sup-interpretations have proved useful for dealing with the complexity of first-order functional programs; for instance, they allow one to characterize all the functions bitwise computable in Alogtime. This paper is an attempt to adapt the framework of sup-interpretations to a fragment of object-oriented programs, including distinct encodings of numbers through the use of constructor symbols, loop and while constructs, and non-recursive methods with side effects. We give a criterion, called the brotherly criterion, which ensures that each brotherly program computes objects whose size is polynomially bounded by the sizes of the inputs.
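
    A hypothetical sketch of the kind of fragment targeted (the class and method names are ours, and the paper works in an abstract object-oriented language rather than Java):

        // Numbers encoded through constructor-style objects:
        // null plays the role of zero, wrapping adds a successor.
        class Nat {
            Nat pred;
            Nat(Nat pred) { this.pred = pred; }
        }

        class Doubler {
            // Non-recursive method with a while loop and a side effect on out.
            // Each iteration adds two constructors, so the result size is at
            // most 2*|n|: the kind of polynomial bound (here, 2X) that a
            // brotherly-style analysis would be expected to assign.
            Nat twice(Nat n) {
                Nat out = null;
                while (n != null) {
                    out = new Nat(new Nat(out));
                    n = n.pred;
                }
                return out;
            }
        }

    This is only meant to illustrate the target shape: constructor-encoded numbers, a while loop, and a non-recursive method whose output grows polynomially in the size of its input.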