    12th International Workshop on Termination (WST 2012) : WST 2012, February 19–23, 2012, Obergurgl, Austria / ed. by Georg Moser

    This volume contains the proceedings of the 12th International Workshop on Termination (WST 2012), to be held February 19–23, 2012 in Obergurgl, Austria. The goal of the Workshop on Termination is to be a venue for the presentation and discussion of all topics in and around termination. In this way, the workshop tries to bridge the gaps between the different communities interested and active in research in and around termination. The 12th International Workshop on Termination in Obergurgl continues the successful workshops held in St. Andrews (1993), La Bresse (1995), Ede (1997), Dagstuhl (1999), Utrecht (2001), Valencia (2003), Aachen (2004), Seattle (2006), Paris (2007), Leipzig (2009), and Edinburgh (2010). WST 2012 welcomed contributions on all aspects of termination and complexity analysis. Contributions from the imperative, constraint, functional, and logic programming communities, and papers investigating applications of complexity or termination (for example in program transformation or theorem proving), were particularly welcome. We received 18 submissions, all of which were accepted; each paper was assigned two reviewers. In addition to these 18 contributed talks, WST 2012 hosts three invited talks by Alexander Krauss, Martin Hofmann, and Fausto Spoto.

    A Combination Framework for Complexity

    In this paper we present a combination framework for polynomial complexity analysis of term rewrite systems. The framework covers both derivational and runtime complexity analysis. We present generalisations of powerful complexity techniques, notably a generalisation of complexity pairs and (weak) dependency pairs. Finally, we also present a novel technique, called dependency graph decomposition, that greatly increases modularity in the dependency pair setting. We employ the framework in the automated complexity tool TCT. TCT implements a majority of the techniques found in the literature, witnessing that our framework is general enough to capture a very broad setting.
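
    For orientation, the two measures mentioned above are the standard ones from the rewriting literature; restated here in our own notation (not quoted from the paper), for a TRS $\mathcal{R}$ with derivation height $\mathrm{dh}(t)$, the length of a longest rewrite sequence starting from $t$:

        $\mathrm{dc}_{\mathcal{R}}(n) = \max\{\, \mathrm{dh}(t) \mid |t| \le n \,\}$   (derivational complexity, over all terms)
        $\mathrm{rc}_{\mathcal{R}}(n) = \max\{\, \mathrm{dh}(t) \mid t \text{ basic},\ |t| \le n \,\}$   (runtime complexity, over basic terms only)

    Here a basic term is a defined symbol applied to constructor terms, which models a function call on already evaluated arguments.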

    A Complexity Preserving Transformation from Jinja Bytecode to Rewrite Systems

    We revisit known transformations from Jinja bytecode to rewrite systems from the viewpoint of runtime complexity. Suitably generalising the constructions proposed in the literature, we define an alternative representation of Jinja bytecode (JBC) executions as "computation graphs", from which we obtain a novel representation of JBC executions as "constrained rewrite systems". We prove non-termination and complexity preservation of the transformation. We restrict to well-formed JBC programs that only make use of non-recursive methods and expect tree-shaped objects as input. Our approach allows for simplified correctness proofs and provides a framework for combining the computation graph method with standard techniques from static program analysis, such as "reachability analysis".
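
    As a rough illustration of the target formalism (a toy example of ours, not the authors' construction), a bytecode loop that increments a counter i while i < n can be abstracted as a constrained rewrite system over the integers:

        eval(i, n) -> eval(i + 1, n)   [ i < n ]
        eval(i, n) -> halt(i, n)       [ i >= n ]

    A runtime complexity bound for such a rewrite system then bounds the number of loop iterations executed by the original bytecode, which is what makes a complexity-preserving transformation useful for program analysis.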

    Polynomial Path Orders: A Maximal Model

    This paper is concerned with the automated complexity analysis of term rewrite systems (TRSs for short) and the ramifications of these in implicit computational complexity theory (ICC for short). We introduce a novel path order with multiset status, the polynomial path order POP*. Essentially relying on the principle of predicative recursion as proposed by Bellantoni and Cook, its distinct feature is the tight control of resources on compatible TRSs: the (innermost) runtime complexity of compatible TRSs is polynomially bounded. We have implemented the technique; as underpinned by our experimental evidence, our approach to automated runtime complexity analysis is not only feasible but, compared to existing methods, incredibly fast. As an application in the context of ICC, we provide an order-theoretic characterisation of the polytime computable functions. To be precise, the polytime computable functions are exactly the functions computable by an orthogonal constructor TRS compatible with POP*.
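
    To give a flavour of the systems such an order targets (our example; its compatibility with POP* is our assumption, not a claim made in the abstract), consider the constructor TRS for addition over unary numerals:

        add(0, y) -> y
        add(s(x), y) -> s(add(x, y))

    The recursion is predicative in the sense of Bellantoni and Cook, and the innermost runtime complexity is linear in the size of the first argument, hence polynomially bounded, which is exactly the kind of guarantee POP*-compatibility is meant to certify.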

    Polynomial Path Orders

    This paper is concerned with the complexity analysis of constructor term rewrite systems and its ramifications in implicit computational complexity. We introduce a path order with multiset status, the polynomial path order POP*, that is applicable in two related, but distinct contexts. On the one hand, POP* induces polynomial innermost runtime complexity and hence may serve as a syntactic, and fully automatable, method to analyse the innermost runtime complexity of term rewrite systems. On the other hand, POP* provides an order-theoretic characterisation of the polytime computable functions: the polytime computable functions are exactly the functions computable by an orthogonal constructor TRS compatible with POP*.
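
    Spelled out as a formula (notation ours), the characterisation stated above reads:

        $\mathrm{FP} = \{\, f \mid f \text{ is computed by an orthogonal constructor TRS compatible with } \mathsf{POP}^{*} \,\}$

    where FP denotes the class of polytime computable functions.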

    Synthesis of sup-interpretations: a survey

    In this paper, we survey the complexity of distinct methods that allow the programmer to synthesize a sup-interpretation, a function providing an upper bound on the size of the output values computed by a program. It is a static space analysis tool that does not consider time consumption. Although clearly related, sup-interpretation is independent of termination, since it only provides an upper bound on the terminating computations. First, we study some undecidable properties of sup-interpretations from a theoretical point of view. Next, we fix term rewriting systems as our computational model and show that a sup-interpretation can be obtained through the use of a well-known termination technique, polynomial interpretations. The drawback is that such a method only applies to total functions (strongly normalizing programs). To overcome this problem, we also study sup-interpretations through the notion of quasi-interpretation. Quasi-interpretations also suffer from a drawback that lies in the subterm property, which drastically restricts the shape of the considered functions. Again, we overcome this problem by introducing a new notion of interpretation mainly based on the dependency pair method. We study the decidability and complexity of the sup-interpretation synthesis problem for all three of these tools over sets of polynomials. Finally, we build on previous work on termination and runtime complexity to infer sup-interpretations.
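
    A minimal example of what a sup-interpretation provides (our illustration, not taken from the survey): for the rules

        double(0) -> 0
        double(s(x)) -> s(s(double(x)))

    the assignment $\theta(0) = 1$, $\theta(s)(x) = x + 1$, $\theta(double)(x) = 2x$ is a sup-interpretation: it bounds the size of every computed output value, here $|double(t)| \le 2|t|$ for every numeral $t$, while saying nothing about how many rewrite steps the computation takes.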

    Bayesian peak-bagging of solar-like oscillators using MCMC: A comprehensive guide

    Context: Asteroseismology has entered a new era with the advent of the NASA Kepler mission. Long and continuous photometric observations of unprecedented quality are now available, which have stimulated the development of a number of suites of innovative analysis tools. Aims: The power spectra of solar-like oscillations are an inexhaustible source of information on stellar structure and evolution. Robust methods are hence needed in order to infer both individual oscillation mode parameters and parameters describing non-resonant features, thus making a seismic interpretation possible. Methods: We present a comprehensive guide to the implementation of a Bayesian peak-bagging tool that employs a Markov chain Monte Carlo (MCMC) method. Besides making it possible to incorporate relevant prior information through Bayes' theorem, this tool also allows one to obtain the marginal probability density function for each of the fitted parameters. We apply this tool to a couple of recent asteroseismic data sets, namely, to CoRoT observations of HD 49933 and to ground-based observations made during a campaign devoted to Procyon. Results: The developed method performs remarkably well, not only in the traditional case of extracting oscillation frequencies, but also when pushing the limit where traditional methods have difficulties. Moreover, it provides a rigorous way of comparing competing models, such as the ridge identifications, against the asteroseismic data.
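
    For readers who want a concrete starting point, the sketch below shows the core of such a peak-bagging fit in Python with the emcee ensemble sampler: a single Lorentzian mode profile on a flat background, the chi-squared-with-two-degrees-of-freedom likelihood appropriate for a power spectrum, and flat priors. It is a minimal illustration under our own assumptions (synthetic data, placeholder prior bounds), not the authors' pipeline.

        import numpy as np
        import emcee

        def model(theta, nu):
            # Single Lorentzian oscillation mode on top of a flat background.
            height, width, nu0, bkg = theta
            return height / (1.0 + 4.0 * ((nu - nu0) / width) ** 2) + bkg

        def log_likelihood(theta, nu, power):
            # Power-spectrum statistics (chi-squared with 2 d.o.f. per bin):
            #   ln L = -sum_i [ ln M(nu_i) + P_i / M(nu_i) ]
            m = model(theta, nu)
            return -np.sum(np.log(m) + power / m)

        def log_prior(theta):
            height, width, nu0, bkg = theta
            # Flat priors with illustrative (placeholder) bounds.
            if 0.0 < height < 50.0 and 0.1 < width < 10.0 and 1790.0 < nu0 < 1810.0 and 0.0 < bkg < 5.0:
                return 0.0
            return -np.inf

        def log_posterior(theta, nu, power):
            lp = log_prior(theta)
            if not np.isfinite(lp):
                return -np.inf
            return lp + log_likelihood(theta, nu, power)

        # Synthetic spectrum for demonstration: Lorentzian times unit-mean exponential noise.
        rng = np.random.default_rng(0)
        nu = np.linspace(1780.0, 1820.0, 2000)
        true = np.array([10.0, 2.0, 1800.0, 1.0])
        power = model(true, nu) * rng.exponential(1.0, size=nu.size)

        ndim, nwalkers = 4, 32
        p0 = true + 1e-3 * rng.standard_normal((nwalkers, ndim))
        sampler = emcee.EnsembleSampler(nwalkers, ndim, log_posterior, args=(nu, power))
        sampler.run_mcmc(p0, 3000, progress=False)

        # Flattened, burn-in-removed samples give the marginal posterior of each parameter.
        samples = sampler.get_chain(discard=1000, flat=True)
        print(np.percentile(samples, [16, 50, 84], axis=0))

    The marginal percentiles printed at the end are the kind of per-parameter summaries the abstract refers to; in practice one would fit many modes simultaneously and compare competing models, for example the ridge identifications, through Bayesian model comparison.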