
    AC-KBO Revisited

    Equational theories that contain axioms expressing associativity and commutativity (AC) of certain operators are ubiquitous. Theorem proving methods in such theories rely on well-founded orders that are compatible with the AC axioms. In this paper we consider various definitions of AC-compatible Knuth-Bendix orders. The orders of Steinbach and of Korovin and Voronkov are revisited. The former is enhanced to a more powerful version, and we modify the latter to amend its lack of monotonicity on non-ground terms. We further present new complexity results. An extension reflecting the recent proposal of subterm coefficients in standard Knuth-Bendix orders is also given. The various orders are compared on problems in termination and completion.

    Comment: 31 pages. To appear in Theory and Practice of Logic Programming (TPLP), special issue for the 12th International Symposium on Functional and Logic Programming (FLOPS 2014).
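
    For orientation only, the following is a minimal sketch of the standard (non-AC) Knuth-Bendix order restricted to ground terms; the AC-compatible variants studied in the paper refine this base definition. The signature, weights, and precedence are hypothetical.

        # Minimal sketch of a standard (non-AC) Knuth-Bendix order on ground terms.
        # Weights and precedence below are hypothetical; a real KBO also handles
        # variables (via a variable-count condition), which is omitted here.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Term:
            fun: str
            args: tuple = ()

        WEIGHT = {"f": 1, "g": 1, "a": 1, "b": 1}   # hypothetical symbol weights
        PREC   = {"f": 3, "g": 2, "a": 1, "b": 0}   # hypothetical total precedence

        def weight(t: Term) -> int:
            return WEIGHT[t.fun] + sum(weight(s) for s in t.args)

        def kbo_greater(s: Term, t: Term) -> bool:
            """s >_kbo t: compare weights, then head precedence, then arguments
            lexicographically (left to right) by the same order."""
            ws, wt = weight(s), weight(t)
            if ws != wt:
                return ws > wt
            if PREC[s.fun] != PREC[t.fun]:
                return PREC[s.fun] > PREC[t.fun]
            for si, ti in zip(s.args, t.args):
                if si != ti:
                    return kbo_greater(si, ti)
            return False

        a, b = Term("a"), Term("b")
        # Equal weight, so the precedence f > g decides the comparison.
        print(kbo_greater(Term("f", (a, b)), Term("g", (a, b))))   # True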

    Singularity avoidance for collapsing quantum dust in the Lemaitre-Tolman-Bondi model

    We investigate the fate of the classical singularity in a collapsing dust cloud. For this purpose, we quantize the marginally bound Lemaitre-Tolman-Bondi model for spherically symmetric dust collapse by considering each dust shell in the cloud individually, taking the outermost shell as a representative. Because the dust naturally provides a preferred notion of time, we can construct a quantum mechanical model for this shell and demand unitary evolution for wave packets. It turns out that the classical singularity can generically be avoided provided the quantization ambiguities fulfill some weak conditions. We demonstrate that the collapse to a singularity is replaced by a bounce followed by an expansion. We finally construct a quantum-corrected spacetime describing bouncing dust collapse and calculate the time from collapse to expansion.

    Comment: 20 pages, 2 figures.
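
    For context, the classical starting point is the marginally bound Lemaitre-Tolman-Bondi line element in comoving synchronous coordinates; the form below is standard textbook background rather than something taken from the abstract, with R(t,r) the areal radius of the dust shell labelled by r and M(r) the mass it encloses.

        % Marginally bound LTB model (textbook form, stated here only for context):
        \[
          ds^2 = -dt^2 + \bigl(\partial_r R(t,r)\bigr)^2\, dr^2 + R(t,r)^2\, d\Omega^2,
          \qquad
          \dot R^2 = \frac{2 G M(r)}{R}.
        \]

    Each shell obeys its own evolution equation independently of the others, which is what makes it possible to single out the outermost shell and treat it as an effectively one-dimensional quantum mechanical system.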

    Dynamic Features of Topographical Multiset Orderings for Terms

    Multiset orderings are usually used to prove the termination of production systems by comparing elements directly with respect to a given precedence ordering. Topographical multiset orderings are instead based on the position of elements in the graph induced by the precedence, which results in more flexible and stronger multiset orderings. To support the dynamic aspect of incremental refinement of a multiset ordering, the notion of Depth Graphs is introduced. This concept leads to the use of a graph whose nodes are terms (instead of constants and function symbols); it replaces the standard precedence graph. Moreover, it can be used to define a new recursive decomposition ordering on terms which is stronger than the original one.
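
    For reference, the sketch below shows the standard Dershowitz-Manna multiset extension of a strict base order, instantiated with a hypothetical precedence on constant symbols; the topographical orderings of the paper replace this direct use of the precedence with positional information from the graph it induces.

        # Dershowitz-Manna multiset extension of a strict base order (sketch).
        # The symbols and precedence are hypothetical, for illustration only.
        from collections import Counter

        PREC = {"a": 3, "b": 2, "c": 1}

        def greater(x, y):
            return PREC[x] > PREC[y]

        def multiset_greater(m, n):
            """M >>_mul N: M != N and every element of N - M (multiset difference)
            is dominated by some element of M - N in the base order."""
            M, N = Counter(m), Counter(n)
            if M == N:
                return False
            M_only, N_only = M - N, N - M
            return all(any(greater(x, y) for x in M_only) for y in N_only)

        print(multiset_greater(["a", "c"], ["b", "b", "c"]))   # True: a dominates both b's
        print(multiset_greater(["b", "c"], ["a"]))             # False: nothing dominates a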

    The computability path ordering

    This paper aims at carrying out termination proofs for simply typed higher-order calculi automatically by using ordering comparisons. To this end, we introduce the computability path ordering (CPO), a recursive relation on terms obtained by lifting a precedence on function symbols. A first version, core CPO, is essentially obtained from the higher-order recursive path ordering (HORPO) by eliminating type checks from some recursive calls and by incorporating the treatment of bound variables as in the computability closure. The well-foundedness proof shows that core CPO captures the essence of computability arguments à la Tait and Girard, hence its name. We further show that no further type check can be eliminated from its recursive calls without losing well-foundedness, except for one, for which we have found no counterexample yet. Two extensions of core CPO are then introduced: the first allows higher-order inductive types, and the second allows a precedence in which some function symbols are smaller than application and abstraction.
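
    To illustrate the general shape of a recursive relation lifted from a precedence on function symbols, the sketch below implements the first-order lexicographic path ordering on ground terms; CPO and HORPO extend this style of definition to typed higher-order terms with bound variables. The signature and precedence are hypothetical.

        # First-order lexicographic path ordering (LPO) on ground terms (sketch).
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Term:
            fun: str
            args: tuple = ()

        PREC = {"f": 2, "g": 1, "a": 0}   # hypothetical precedence

        def lpo_greater(s: Term, t: Term) -> bool:
            """s >_lpo t on ground terms."""
            # Case 1: some argument of s is >= t.
            if any(si == t or lpo_greater(si, t) for si in s.args):
                return True
            # Cases 2 and 3 both require s to dominate every argument of t.
            if not all(lpo_greater(s, tj) for tj in t.args):
                return False
            if PREC[s.fun] > PREC[t.fun]:
                return True                       # Case 2: head symbols decide
            if s.fun == t.fun:
                # Case 3: equal heads, compare arguments lexicographically.
                for si, ti in zip(s.args, t.args):
                    if si != ti:
                        return lpo_greater(si, ti)
            return False

        a = Term("a")
        print(lpo_greater(Term("f", (a,)), Term("g", (Term("g", (a,)),))))   # True: f > g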

    Modelling rankings in R: the PlackettLuce package

    This paper presents the R package PlackettLuce, which implements a generalization of the Plackett-Luce model for rankings data. The generalization accommodates both ties (of arbitrary order) and partial rankings (complete rankings of subsets of items). By default, the implementation adds a set of pseudo-comparisons with a hypothetical item, ensuring that the underlying network of wins and losses between items is always strongly connected. In this way, the worth of each item always has a finite maximum likelihood estimate, with finite standard error. The use of pseudo-comparisons also has a regularization effect, shrinking the estimated parameters towards equal item worth. In addition to standard methods for model summary, PlackettLuce provides a method to compute quasi standard errors for the item parameters. This provides the basis for comparison intervals that do not change with the choice of identifiability constraint placed on the item parameters. Finally, the package provides a method for model-based partitioning using covariates whose values vary between rankings, enabling the identification of subgroups of judges or settings that have different item worths. The features of the package are demonstrated through application to classic and novel data sets.

    Comment: In v2: review of software implementing alternative models to Plackett-Luce; comparison of algorithms provided by the PlackettLuce package; further examples of rankings where the underlying win-loss network is not strongly connected; in addition, general editing to improve organisation and clarity. In v3: corrected headings in Table 4; minor edits.
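
    To make the underlying model concrete, the sketch below evaluates the Plackett-Luce log-likelihood for strict, complete rankings under hypothetical item worths; the package generalizes this base case to ties, partial rankings, and the pseudo-comparison regularization described above.

        # Plackett-Luce log-likelihood for strict, complete rankings (sketch).
        # Item worths are hypothetical; ties and partial rankings are not handled.
        import math

        def plackett_luce_loglik(rankings, worth):
            """rankings: orderings with the best item first; worth: item -> positive weight."""
            ll = 0.0
            for ranking in rankings:
                remaining = list(ranking)
                for item in ranking[:-1]:          # the last choice is forced
                    ll += math.log(worth[item]) - math.log(sum(worth[j] for j in remaining))
                    remaining.remove(item)
            return ll

        worth = {"A": 2.0, "B": 1.0, "C": 0.5}
        print(plackett_luce_loglik([("A", "B", "C"), ("B", "A", "C")], worth))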

    Set of support, demodulation, paramodulation: a historical perspective

    This article is a tribute to the scientific legacy of automated reasoning pioneer and JAR founder Lawrence T. (Larry) Wos. Larry's main technical contributions were the set-of-support strategy for resolution theorem proving, and the demodulation and paramodulation inference rules for building equality into resolution. Starting from the original definitions of these concepts in Larry's papers, this survey traces their evolution, unearthing the often forgotten trails that connect Larry's original definitions to those that became standard in the field.

    A complete model for welfare analysis

    Taking advantage of some of the lessons learned from income inequality comparisons over time and/or across space, we provide a complete framework of analysis to compare the social or aggregate welfare of independent cross-sections of household income and non-income household characteristics. This framework serves to clarify a number of traditional issues on i) the proper domain of the social evaluation problem; ii) the need to consider alternative mean-invariant inequality notions; iii) the decomposition of changes in real welfare into changes of the mean at constant prices and changes in real inequality; iv) the nature of the interhousehold welfare comparability assumptions implicit in all empirical work; and v) the strong implications of separability assumptions necessary for inequality and welfare decomposition by population subgroups. This review essay, written with an operational aim in mind, extends and updates the treatment found, for example, in Deaton and Muellbauer (1980). The main novelty is the analysis of the simplifying implications of the condition that income adjustment procedures for taking into account non-income needs are independent of household utility levels, an assumption originally introduced in the theoretical literature by Lewbel (1989) and Blackorby and Donaldson (1989), which is extended here to the absolute case.

    Constrained completion: Theory, implementation, and results

    The Knuth-Bendix completion procedure produces complete sets of reductions but cannot handle certain rewrite rules, such as commutativity. In order to handle such theories, completion procedures were created to find complete sets of reductions modulo an equational theory. The major problem with this method is that it requires a specialized unification algorithm for the equational theory. Although this method works well when such an algorithm exists, these algorithms are not always available, and thus alternative methods are needed. One such method is a completion procedure which finds complete sets of constrained reductions. This type of completion procedure neither requires specialized unification algorithms nor fails due to unorientable identities. We present a look at complete sets of reductions with constraints, developed by Gerald Peterson, and the implementation of such a completion procedure for use with HIPER, a fast completion system. The completion procedure code is given and shown correct, along with the various support procedures needed by the constrained system. These support procedures include a procedure to find constraints using the lexicographic path ordering and a normal form procedure for constraints. The procedure has been implemented for use under the fast HIPER system, developed by Jim Christian, and is thus quick. We apply this new system, HIPER-extension, to attack a variety of word problems. Implementation alternatives are discussed, developed, and compared with each other as well as with the HIPER system. Finally, we look at the problem of finding a complete set of reductions for a ternary boolean algebra. Alternatives for attacking this problem are given, along with the already known solution and its run in the HIPER-extension system. --Abstract, page iii
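
    As a rough illustration of the point about unorientable identities, the sketch below shows an orientation step in schematic form: a plain completion loop must fail when neither side of an equation is greater in the reduction order (as with commutativity), whereas a constrained procedure can record the ordering condition as a constraint instead. The function and names are hypothetical and are not taken from HIPER or the thesis.

        # Schematic orientation step; `greater` stands for any reduction order,
        # e.g. a lexicographic path ordering. All names are illustrative.
        def orient(lhs, rhs, greater):
            if greater(lhs, rhs):
                return (lhs, rhs, None)            # ordinary rule lhs -> rhs
            if greater(rhs, lhs):
                return (rhs, lhs, None)            # ordinary rule rhs -> lhs
            # Unorientable identity (e.g. commutativity x*y = y*x): keep a
            # constrained rule that applies only to instances whose left-hand
            # side is greater than its right-hand side.
            return (lhs, rhs, "apply only to instances with lhs > rhs")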