    Knowledge Compilation of Logic Programs Using Approximation Fixpoint Theory

    To appear in Theory and Practice of Logic Programming (TPLP), Proceedings of ICLP 2015. Recent advances in knowledge compilation introduced techniques to compile positive logic programs into propositional logic, essentially exploiting the constructive nature of the least fixpoint computation. This approach has several advantages over existing approaches: it maintains logical equivalence, does not require (expensive) loop-breaking preprocessing or the introduction of auxiliary variables, and significantly outperforms existing algorithms. Unfortunately, this technique is limited to negation-free programs. In this paper, we show how to extend it to general logic programs under the well-founded semantics. We develop our work in approximation fixpoint theory, an algebraic framework that unifies the semantics of different logics. As such, our algebraic results are also applicable to autoepistemic logic, default logic, and abstract dialectical frameworks.
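
    The least-fixpoint computation that this compilation exploits is easy to state concretely. The following is a minimal sketch (ours, not the paper's algorithm) that iterates the immediate-consequence operator of a positive propositional program until it stabilizes:

        # Minimal sketch: least fixpoint of the immediate-consequence
        # operator T_P of a positive (negation-free) propositional program.
        # A rule is (head, [body atoms]); a fact has an empty body.

        def least_fixpoint(rules):
            """Iterate T_P from the empty set until nothing new is derived."""
            derived = set()
            changed = True
            while changed:
                changed = False
                for head, body in rules:
                    if head not in derived and all(b in derived for b in body):
                        derived.add(head)
                        changed = True
            return derived

        # Example: a ground reachability program.
        program = [
            ("edge_a_b", []),
            ("edge_b_c", []),
            ("path_a_b", ["edge_a_b"]),
            ("path_b_c", ["edge_b_c"]),
            ("path_a_c", ["path_a_b", "path_b_c"]),
        ]
        print(least_fixpoint(program))  # all five atoms are derived

    Each iteration only ever adds atoms; it is this constructive, monotone behaviour that fails in the presence of negation and motivates the move to approximation fixpoint theory.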

    Is there a digital archivist in the room? The preservation of musique mixte

    Over the past decade, the field of musique mixte (mixed music) preservation has developed separately from the digital preservation research (and practice) community. This paper presents potential preservation directions for this repertoire, based on current preservation frameworks, in relation to collaboration at the bit level, logical level, and conceptual level of preservation. It emphasizes the link to practice in a postdigital (i.e. after the ‘digital revolution’) preservation and curation perspective, institutional policies, the involvement and redefinition of all stakeholders, and the use of documentation as mediation.

    Optimized Compilation of Aggregated Instructions for Realistic Quantum Computers

    Recent developments in engineering and algorithms have made real-world applications of quantum computing possible in the near future. Existing quantum programming languages and compilers use a quantum assembly language composed of 1- and 2-qubit (quantum bit) gates. Quantum compiler frameworks translate this quantum assembly into electrical signals (called control pulses) that implement the specified computation on specific physical devices. However, there is a mismatch between the operations defined by the 1- and 2-qubit logical ISA and their underlying physical implementation, so the current practice of directly translating logical instructions into control pulses results in inefficient, high-latency programs. To address this inefficiency, we propose a universal quantum compilation methodology that aggregates multiple logical operations into larger units that manipulate up to 10 qubits at a time. Our methodology then optimizes these aggregates by (1) finding commutative intermediate operations that result in more efficient schedules and (2) creating custom control pulses optimized for the aggregate (instead of individual 1- and 2-qubit operations). Compared to standard gate-based compilation, the proposed approach realizes a deeper vertical integration of high-level quantum software and low-level physical quantum hardware. We evaluate our approach on important near-term quantum applications on simulations of superconducting quantum architectures. Our proposed approach provides a mean speedup of 5×, with a maximum of 10×. Because latency directly affects the feasibility of quantum computation, our results not only improve performance but also have the potential to enable quantum computation sooner than otherwise possible.
    Comment: 13 pages, to appear in ASPLOS.
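
    The aggregation step can be pictured with a toy pass (ours, not the paper's compiler): greedily merge adjacent gates that touch overlapping qubits into blocks of at most ten qubits, each of which would then be compiled to a single custom control pulse rather than gate by gate:

        # Toy sketch of gate aggregation (not the paper's compiler).
        MAX_QUBITS = 10

        def aggregate(gates):
            """gates: list of (name, qubit set) in program order."""
            blocks = []
            for name, qubits in gates:
                if blocks:
                    prev_names, prev_qubits = blocks[-1]
                    merged = prev_qubits | qubits
                    # Merge only while the aggregate stays small enough and
                    # the new gate touches qubits already in the block.
                    if len(merged) <= MAX_QUBITS and prev_qubits & qubits:
                        blocks[-1] = (prev_names + [name], merged)
                        continue
                blocks.append(([name], set(qubits)))
            return blocks

        circuit = [("CX", {0, 1}), ("RZ", {1}), ("CX", {1, 2}), ("H", {5})]
        for names, qubits in aggregate(circuit):
            print(names, sorted(qubits))
        # ['CX', 'RZ', 'CX'] [0, 1, 2]
        # ['H'] [5]

    A real compiler would additionally reorder commuting operations before merging, which is optimization (1) in the abstract.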

    Logical calculi for reasoning with binding

    In informal mathematical usage we often reason about languages involving binding of object-variables. We find ourselves writing assertions involving meta-variables and capture-avoidance constraints on where object-variables can and cannot occur free. Formalising such assertions is problematic because the standard logical frameworks cannot express capture-avoidance constraints directly. In this thesis we make the case for extending logical frameworks with meta-variables and capture-avoidance constraints. We use nominal techniques that allow for a direct formalisation of meta-level assertions, while remaining close to informal practice. Our focus is on derivability, and we show that our derivation rules support the following key features of meta-level reasoning:
    • instantiation of meta-variables, by means of capturing substitution of terms for meta-variables;
    • α-renaming of object-variables and capture-avoiding substitution of terms for object-variables in the presence of meta-variables;
    • generation of fresh object-variables inside a derivation.
    We apply our nominal techniques to the following two logical frameworks:
    • Equational logic. We investigate proof-theoretical properties, give a semantics in nominal sets and compare the notion of α-renaming to existing notions of α-equivalence with meta-variables. We also provide an axiomatisation of capture-avoiding substitution, and show that it is sound and complete with respect to the usual notion of capture-avoiding substitution.
    • First-order logic with equality. We provide a sequent calculus with meta-variables and capture-avoidance constraints, and show that it represents schemas of derivations in first-order logic. We also show how we can axiomatise this notion of derivability in the calculus for equational logic.
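
    Capture-avoiding substitution, the operation the thesis axiomatises, can be made concrete with a small sketch (ours: a direct implementation on named terms, not the nominal axiomatisation):

        # Minimal sketch of capture-avoiding substitution for lambda-terms.
        # Terms: ("var", x) | ("app", t1, t2) | ("lam", x, body)
        import itertools

        _fresh = (f"v{i}" for i in itertools.count())

        def free_vars(t):
            if t[0] == "var":
                return {t[1]}
            if t[0] == "app":
                return free_vars(t[1]) | free_vars(t[2])
            return free_vars(t[2]) - {t[1]}

        def subst(t, x, s):
            """t[s/x]: replace free occurrences of x in t by s."""
            if t[0] == "var":
                return s if t[1] == x else t
            if t[0] == "app":
                return ("app", subst(t[1], x, s), subst(t[2], x, s))
            y, body = t[1], t[2]
            if y == x:                    # x is bound below this binder
                return t
            if y in free_vars(s):         # rename the binder to avoid capture
                z = next(_fresh)
                body, y = subst(body, y, ("var", z)), z
            return ("lam", y, subst(body, x, s))

        # (\y. x)[y/x] --> \v0. y : the binder is renamed, y is not captured
        print(subst(("lam", "y", ("var", "x")), "x", ("var", "y")))

    The renaming step is exactly the point where informal practice says "choose y fresh", and it is this side condition that the thesis makes first-class.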

    Lifted Variable Elimination for Probabilistic Logic Programming

    Lifted inference has been proposed for various probabilistic logical frameworks in order to compute the probability of queries in a time that depends on the size of the domains of the random variables rather than on the number of instances. Even though various authors have underlined its importance for probabilistic logic programming (PLP), lifted inference has been applied up to now only to relational languages outside of logic programming. In this paper we adapt Generalized Counting First Order Variable Elimination (GC-FOVE) to the problem of computing the probability of queries to probabilistic logic programs under the distribution semantics. In particular, we extend the Prolog Factor Language (PFL) to include two new types of factors that are needed for representing ProbLog programs. These factors take into account the existing causal independence relationships among random variables and are managed by the extension to variable elimination proposed by Zhang and Poole for dealing with convergent variables and heterogeneous factors. Two new operators are added to GC-FOVE for treating heterogeneous factors. The resulting algorithm, called LP^2 for Lifted Probabilistic Logic Programming, has been implemented by modifying the PFL implementation of GC-FOVE and tested on three benchmarks for lifted inference. A comparison with PITA and ProbLog2 shows the potential of the approach.
    Comment: To appear in Theory and Practice of Logic Programming (TPLP). arXiv admin note: text overlap with arXiv:1402.0565 by other authors.
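
    The payoff of lifting is visible even on a toy query (our illustration, not GC-FOVE itself): when the ground random variables are exchangeable, one counting formula replaces enumeration of all ground worlds:

        # Toy contrast between grounded and lifted inference.
        # n individuals independently satisfy smokes(X) with probability p;
        # query: P(at least one individual smokes).
        from itertools import product

        def grounded(n, p):
            """Propositional inference: sum over all 2**n worlds."""
            total = 0.0
            for world in product([0, 1], repeat=n):
                weight = 1.0
                for bit in world:
                    weight *= p if bit else (1 - p)
                if any(world):
                    total += weight
            return total

        def lifted(n, p):
            """Lifted inference: a single counting formula."""
            return 1 - (1 - p) ** n

        n, p = 10, 0.3
        print(grounded(n, p), lifted(n, p))  # agree up to float rounding

    Real lifted operators such as those of GC-FOVE generalize this counting idea to first-order factors with shared structure.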

    Lower Complexity Bounds for Lifted Inference

    One of the big challenges in the development of probabilistic relational (or probabilistic logical) modeling and learning frameworks is the design of inference techniques that operate on the level of the abstract model representation language, rather than on the level of ground, propositional instances of the model. Numerous approaches for such "lifted inference" techniques have been proposed. While it has been demonstrated that these techniques will lead to significantly more efficient inference on some specific models, there are only very recent and still quite restricted results that show the feasibility of lifted inference on certain syntactically defined classes of models. Lower complexity bounds that imply some limitations for the feasibility of lifted inference on more expressive model classes were established early on in (Jaeger 2000). However, it is not immediate that these results also apply to the type of modeling languages that currently receive the most attention, i.e., weighted, quantifier-free formulas. In this paper we extend these earlier results, and show that under the assumption that NETIME ≠ ETIME, there is no polynomial lifted inference algorithm for knowledge bases of weighted, quantifier- and function-free formulas. Further strengthening earlier results, this is also shown to hold for approximate inference, and for knowledge bases not containing the equality predicate.
    Comment: To appear in Theory and Practice of Logic Programming (TPLP).

    Unpacking the logic of mathematical statements

    This study focuses on undergraduate students' ability to unpack informally written mathematical statements into the language of predicate calculus. Data were collected between 1989 and 1993 from 61 students in six small sections of a “bridge” course designed to introduce proofs and mathematical reasoning. We discuss this data from a perspective that extends the notion of concept image to that of statement image and introduces the notion of proof framework to indicate the top-level logical structure of a proof. For simplified informal calculus statements, just 8.5% of unpacking attempts were successful; for actual statements from calculus texts, this dropped to 5%. We infer that these students would be unable to reliably relate informally stated theorems to the top-level logical structure of their proofs and hence could not be expected to construct proofs or evaluate their validity.
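
    For a sense of the task, here is a typical unpacking of the kind studied (our illustration, not an item from the paper's data). The informal statement "f is continuous at a" unpacks into predicate calculus as

        \forall \varepsilon\, \bigl(\varepsilon > 0 \rightarrow
            \exists \delta\, \bigl(\delta > 0 \wedge
                \forall x\, (|x - a| < \delta \rightarrow |f(x) - f(a)| < \varepsilon)\bigr)\bigr)

    and it is this top-level for-all/there-exists structure that dictates the proof framework: "Let ε > 0 be given ... exhibit a suitable δ ...".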

    From Professional Business Partner to Strategic Talent Leader: What’s Next for Human Resource Management

    The HR profession is at a critical inflection point. It can evolve into a true decision science of talent and aspire to the level of influence of disciplines such as Finance and Marketing, or it can continue the traditional focus on support services and program delivery to organizational clients. In this paper, we suggest that the transition to a decision science is essential and not only feasible, but historically predictable. However, we show that making the transition is not a function of achieving professional best practices. Rather, it requires developing a logical, deep and coherent framework linking organizational talent to strategic success. We show how the evolution of the decision sciences of Finance and Marketing out of the professional practices of Accounting and Sales provides the principles to guide the evolution from the current professional practice of HR to the emerging decision science of talentship.

    A Case Study on Logical Relations using Contextual Types

    Proofs by logical relations play a key role in establishing rich properties such as normalization or contextual equivalence. They are also challenging to mechanize. In this paper, we describe the completeness proof of algorithmic equality for simply typed lambda-terms by Crary, where we reason about logically equivalent terms in the proof environment Beluga. There are three key aspects we rely upon: 1) we encode lambda-terms together with their operational semantics and algorithmic equality using higher-order abstract syntax; 2) we directly encode the corresponding logical equivalence of well-typed lambda-terms using recursive types and higher-order functions; 3) we exploit Beluga's support for contexts and the equational theory of simultaneous substitutions. This leads to a direct and compact mechanization, demonstrating Beluga's strength at formalizing logical relations proofs.
    Comment: In Proceedings LFMTP 2015, arXiv:1507.0759.
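
    The relation whose completeness the proof establishes can be illustrated with a small sketch (ours: untyped and simplified, since Crary's algorithm interleaves weak-head reduction with structural comparison while this version normalizes fully). De Bruijn indices sidestep alpha-renaming:

        # Sketch: algorithmic equality of lambda-terms via normalization.
        # Terms: ("var", i) | ("lam", body) | ("app", f, a), de Bruijn style.

        def shift(t, d, c=0):
            if t[0] == "var":
                return ("var", t[1] + d) if t[1] >= c else t
            if t[0] == "lam":
                return ("lam", shift(t[1], d, c + 1))
            return ("app", shift(t[1], d, c), shift(t[2], d, c))

        def subst(t, j, s):
            if t[0] == "var":
                return s if t[1] == j else t
            if t[0] == "lam":
                return ("lam", subst(t[1], j + 1, shift(s, 1)))
            return ("app", subst(t[1], j, s), subst(t[2], j, s))

        def normalize(t):
            if t[0] == "app":
                f, a = normalize(t[1]), normalize(t[2])
                if f[0] == "lam":  # beta-reduce the redex
                    return normalize(shift(subst(f[1], 0, shift(a, 1)), -1))
                return ("app", f, a)
            if t[0] == "lam":
                return ("lam", normalize(t[1]))
            return t

        def alg_equal(t1, t2):
            return normalize(t1) == normalize(t2)

        identity = ("lam", ("var", 0))
        print(alg_equal(("app", identity, identity), identity))  # True

    The hard part that Beluga handles, and that this sketch does not, is the logical relation itself: proving by recursion on types that logically equivalent well-typed terms are algorithmically equal.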