
    Counterfactuals and Explanatory Pluralism

    Recent literature on non-causal explanation raises the question of whether explanatory monism, the thesis that all explanations submit to the same analysis, is true. The leading monist proposal holds that all explanations support change-relating counterfactuals. We provide several objections to this monist position.
    1 Introduction
    2 Change-Relating Monism's Three Problems
    3 Dependency and Monism: Unhappy Together
    4 Another Challenge: Counterfactual Incidentalism
        4.1 High-grade necessity
        4.2 Unity in diversity
    5 Conclusion

    Using ACL2 to Verify Loop Pipelining in Behavioral Synthesis

    Behavioral synthesis involves compiling an Electronic System-Level (ESL) design into its Register-Transfer Level (RTL) implementation. Loop pipelining is one of the most critical and complex transformations employed in behavioral synthesis. Certifying the loop pipelining algorithm is challenging because there is a huge semantic gap between the input sequential design and the output pipelined implementation, making it infeasible to verify their equivalence with automated sequential equivalence checking techniques. We discuss our ongoing effort to use ACL2 to certify the loop pipelining transformation. The proof is still in progress, but some of the insights developed so far may already be of value to the ACL2 community. In particular, we discuss the key invariant we formalized, which is very different from the invariants used in most pipeline proofs. We discuss the need for this invariant, its formalization in ACL2, and our envisioned proof using it. We also discuss some trade-offs, challenges, and insights developed in the course of the project.
    Comment: In Proceedings ACL2 2014, arXiv:1406.123
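    The proof strategy the abstract describes is to relate the sequential loop to its overlapped (pipelined) schedule through an invariant maintained at every step. The toy Python sketch below illustrates that general idea on a hypothetical two-stage pipeline; it is not the authors' ACL2 formalization, and its invariant is far simpler than the one the paper develops.

        # Toy illustration only -- a two-stage software "pipeline" checked against
        # the sequential loop it implements, via an invariant relating retired
        # outputs to the sequential reference at every step.

        def f(x):            # hypothetical stage-1 computation
            return x + 1

        def g(t):            # hypothetical stage-2 computation
            return 2 * t

        def sequential(xs):
            """Reference semantics: each iteration runs both stages to completion."""
            return [g(f(x)) for x in xs]

        def pipelined(xs):
            """Overlapped schedule: stage 2 of iteration i-1 runs alongside stage 1 of iteration i."""
            ref = sequential(xs)             # used only to state the invariant
            ys, in_flight = [], None
            for i, x in enumerate(xs):
                if in_flight is not None:
                    ys.append(g(in_flight))  # retire iteration i-1
                in_flight = f(x)             # issue iteration i
                # Invariant: all retired outputs agree with the sequential reference,
                # and in_flight holds the stage-1 result of the current iteration.
                assert ys == ref[:i]
            if in_flight is not None:
                ys.append(g(in_flight))      # drain the final iteration
            return ys

        assert pipelined([3, 1, 4, 1, 5]) == sequential([3, 1, 4, 1, 5])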

    Effective Choice and Boundedness Principles in Computable Analysis

    In this paper we study a new approach to classifying mathematical theorems according to their computational content. Basically, we ask which theorems can be continuously or computably transferred into one another. For this purpose theorems are considered via their realizers, which are operations with certain input and output data. The technical tool for expressing continuous or computable relations between such operations is Weihrauch reducibility and the partially ordered degree structure it induces. We identify certain choice principles which are cornerstones among Weihrauch degrees, and it turns out that certain core theorems in analysis can be classified naturally in this structure. In particular, we study theorems such as the Intermediate Value Theorem, the Baire Category Theorem, the Banach Inverse Mapping Theorem, and others. We also explore how existing classifications of the Hahn-Banach Theorem and Weak König's Lemma fit into this picture. We compare the results of our classification with existing classifications in constructive and reverse mathematics, and we claim that in a certain sense our classification is finer and sheds some new light on the computational content of the respective theorems. We develop a number of separation techniques based on a new parallelization principle, on certain invariance properties of Weihrauch reducibility, on the Low Basis Theorem of Jockusch and Soare, and on the Baire Category Theorem. Finally, we present a number of metatheorems that allow us to derive upper bounds on the Weihrauch degree of many theorems, and we discuss the Brouwer Fixed Point Theorem as an example.
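    For orientation, the standard definition of Weihrauch reducibility used in this line of work can be stated as follows; this formulation is supplied here for the reader and is not quoted from the abstract.

        % Weihrauch reducibility between multi-valued functions f and g on
        % represented spaces (standard formulation, stated here for orientation).
        f \le_{\mathrm{W}} g \iff
            \exists\, \text{computable } H, K :\subseteq \mathbb{N}^{\mathbb{N}} \to \mathbb{N}^{\mathbb{N}}
            \;\; \forall\, G \vdash g : \quad H\langle \mathrm{id}, G \circ K \rangle \vdash f
        % Here G \vdash g means that G is a realizer of g, i.e. G maps every name
        % of an input x in dom(g) to a name of some output y in g(x).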

    Classical simulation complexity of extended Clifford circuits

    Clifford gates are a winsome class of quantum operations combining mathematical elegance with physical significance. The Gottesman-Knill theorem asserts that Clifford computations can be efficiently simulated classically, but this is true only in a suitably restricted setting. Here we consider Clifford computations with a variety of additional ingredients: (a) strong vs. weak simulation, (b) inputs being computational basis states vs. general product states, (c) adaptive vs. non-adaptive choices of gates for circuits involving intermediate measurements, (d) single-line vs. multi-line outputs. We consider the classical simulation complexity of all combinations of these ingredients and show that many are not classically efficiently simulatable (subject to common complexity assumptions such as P not equal to NP). Our results reveal a surprising proximity of classical to quantum computing power, viz. a class of classically simulatable quantum circuits which yields universal quantum computation if extended by a purely classical additional ingredient that does not extend the class of quantum processes occurring.
    Comment: 17 pages, 1 figure
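    The classical tractability in the Gottesman-Knill setting rests on the fact that Clifford unitaries map Pauli operators to Pauli operators under conjugation, so stabilizer generators can be updated classically gate by gate. The short numpy check below illustrates that fact for a few standard generators of the Clifford group; it is a generic illustration, not code from the paper.

        # Generic check that Clifford gates conjugate Paulis to Paulis -- the fact
        # underlying classical (stabilizer) simulation. Illustration only.
        import numpy as np

        I = np.eye(2, dtype=complex)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]])
        Z = np.array([[1, 0], [0, -1]], dtype=complex)
        H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
        S = np.diag([1, 1j])
        CNOT = np.array([[1, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]], dtype=complex)

        def conjugate(U, P):
            """Heisenberg-picture update of a Pauli P under a Clifford U: U P U^dagger."""
            return U @ P @ U.conj().T

        assert np.allclose(conjugate(H, X), Z)                             # H maps X -> Z
        assert np.allclose(conjugate(S, X), Y)                             # S maps X -> Y
        assert np.allclose(conjugate(CNOT, np.kron(X, I)), np.kron(X, X))  # CNOT copies X from control to target
        print("Clifford gates map Paulis to Paulis; stabilizers can be tracked classically.")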