    Quantum Turing automata

    A denotational semantics of quantum Turing machines with quantum control is defined in the dagger compact closed category of finite-dimensional Hilbert spaces. Using the Moore-Penrose generalized inverse, a new additive trace is introduced on the restriction of this category to isometries; this trace is then carried over to directed quantum Turing machines viewed as monoidal automata. The Joyal-Street-Verity Int construction is then used to extend this structure to a reversible, bidirectional one. (Comment: In Proceedings DCM 2012, arXiv:1403.757)
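    As background for the trace construction mentioned in this abstract, the Moore-Penrose generalized inverse of a linear map A between finite-dimensional Hilbert spaces is the unique map A^+ satisfying the four standard equations below (a general fact, not specific to this paper); in particular, for an isometry V, i.e. V^\dagger V = I, the generalized inverse is simply V^+ = V^\dagger.

        A A^{+} A = A, \qquad A^{+} A A^{+} = A^{+}, \qquad (A A^{+})^{\dagger} = A A^{+}, \qquad (A^{+} A)^{\dagger} = A^{+} A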

    Toward a Dichotomy for Approximation of H-Coloring

    Given two (di)graphs G, H and a cost function $c : V(G) \times V(H) \to \mathbb{Q}_{\ge 0} \cup \{+\infty\}$, the minimum cost homomorphism problem, MinHOM(H), asks for a homomorphism $f : V(G) \to V(H)$ (a.k.a. an H-coloring) that minimizes $\sum_{v \in V(G)} c(v, f(v))$. The complexity of exact minimization of this problem is well understood [Pavol Hell and Arash Rafiey, 2012], and the class of digraphs H for which MinHOM(H) is polynomial-time solvable is a small subset of all digraphs. In this paper, we consider the approximation of MinHOM within a constant factor. In terms of digraphs, MinHOM(H) is not approximable if H contains a digraph asteroidal triple (DAT). We take a major step toward a dichotomy classification of the approximable cases. We give a dichotomy classification for approximating MinHOM(H) when H is a graph (i.e. a symmetric digraph). For digraphs, we provide constant-factor approximation algorithms for two important classes, namely bi-arc digraphs (digraphs with a conservative semilattice polymorphism or min-ordering) and k-arc digraphs (digraphs with an extended min-ordering). Specifically, we show that:
    - Dichotomy for graphs: MinHOM(H) has a 2|V(H)|-approximation algorithm if the graph H admits a conservative majority polymorphism (i.e. H is a bi-arc graph); otherwise it is inapproximable.
    - MinHOM(H) has a |V(H)|^2-approximation algorithm if H is a bi-arc digraph.
    - MinHOM(H) has a |V(H)|^2-approximation algorithm if H is a k-arc digraph.
    In conclusion, we show the importance of these results and provide insights toward a dichotomy classification of the approximable cases. Our constant factors depend on the size of H; however, the implementation of our algorithms provides a much better approximation ratio in practice. It remains open to classify the digraphs H for which MinHOM(H) admits a constant-factor approximation algorithm independent of |V(H)|.
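    To make the optimization problem concrete, the following minimal brute-force sketch (ours, not the paper's algorithm; exponential in |V(G)| and only meant for tiny instances) enumerates all maps f : V(G) -> V(H), keeps the homomorphisms, and returns one of minimum total cost.

        from itertools import product

        def min_cost_homomorphism(G_vertices, G_edges, H_vertices, H_edges, cost):
            """Brute-force MinHOM(H): edges are sets of ordered pairs (digraph arcs);
            cost maps (v_G, v_H) to a nonnegative number or float('inf')."""
            best_map, best_cost = None, float('inf')
            for image in product(H_vertices, repeat=len(G_vertices)):
                f = dict(zip(G_vertices, image))
                # f is a homomorphism iff every arc of G is mapped to an arc of H
                if all((f[u], f[v]) in H_edges for (u, v) in G_edges):
                    total = sum(cost[(v, f[v])] for v in G_vertices)
                    if total < best_cost:
                        best_map, best_cost = f, total
            return best_map, best_cost

        # Tiny example: G is a single arc a -> b, H is a directed 2-cycle on {0, 1}.
        G_vertices, G_edges = ['a', 'b'], {('a', 'b')}
        H_vertices, H_edges = [0, 1], {(0, 1), (1, 0)}
        cost = {('a', 0): 1, ('a', 1): 5, ('b', 0): 2, ('b', 1): 1}
        print(min_cost_homomorphism(G_vertices, G_edges, H_vertices, H_edges, cost))
        # -> ({'a': 0, 'b': 1}, 2)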

    Differential cost analysis with simultaneous potentials and anti-potentials

    We present a novel approach to differential cost analysis that, given a program revision, attempts to statically bound the difference in resource usage, or cost, between the two program versions. Differential cost analysis is particularly interesting because of the many compelling applications for it, such as detecting resource-use regressions at code-review time or proving the absence of certain side-channel vulnerabilities. One prior approach to differential cost analysis is to apply relational reasoning that conceptually constructs a product program on which one can over-approximate the difference in costs between the two program versions. However, a significant challenge in any relational approach is effectively aligning the program versions to get precise results. In this paper, our key insight is that we can avoid the need for and the limitations of program alignment if, instead, we bound the difference of two cost-bound summaries rather than directly bounding the concrete cost difference. In particular, our method computes a threshold value for the maximal difference in cost between two program versions simultaneously using two kinds of cost-bound summaries: a potential function that evaluates to an upper bound for the cost incurred in the first program, and an anti-potential function that evaluates to a lower bound for the cost incurred in the second. Our method has a number of desirable properties: it can be fully automated, it allows optimizing the threshold value on relative cost, it is suitable for programs that are not syntactically similar, and it supports non-determinism. We have evaluated an implementation of our approach on a number of program pairs collected from the literature, and we find that our method computes tight threshold values on relative cost in most examples.
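    The soundness argument behind this can be sketched in one inequality (notation ours, not the paper's): if phi is a potential with cost_1(sigma) <= phi(sigma) on every input sigma of the first version, and psi is an anti-potential with psi(sigma) <= cost_2(sigma) for the second, then any constant k dominating their difference also bounds the concrete cost difference, and the method searches for phi, psi, and a small such k simultaneously.

        \mathrm{cost}_1(\sigma) - \mathrm{cost}_2(\sigma) \;\le\; \phi(\sigma) - \psi(\sigma) \;\le\; k \qquad \text{for all inputs } \sigma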

    Synchronizing Data Words for Register Automata

    Register automata (RAs) are finite automata extended with a finite set of registers used to store and compare data from an infinite domain. We study the concept of synchronizing data words in RAs: does there exist a data word that sends all states of the RA to a single state? For deterministic RAs with k registers (k-DRAs), we prove that data words with 2k+1 distinct data values from the infinite domain suffice to synchronize. We show that the synchronization problem for DRAs is PSPACE-complete in general, and NLOGSPACE-complete for 1-DRAs. For nondeterministic RAs (NRAs), we show that Ackermann(n) distinct data values (where n is the size of the RA) may be necessary to synchronize. The synchronization problem for NRAs is undecidable in general; however, we establish Ackermann-completeness of the problem for 1-NRAs. Another main result is the NEXPTIME-completeness of the length-bounded synchronization problem for NRAs, where a bound on the length of the synchronizing data word is given in binary. A variant of this last construction allows us to prove that the length-bounded universality problem for NRAs is co-NEXPTIME-complete.
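    Setting the registers and data aside, synchronization itself is the classical notion for finite automata; as a minimal point of reference (an ordinary DFA, not the paper's register model), the following sketch checks whether a given word drives every state to the same state.

        def is_synchronizing(states, delta, word):
            """delta maps (state, letter) to a state; return True if `word`
            sends every starting state to one and the same state."""
            endpoints = set()
            for q in states:
                for a in word:
                    q = delta[(q, a)]
                endpoints.add(q)
            return len(endpoints) == 1

        # A Cerny-style example on 4 states: 'a' rotates the states,
        # 'b' maps state 3 to 0 and fixes the others.
        states = [0, 1, 2, 3]
        delta = {(q, 'a'): (q + 1) % 4 for q in states}
        delta.update({(q, 'b'): q for q in states})
        delta[(3, 'b')] = 0
        print(is_synchronizing(states, delta, 'baaabaaab'))  # True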

    Absorbing Subalgebras, Cyclic Terms, and the Constraint Satisfaction Problem

    The Algebraic Dichotomy Conjecture states that the Constraint Satisfaction Problem over a fixed template is solvable in polynomial time if the algebra of polymorphisms associated to the template lies in a Taylor variety, and is NP-complete otherwise. This paper provides two new characterizations of finitely generated Taylor varieties. The first characterization uses absorbing subalgebras, and the second uses cyclic terms. These new conditions allow us to reprove the conjecture of Bang-Jensen and Hell (proved by the authors) and the characterization of locally finite Taylor varieties using weak near-unanimity terms (proved by McKenzie and Maróti) in an elementary and self-contained way.
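    For reference (standard definitions in this line of work, not results of the paper): a cyclic term is an n-ary term operation t, n >= 2, satisfying the first identity below, and a weak near-unanimity term is an idempotent term w satisfying the second family of identities.

        t(x_1, x_2, \ldots, x_n) \approx t(x_2, \ldots, x_n, x_1)

        w(x, \ldots, x) \approx x, \qquad w(y, x, x, \ldots, x) \approx w(x, y, x, \ldots, x) \approx \cdots \approx w(x, x, \ldots, x, y)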

    Probabilistic call by push value

    We introduce a probabilistic extension of Levy's Call-By-Push-Value. This extension consists simply in adding a "flipping coin" boolean closed atomic expression. This language can be understood as a major generalization of Scott's PCF encompassing both call-by-name and call-by-value and featuring recursive (possibly lazy) data types. We interpret the language in the previously introduced denotational model of probabilistic coherence spaces, a categorical model of full classical Linear Logic, interpreting data types as coalgebras for the resource comonad. We prove adequacy and full abstraction, generalizing earlier results to a much more realistic and powerful programming language.
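    To give a flavor of the added primitive (an illustrative toy only; the paper's model is probabilistic coherence spaces, not this finite-distribution monad), a closed boolean program can be read as a subprobability distribution over values, with the coin flip denoting the uniform distribution on booleans and sequencing given by the usual monadic bind.

        from collections import defaultdict

        def ret(v):
            """Denotation of a value: the point (Dirac) distribution."""
            return {v: 1.0}

        def bind(dist, k):
            """Sequence a distribution with a continuation k: value -> distribution."""
            out = defaultdict(float)
            for v, p in dist.items():
                for w, q in k(v).items():
                    out[w] += p * q
            return dict(out)

        flip = {True: 0.5, False: 0.5}  # denotation of the 'flipping coin' expression

        # Example: flip two coins and return their conjunction.
        prog = bind(flip, lambda a: bind(flip, lambda b: ret(a and b)))
        print(prog)  # {True: 0.25, False: 0.75}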